Which of the following sounds more reasonable?

  • I shouldn’t have to pay for the content that I use to tune my LLM model and algorithm.

  • We shouldn’t have to pay for the content we use to train and teach an AI.

By calling it AI, the corporations are able to advocate for a position that’s blatantly pro-corporate and anti-writer/artist, and trick people into supporting it under the guise of technological development.

  • pensivepangolin@lemmy.world · 3 points · 1 year ago

    Now, as I stated in my first comment in these threads, I don’t know terribly much about the technical details behind current LLMs, and I’m basing my comments on a layman’s reading.

    Could you elaborate on what you mean about the development of deep learning architecture in recent years? I’m curious; I’m not trying to be argumentative.

    • Muehe · 2 points · 1 year ago

      Could you elaborate on what you mean about the development of deep learning architecture in recent years?

      Transformers. Fun fact: the T in GPT and BERT stands for “transformer”. It’s a neural network architecture that was first proposed in 2017 (or 2014, depending on how you want to measure). Its key novelty is implementing an attention mechanism and a context window without recurrence, which is how most earlier NNs handled sequence context.

      The wiki page I linked above is admittedly a bit technical; this article’s explanation might be a bit more friendly to the layperson.
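      For the curious, here’s a minimal NumPy sketch of the scaled dot-product attention at the core of transformers (names and shapes here are illustrative, not from any particular library). The point is that every token looks at every other token in one matrix multiply, with no step-by-step recurrence over the sequence:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays of query/key/value vectors."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # pairwise token-to-token similarities
    # softmax over the key axis, computed stably
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V             # each output row is a weighted mix of all values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))        # 4 tokens, 8-dim embeddings (self-attention: Q=K=V=x)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                   # (4, 8)
```

      Contrast this with an RNN, which would have to process those 4 tokens one after another, passing a hidden state along; here the whole context window is handled in parallel.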

      • pensivepangolin@lemmy.world · 1 point · 1 year ago

        Thanks for the reading material: I’m really not familiar with Transformers other than the most basic info. I’ll give it a read when I get a break from work.