Thousands of authors demand payment from AI companies for use of copyrighted works

Thousands of published authors are requesting payment from tech companies for the use of their copyrighted works in training artificial intelligence tools, marking the latest intellectual property critique to target AI development.

  • FontMasterFlex@lemmy.world · 11 months ago

    So what’s the difference between a person reading their books and using the information within to write something, and an AI doing it?

    • Saneless@lemmy.world · 11 months ago

      Because AIs aren’t inspired by anything and they don’t learn anything

        • dan@lemm.ee · 11 months ago

          No but a lazy copy of someone else’s work might be copyright infringement.

          • Odusei@lemmy.world · 11 months ago

            So when does Kevin Costner get to sue James Cameron for his lazy copy of Dances With Wolves?

            • dan@lemm.ee · 11 months ago

              Idk, maybe. There are thousands of copyright infringement lawsuits, and sometimes the plaintiffs win.

              I don’t necessarily agree with how copyright law works, but that’s a different question. Doesn’t change the fact that sometimes you can successfully sue for copyright infringement if someone copies your stuff to make something new.

            • tenitchyfingers@lemmy.world · 11 months ago

              Why not? Hollywood is full to the brim with people suing for copyright infringement. And sometimes they win. Why should it be different for AI companies?

      • lily33@lemmy.world · 11 months ago (edited)

        Language models actually do learn things, in the sense that the information encoded in the trained model isn’t usually* taken directly from the training data; instead, it’s information that describes the training data but is new. That’s why they can generate text that has never appeared in the data.

        *The bigger models do seem to memorize some of the data and can reproduce it verbatim, but that’s not really the goal.
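        The “text that never appeared in the data” point can be made concrete with a deliberately tiny sketch (a toy bigram chain, nothing like a real language model; the corpus and names here are invented for illustration): even statistics this crude, extracted from a corpus, recombine into word sequences that appear nowhere in it verbatim.

```python
from collections import Counter, defaultdict

# Two toy training sentences; note "the cat sat on the cat" is in neither.
corpus = "the cat sat on the mat . the dog ran to the rug .".split()

# Count how often each word follows each preceding word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(seed, length):
    """Greedily follow the most common continuation from a seed word."""
    out = [seed]
    for _ in range(length - 1):
        counts = follows[out[-1]]
        if not counts:
            break
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

novel = generate("the", 6)
print(novel)                             # "the cat sat on the cat"
print(novel in " ".join(corpus))         # False: not in the training text
```

        The generated sequence is assembled entirely from statistics *about* the corpus, yet the sequence itself never occurs in it, which is the distinction being drawn between describing the data and copying it.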
      • Chailles@lemmy.world · 11 months ago

        What does inspiration have to do with anything? And to be honest, humans being inspired has led to far more blatant copyright infringement.

        As for learning, they do learn. It’s not so different from us, except that we learn silly abstractions to make sense of things while AI learns by trial and error. Ask any artist whether they’ve ever looked at someone else’s work to figure out how to draw something. Even if they’re not explicitly looking up a picture, if they’ve ever seen a depiction of it, they recall and use that. Why is it wrong if an AI does the same?

      • FontMasterFlex@lemmy.world · 11 months ago

        Not if I checked it out from a library. A WORLD of knowledge at your fingertips, and it’s all free to me, the consumer. So who’s to say the people training the AI didn’t check the books out from a library, or even buy the ones they’re using for training? Would you feel better about it had they purchased their copy?

    • Melllvar@startrek.website · 11 months ago (edited)

      Large language models can only calculate the probability that words should go together based on existing texts.

      • mayo@lemmy.world · 11 months ago

        Isn’t this correct? What’s missing?

        Let’s ask ChatGPT 3.5:

        > Mostly accurate. Large language models like me can generate text based on patterns learned from existing texts, but we don’t “calculate probabilities” in the traditional sense. Instead, we use statistical methods to predict the likelihood of certain word sequences based on the training data.

          • BakonGuy@lemmy.world · 11 months ago

            I don’t see how “calculate the probability” and “predict the likelihood” are different. Seems perfectly accurate to me.
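            Both phrasings do describe the same computation. A minimal sketch of the idea (a toy bigram model, purely illustrative and far simpler than an actual LLM; the corpus and function names are invented here): estimate next-word probabilities from counts over existing text.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the "existing texts".
corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each preceding word (bigram counts).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    """Probability distribution over the next word, given the previous one."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# "the" is followed by cat/mat/dog/rug once each -> probability 0.25 each.
print(next_word_probs("the"))
# "sat" is always followed by "on" -> probability 1.0.
print(next_word_probs("sat"))
```

            Whether you call the result a “calculated probability” or a “predicted likelihood,” it is the same number: a frequency-derived estimate of which word comes next.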

    • tenitchyfingers@lemmy.world · 11 months ago

      A person is human and capable of artistry and creativity; computers aren’t. Even questioning this dehumanizes artists and art in general.

        • tenitchyfingers@lemmy.world · 11 months ago

          Do you think a hammer and a nail could do anything on their own, without a hand picking them up and guiding them? Because that’s what a computer is. Nothing wrong with using a computer to paint or write or record songs or create something, but it has to be YOU creating it, using the machine as a tool. It’s also in the actual definition of the word: art is made by humans. Which explicitly excludes machines. Period.

          Like I’m fine with AI when it SUPPORTS an artist (although sometimes it’s an obstacle, because sometimes I don’t want to be autocorrected; I want the thing I write to be written exactly as I wrote it, for whatever reason). But REPLACING an artist? Fuck no. There is no excuse for making a machine do the work and then taking the credit just to make a quick easy buck on the backs of actual artists who were used WITHOUT THEIR CONSENT to train a THING to replace them. Nah, fuck off my guy. I can clearly see you never did anything creative in your whole life, otherwise you’d get it.