Article by IEEE Spectrum: The writers tested two AI image generators, MidJourney and Stable Diffusion, probing their ability to generate imagery that closely resembles copyrighted material, which demonstrates that the generators' training data must have contained copyrighted works. The safeguards the companies implemented were largely unsuccessful at curbing the output of potentially infringing images.

  • squirrel@lemmy.blahaj.zoneOPM

    Because it’s a difference in scale: Sure, there are a handful of artists out there who can emulate a style or create a picture to the point where it is indistinguishable from somebody else’s copyrighted work. But the AI companies offer everyone the tools to do that, and they monetize that capability (as in selling access to AI tools), while telling everybody else that rights holders should not be able to monetize their own work. It’s a blatantly obvious double standard.

    • AnonStoleMyPants@sopuli.xyz

      Can you explain the double standard a bit more? I don’t understand it. Are you saying that the double standard is that AI companies sell a product that can be used to infringe copyright, yet they say that people infringing it using this sold product cannot monetize it?

      • squirrel@lemmy.blahaj.zoneOPM

        What I mean is this: When it comes to paying for training data, AI companies argue that people should not be able to earn money from the works they created (for example, by selling licenses to their copyrighted works), while at the same time the companies charge money for the creation of works with AI.

        To put it differently: “Artists should not earn money from the creation of artworks. We should earn money from the creation of artworks.”