• kromem@lemmy.world
    1 year ago

    You think tracing for educational use which is then never distributed such that it could not have a negative impact on market value is infringement?

    What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images.
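
    Not something that exists today, and not anyone’s real pipeline, just a minimal sketch of the shape such a discriminator could take, assuming the check is a simple perceptual-hash comparison of model outputs against a corpus of known protected works (the libraries are real, but the folder name, file names, and threshold are illustrative assumptions):

```python
# Minimal sketch of a "copyright discriminator": flag a generated image when it is
# perceptually near-identical to something in a reference corpus of protected works.
# The corpus path, output file name, and threshold are hypothetical.
from pathlib import Path

import imagehash       # pip install ImageHash
from PIL import Image  # pip install Pillow

HAMMING_THRESHOLD = 8  # assumed cutoff: smaller hash distance = more similar images


def build_reference_hashes(corpus_dir: str) -> dict[str, imagehash.ImageHash]:
    """Precompute perceptual hashes for every image in the reference corpus."""
    hashes = {}
    for path in Path(corpus_dir).iterdir():
        if path.suffix.lower() in {".png", ".jpg", ".jpeg", ".webp"}:
            hashes[path.name] = imagehash.phash(Image.open(path))
    return hashes


def possible_matches(generated_path: str, references: dict[str, imagehash.ImageHash]) -> list[str]:
    """Return names of reference images whose hash is within the threshold of the output."""
    gen_hash = imagehash.phash(Image.open(generated_path))
    return [
        name for name, ref_hash in references.items()
        if gen_hash - ref_hash <= HAMMING_THRESHOLD  # Hamming distance between hashes
    ]


if __name__ == "__main__":
    refs = build_reference_hashes("protected_works/")  # hypothetical reference folder
    hits = possible_matches("model_output.png", refs)  # hypothetical generated image
    print("Possible matches:", hits or "none")
```

    Pixel-level hashing only catches near-copies; stylistic or partial reproduction would need something more like learned embeddings, so treat this purely as the shape of the check, not a workable discriminator.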

    But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

    And yeah, most of the discussion around this revolves around US laws. If we put aside any jurisdiction, then there is no conversation to be had. Or we could choose arbitrary jurisdictions to support a position, for example Israel and Japan, which have already said training is fair use.

    • Atemu
      1 year ago

      You think tracing for educational use which is then never distributed such that it could not have a negative impact on market value is infringement?

      That’s not what I think, that’s what the law says.

      I said what I think in the second paragraph. Sorry if I wasn’t being extra clear on that.

      What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images.

      Good luck with that.

      But I’ll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

      Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.
      Think about it like this: if I distributed a tarball of copyrighted material, that would be infringement, even though you’d need tar to unpack it. Whether you need a transformer or tar to access the material should make no difference, in my layman interpretation.
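
      To make the tarball half of that concrete, a toy sketch (standard library only, file names made up): the archive contains the work byte-for-byte, and tar simply hands the exact original back.

```python
# Toy illustration of the tarball analogy: the archive holds the original work
# byte-for-byte, and "unpacking" it returns an exact copy. File names are made up.
import tarfile
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)

    # A stand-in for some copyrighted work.
    original = tmp / "protected_lyrics.txt"
    original.write_text("Pretend these are someone's copyrighted lyrics.\n")

    # "Distributing a tarball" of it...
    archive = tmp / "bundle.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(original, arcname=original.name)

    # ...and the recipient only needs tar to get the original back, unchanged.
    extract_dir = tmp / "unpacked"
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(extract_dir)

    recovered = (extract_dir / original.name).read_bytes()
    print("Byte-identical copy recovered:", recovered == original.read_bytes())
```

      The print at the end is the whole point of the analogy: the recipient ends up with an exact copy of the original bytes.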

      • kromem@lemmy.world
        1 year ago

        That’s not what I think, that’s what the law says.

        No, it doesn’t. The scenario outlined squarely falls under fair use, particularly because of the non-distribution combined with research/education use. Fair use is not infringement.

        Good luck with that.

        We’ll see.

        Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.

        I mean, if we’re talking about hypothetical models that only produce infringing material, you might be right.

        But if we’re talking about current models, which can’t reproduce the entire training set and only reproduce individual training images in limited edge cases with extensive prompt effort, I stand by being surprised (and by calling your tar metaphor a poor and misleading one).
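
        Some back-of-the-envelope arithmetic on why whole-training-set reproduction is off the table. The figures are rough, commonly cited approximations for a Stable Diffusion v1-class model and a LAION-scale dataset, used purely for illustration:

```python
# Rough arithmetic: how many bytes of model weights exist per training image?
# All figures below are approximate, publicly cited ballpark numbers, not measurements.
num_parameters = 1.0e9       # ~1B parameters for a Stable Diffusion v1-class model
bytes_per_param = 2          # fp16 weights
num_training_images = 2.0e9  # LAION-2B-scale training set

model_bytes = num_parameters * bytes_per_param
bytes_per_image = model_bytes / num_training_images

print(f"Model size: ~{model_bytes / 1e9:.1f} GB")                         # ~2.0 GB
print(f"Weight budget per training image: ~{bytes_per_image:.1f} bytes")  # ~1.0 bytes

# At roughly one byte of weights per training image, the weights cannot be storing
# the images themselves; the reproductions people have demonstrated involve heavily
# duplicated training images and a lot of deliberate prompting.
```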

        If we’re going with poor metaphors, I could offer the alternative of saying that distributing or offering a cloud-based Photoshop isn’t infringement even though it can be used to reproduce copyrighted material. And much like diffusion-based models and unlike a tarball, Photoshop requires creative input and effort from a user in order to produce infringing material.