In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.

What happens if we don’t get this under control? It will further blur the line between what’s real and what’s not, just as politics grow more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we approach the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than fake images.

“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”

  • Flying Squid@lemmy.world · 9 months ago

    Unfortunately, I don’t know that there is much to be done at this point. Even if every form of deepfakery were outlawed in the U.S., people would just do it via another country that allows it. They could hide what they were doing with a VPN.

    The only way to even come close to truly combating this would be an international treaty. And even then, I think it’s highly unlikely to get all of the nations to sign on to it, so people would just do it via Belarus or something.

    Even detection tools will not do the trick, because, just like with malware, it will be a never-ending arms race between detection tools and deepfakes engineered to evade them.

    At best, we can stop the wound from gushing so much.

    • umbrella · 9 months ago

      yeah we should have taken it slow with these tools and understood them first. the cat is out of the bag now, i can only imagine the extent of political manipulation that will happen eventually.

    • magnusrufus@lemmy.world · 9 months ago

      Treat the problem like cp. There is plenty to be done at this point. Not being able to fix it 100% doesn’t mean we shouldn’t try to fix it at all.

      • jkrtn · 9 months ago

        If someone completely independently generates and distributes pornography that ends up looking too much like a real person, and someone else downloads and keeps that image, should the downloader be prosecuted? That’s what it’s going to come down to, I think. If you want a law that requires intent, it will be too difficult to prove, and if you want a law that does not require intent, it may be a big overreach.

        It’s easier to write the law for CSAM because you have to be pretty fucked in the head to want to look at that in the first place. Making possession of it illegal isn’t interfering with normal human activity.

        • magnusrufus@lemmy.world · 9 months ago

          Those are pretty good concerns. I wonder if metadata about the model used and the prompt could be required, to address the issue of intent. I do think accidental downloading would have to be an exception, but if the material is clearly labelled/advertised, I think downloading it could still be targeted.
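As a sketch of what a metadata requirement like this could look like (not any existing standard; all names below are made up), a provenance record could bind the model and prompt to a specific image with a hash, so the record can’t simply be copied onto a different file:

```python
import hashlib
import json

def make_provenance_record(image_bytes: bytes, model: str, prompt: str) -> dict:
    # Hypothetical record tying generation metadata to one specific image
    return {
        "model": model,
        "prompt": prompt,
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }

def verify_provenance(image_bytes: bytes, record: dict) -> bool:
    # The record only matches if the image is byte-for-byte unchanged
    return record["image_sha256"] == hashlib.sha256(image_bytes).hexdigest()

img = b"\x89PNG...fake image bytes for illustration"
rec = make_provenance_record(img, "example-model-v1", "a portrait photo")
print(json.dumps(rec, indent=2))
print(verify_provenance(img, rec))          # True
print(verify_provenance(img + b"x", rec))   # False: any edit breaks the binding
```

A hash-bound record like this would at least make the labelling tamper-evident, though (as the reply below notes) nothing forces a bad actor to attach it in the first place.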

          • jkrtn · 9 months ago

            Sure, but then that’s a home run for every defense lawyer assigned to these. “Your honor, my client thought they were real photos she published” becomes a legitimate defense. “My client didn’t realize a real person was involved at all; he thought the image was entirely fictitious.” People publishing AI fakes aren’t going to add EXIF data, and requiring metadata on AI-generated images sounds like its own separate overreach.

            • magnusrufus@lemmy.world · 9 months ago

              I’d say, for the sake of not jailing the innocent, that letting the ones who can plausibly deny beat the charges is what we’d have to accept. Requiring that metadata would certainly be a significant new requirement, but that doesn’t necessarily make it an overreach. Pornography producers already have to verify the age of their participants, and every porn site carries a legal statement about the age of the people depicted. Categorize the metadata requirements similarly.

      • Flying Squid@lemmy.world · 9 months ago

        You can’t treat it that way, because this is something that a complicit media is willing to share, and you cannot stop them from sharing it without going into major First Amendment violation territory.

        • magnusrufus@lemmy.world · 9 months ago

          Sure you can. CP is something that complicit people are still willing to share. Generating and distributing nonconsensual fake porn of people doesn’t need to be covered by the First Amendment. Decide as a society that it’s fucking gross and should be illegal and then treat violations of the laws created against it harshly.

          • Flying Squid@lemmy.world · 9 months ago

            Decide as a society that it’s fucking gross and should be illegal and then treat violations of the laws created against it harshly.

            That is not something that society has done in a long time. You are talking about something that was never legal in the first place vs. making something illegal that was already legal.

            • magnusrufus@lemmy.world · 9 months ago

              Was CP never legal? Are you sure we haven’t made things illegal that were previously legal? People have this weirdly defeatist view about regulating AI deepfakes that doesn’t seem based on anything solid.

              • Flying Squid@lemmy.world · 9 months ago

                Was CP never legal?

                No, it was never legal.

                Are you sure we haven’t made things illegal that were previously legal?

                Please re-read my post and do not put words in my mouth.

                People have this weirdly defeatist view about regulating AI deepfakes that doesn’t seem based on anything solid.

                I gave extremely solid reasoning. What have you said that is so solid?

                  • magnusrufus@lemmy.world · 9 months ago

                  Nah, you haven’t really backed that up with any solid reasoning. All laws have a date they were codified and enacted; before that date, the activities they cover were not yet illegal. CP was at some point legal. Not long ago, marital rape was perfectly legal. Now it’s not. Revenge porn laws are going into effect. You totally can take something awful that was legal and make it illegal. It might not be immediate, perfect, or without resistance, but it can be done and has been done, even recently.

                    • Flying Squid@lemmy.world · 9 months ago

                    It was never legal because all pornography was illegal first. And unless you want to go back to pre-Christian Rome, that’s how it’s been for centuries.

                    I’m still waiting for your solid reasoning, because so far, your ‘reasoning’ has been ‘you can totally do it.’