A bipartisan group of US senators introduced a bill Tuesday that would penalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • Serinus@lemmy.world · 9 months ago

    I’d really prefer that people not send my parents any kind of porn.

    I look at it like someone took my face out of a Facebook picture, printed it, cut it out, pasted it over some porn, and did the same thing.

    It’d be a weird thing for them to do, but I don’t really need to send the law after them for it. Maybe for harassment?

    Laws have a cost, even well-intentioned ones. I don’t believe we need new ones for this.

    • gapbetweenus@feddit.de · 9 months ago

      Do you think people might change their opinion of you and act differently after seeing you performing in porn?

      Laws have a cost, even well-intentioned ones.

      It causes distress to victims, arguably violates personal rights, and is at the very least morally and ethically questionable. What would be the downsides of criminal prosecution for non-consensual sexual deepfakes?

      • Serinus@lemmy.world · 9 months ago

        If they understand that this kind of porn exists? No.

        But that’s an education thing, not a legal thing.

        The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

        I don’t see this law removing any fake Taylor Swift porn from the Internet. Or really any other celebrity, for that matter.

        • gapbetweenus@feddit.de · 9 months ago

          If they understand that this kind of porn exists? No.

          You know how people form opinions about actors based on their roles in movies? People will change what they think of you and how they act toward you based on media, even when it’s clearly fictional.

          The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

          How exactly? What new powers to violate digital privacy does this bill give the state?

      • Montagge@kbin.social · 9 months ago

        Yeah, but it’s mostly happening to women, so these commenters probably don’t really care.

        • gapbetweenus@feddit.de · 9 months ago

          I think a lot of men unfortunately have difficulty empathizing with women here, because they have a rather different experience when it comes to expressing their sexuality and the possible negative consequences.