A mother and her 14-year-old daughter are advocating for better protections for victims after AI-generated nude images of the teen and other female classmates were circulated at a high school in New Jersey.

Meanwhile, on the other side of the country, officials are investigating an incident involving a teenage boy who allegedly used artificial intelligence to create and distribute similar images of other students, also teen girls, who attend a high school in suburban Seattle, Washington.

The disturbing cases have put a spotlight yet again on explicit AI-generated material that overwhelmingly harms women and children and is booming online at an unprecedented rate. According to an analysis by independent researcher Genevieve Oh that was shared with The Associated Press, more than 143,000 new deepfake videos were posted online this year, which surpasses every other year combined.

  • Treczoks@lemm.ee · 10 months ago

    You already need consent to take a person’s picture. Did it help in this case? I don’t think so.

      • JonEFive@midwest.social · 10 months ago

        *in the US.

        In the US, the thought is that if you are in a public place, you have no reasonable expectation of privacy. If you’re walking down the street, or shopping in a grocery store or whatever else, anyone can snap a picture of you.

        Other countries have different values and laws such that you may need a person’s permission to photograph them even if they are in a public place.

        • afraid_of_zombies@lemmy.world · 10 months ago

          That thought is a pile of bull crap. If you really think you have zero expectation of privacy, then I have the right to follow right behind you with a sign that says “idiot ahead”. Laws like this are written for the drug war and for big media, not for us.

          • JonEFive@midwest.social · 10 months ago

            Not saying I agree with it; that’s just the way the laws are written.

            A good example of how badly this law works out is paparazzi. They harass celebrities just to get any halfway decent photo. Then they can sell the photo, and the celebrity has no say in the matter. And to make things even worse, if the celebrity happens to use the photo of themselves in any way, the photographer can demand payment because they own the copyright.

      • Treczoks@lemm.ee · 10 months ago

        Sorry, I forgot that the US is decades behind the rest of the world in privacy laws.

        Well, maybe you could start with this aspect.

    • afraid_of_zombies@lemmy.world · 10 months ago

      Really? Please show me the signed and notarized letter with the girl’s name on it that says she agrees to have her image used for AI porn. And since she is a minor, her legal guardians’ consent too.

      • CommanderCloon · 10 months ago

        How would you possibly enforce that, or prevent people from just copying publicly available pictures for nefarious use?

        • afraid_of_zombies@lemmy.world · 10 months ago

          It would have to be enforced after the person gets caught, as an add-on charge. Like if an area has a rule against picking locks to commit a crime: you can never be charged with it alone, but it can be added on to existing charges.