A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting how widely generative AI is being put to nefarious use.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • NauticalNoodle · 19 days ago

So are you suggesting they can get an unaltered facial I.D. of the kids in the images? Because that would make it regular CSAM with a specific victim (as mentioned), not an AI-generated illustration.

    • emmy67@lemmy.world · 19 days ago

      No, I'm telling you that CSAM images can't be generated by an algorithm that hasn't been trained on CSAM.

      • NauticalNoodle · 19 days ago

        That’s patently false.

        I'm not going to keep entertaining this discussion; instead, I'll direct you to the multiple other people who have already effectively disproven this and similar arguments elsewhere in this post's discussion. Enjoy.