A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • yamanii@lemmy.world · 9 months ago

    What do you mean soon? Local models from civitai have been able to generate CSAM for at least 2 years. I don’t think it’s possible to stop it unless the model creator does something to prevent it from generating naked people in general, like the neutered SDXL.

    • ocassionallyaduck@lemmy.world · 9 months ago

      True. For obvious reasons I haven’t looked too deeply down that rabbit hole, because RIP my search history, but I kind of assumed it would be soon. I’m thinking more specifically about models like SORA, though, where you could feed it enough input, then type a sentence to get video content. That is going to be a different level of darkness.