A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of the problem, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.
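For the technically curious, the mechanism described above is essentially a banned-term check sitting in front of the search endpoint. Below is a minimal sketch of how such an interception layer could work; the function names, the placeholder terms, and the return structure are all hypothetical, since the article does not publish Pornhub’s actual implementation.

```python
# Hypothetical sketch of a search-interception layer like the one the
# trial describes. Names and structure are illustrative, not Pornhub's code.

BANNED_TERMS = {"<curated term 1>", "<curated term 2>"}  # real list: ~28,000 multilingual entries

def normalize(query: str) -> str:
    """Lowercase and collapse whitespace so trivial variants still match."""
    return " ".join(query.lower().split())

def run_normal_search(query: str) -> list[str]:
    """Placeholder for the site's real search backend."""
    return []

def handle_search(query: str) -> dict:
    q = normalize(query)
    if any(term in q for term in BANNED_TERMS):
        # Return no results; show the deterrence warning and the chatbot
        # pointing to support services (the trial used Stop It Now).
        return {
            "results": [],
            "warning": "Searching for this material is illegal.",
            "support": "https://www.stopitnow.org.uk/",
        }
    return {"results": run_normal_search(q)}
```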

  • CameronDev@programming.dev · 10 months ago (+164/-4)

    That kinda sounds reasonable. Especially if it can prevent someone going down that rabbit hole? Good job, PH.

  • FraidyBear@lemmy.world · 10 months ago (+124/-3)

    Imagine a porn site telling you to seek help because you’re a filthy pervert. That’s gotta push some to get help, I’d think.

    • John_McMurray@lemmy.world · 10 months ago (+47/-4)

      Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline too, it didn’t stop anything, it just told them “Not here”

      • abhibeckert@lemmy.world · 10 months ago (+19/-1)

        We have culturally drawn a line in the sand where one side is legal and the other side of the line is illegal.

        Of course the real world isn’t like that - there’s a range of material available and a lot of it is pretty close to being abusive material, while still being perfectly legal because it falls on the right side of someone’s date of birth.

        It sounds like this initiative by Pornhub’s chatbot successfully pushes people away from borderline content… I’m not sure I buy that… but if it’s directing some of those users to support services then that’s a good thing. I worry, though, that some people might instead be pushed over to the dark web.

        • John_McMurray@lemmy.world · 10 months ago (+16/-4)

          Yeah…I forgot that the UK classifies some activities between consenting adults as “abusive”, and it seems some people are now using that definition in the real world.

          • Scirocco@lemm.ee · 10 months ago (+2)

            Facesitting porn (of adults) is illegal in the UK on the grounds that it’s potentially dangerous.

            • Quicky@lemmy.world · 10 months ago (+4)

              Which led to some amazing protests.

              Weirdly, watching facesitting porn in the UK is perfectly fine, as long as it wasn’t filmed in the UK.

              I can just imagine trying to defend that in court. “Your honour, it’s clear to me that the muffled moans of the face-sittee are those of a Frenchman”

      • A_Random_Idiot@lemmy.world · 10 months ago (+14)

        I mean, is it dumb?

        Didn’t Pornhub face a massive lawsuit or something because of the amount of unmoderated child porn that was hidden in its bowels by uploaders (in addition to rape victims, revenge porn, etc.), to the point that they apparently only allow verified uploaders now and purged a huge swath of their videos?

      • theherk@lemmy.world · 10 months ago (+14/-2)

        Until a few years ago, when they finally stopped allowing unmoderated user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many who were the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally did, but they trained users for years that it was a place to find that content.

          • theherk@lemmy.world · 10 months ago (+9/-8)

            You know you could easily say some dumb shit like that to somebody whose daughter wound up fighting a long time to remove herself from the site. ¯\_(ツ)_/¯

              • r3df0x ✡️✝☪️@7.62x54r.ru · 10 months ago (+5/-3)

                Pornhub left up underage child rape videos until they were very publicly called out for it.

                Porn is also a method of bourgeois oppression. The corporate elites want you to be an easily controlled consoomer.

              • theherk@lemmy.world · 10 months ago (+5/-4)

                What did I say that was dumb? I said “until a few years ago”, and that is true. And I have firsthand experience with the trouble they wouldn’t go through to deal with it. To imply that I’m just choking down what the government is selling is simply not reasonable.

              • VirtualOdour@sh.itjust.works · 10 months ago (+2/-2)

                You’re wasting your time. They’re posting on Lemmy, where it’s not even possible to remove a picture you posted, let alone one of you posted by someone else. The fact that they’re still mad Pornhub had a similar problem and solved it effectively makes it pretty obvious they’re looking for an excuse for an ideological crusade against people they’ve already decided to hate.

      • r3df0x ✡️✝☪️@7.62x54r.ru · 10 months ago (+2/-4)

        Pornhub also knowingly hosted child porn. The game Ready or Not put them on blast for it with a mission where you raid a company called “Mindjot” for distributing child porn.

      • Gabu@lemmy.world · 10 months ago (+7)

        Didn’t they just block certain search terms (which actually made the site somewhat difficult to use for legitimate/legal content)?

  • Mostly_Gristle@lemmy.world · 10 months ago (+69/-1)

    The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn’t attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.

        • Lemmy@lemm.ee · 10 months ago (+21)

          Same thing for me when I was 13. I freaked the fuck out when I saw a Wikipedia article on the right. I thought I was going to jail the next day lmfao

      • Dran@lemmy.world · 10 months ago (+31)

        I’d think it’s probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of “gross old people” and being confused why I couldn’t find anything. Kids are stupid lol, that’s why laws protecting them need to exist.

        Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.

        • kylian0087@lemmy.world · 10 months ago (+14/-1)

          I was the same back then, and I came across some stuff that was surprisingly easy to find. Only later did I realize how messed up that was.

          I think monitoring is good, but there’s a fine line not to cross with your child’s privacy. If they suspect anything, they sure know how to work around it, and you lose any insight.

        • Rinox@feddit.it · 10 months ago (+6)

          It’s not about laws, it’s about sexual education. Sexual education is a topic that can’t be left to the parents and should be explained in school, so as to give the kids a complete knowledge base.

          Most parents know about sex as much as they know about medicines. They’ve had some, but that doesn’t give them a degree for teaching that stuff.

        • Piece_Maker@feddit.uk · 10 months ago (+2)

          Sorry I know this is a serious subject and not a laughing matter but that’s a funny situation. I guess I was a MILF hunter at that age because even then I was perfectly happy to knock one out watching adult porn instead!

  • pHr34kY@lemmy.world · 10 months ago (+55/-1)

    4.4 million sounds a bit excessive. Facebook marketplace intercepted my search for “unwanted gift” once and insisted I seek help. These things have a lot of false positives.

      • deur@feddit.nl · 10 months ago (+19/-1)

        Probably just looking for deals on new stuff that people don’t care about having been gifted.

        I could definitely see “unwanted gift” being a code word for trafficking :(

        • T156@lemmy.world · 10 months ago (+9)

          Not necessarily trafficking, but could be trafficking-adjacent.

          There used to be “child rehoming” ‘services’ on Facebook and the like, for people who regretted adopting a kid and wanted to pass them to others. Here’s a fairly in-depth article on the whole affair. Unsurprisingly, it didn’t go well.

          EDIT: In hindsight, “unwanted gift” could also be about people getting unexpectedly pregnant, and putting the resulting child up for adoption, but not wanting to go through legal means for one reason or another, which seems a more likely answer.

        • Pantherina@feddit.de · 10 months ago (+2)

          Lol, makes sense. Meta being really meta here, but if that’s needed… better too much than too little.

        • michaelmrose@lemmy.world · 10 months ago (+1)

          Do you really think human traffickers are listing people under secret codes on accounts obviously linked to their real identity, with their real face? Remember the IKEA thing, where vendors who didn’t specify a price received an absurd default price for their goods, e.g. 9999.99, and people claimed that furniture listed at that price corresponded to kids being sold?

      • pHr34kY@lemmy.world · 10 months ago (+8)

        On Facebook marketplace just after Christmas? A potential bargain on unopened merch, of course!

  • Blackmist@feddit.uk · 10 months ago (+45/-6)

    Did it? Or did it make them look elsewhere?

    The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

      • michaelmrose@lemmy.world · 10 months ago (+1)

        Reasonable adult sites don’t return obviously sketchy things for reasonable queries, e.g. you don’t search “boobs” and get 12-year-olds.

      • PM_Your_Nudes_Please@lemmy.world · 10 months ago (+1)

        And what days were those? Cuz you pretty much need to go all the way back to pre-internet days. Hell, even that isn’t far enough, cuz Playboy’s youngest model was like 12 at one point.

        • The Snark Urge@lemmy.world · 10 months ago (+1)

          Depressing, isn’t it? I was more talking about how prevalent “fauxcest” has become in porn more recently. I guess that’s just my cross to bear as an only child 💅

      • Blackmist@feddit.uk · 10 months ago (+13)

        I kind of want to trigger it to see what searches it reacts to, but at the same time I don’t want my IP address on a watchlist.

    • EdibleFriend@lemmy.world · 10 months ago (+8)

      “given the amount of extremely edgy content already on Pornhub, this is kinda sus”

      Yeah… I am honestly curious what these search terms were, and how many of those were ACTUALLY looking for CP. And of those… how many are now flagged somehow?

      • Arsonistic · 10 months ago (+2)

        I know I got the warning when I searched for young gymnast or something like that cuz I was trying to find a specific video I had seen before. False positives can be annoying, but that’s the only time I’ve ever encountered it.

      • The_wild_card · 10 months ago (+12/-7)

        I thought the porn industry was one of the worst to work in? Or is this a holesome joke?

      • The_wild_card · 10 months ago (+7/-3)

        Yeah, I agree; I made another comment about it in this thread. But still, they are helping people with mental issues, so it’s at least a little more wholesome than before.

  • ocassionallyaduck@lemmy.world · 10 months ago (+31/-7)

    This is one of the more horrifying features of the future of generative AI.

    There is literally no stopping it at this stage: AI-generated CSAM will be possible soon thanks to systems like Sora.

    This is disgusting and awful. But one part of me hopes it can end the black market of real CSAM content forever. By flooding it with infinite fakes, users with that sickness can look at something that didn’t come from a real child’s suffering. It’s the darkest of silver linings, I think, but I spoke with many sexual abuse survivors who feel the same about the loli hentai in Japan, in that it could be an outlet for these individuals instead of them finding their own.

    Dark topics. But I hope to see more actions like this in the future. If pedos can self isolate from IRL interactions and curb their ways with content that harms no one, then everyone wins.

    • gapbetweenus@feddit.de · 10 months ago (+34/-1)

      The question is whether consuming AI CP helps regulate a pedophile’s behavior or enables a progression of the condition. As far as I know, that is an unanswered question.

        • gapbetweenus@feddit.de · 10 months ago (+14/-3)

          For porn in general, yes, I think the data is rather clear. But for CP or related substitute content it’s not that definitive (to my knowledge), if only because it’s really difficult to collect data on such a sensitive topic.

          • Asafum@feddit.nl · 10 months ago (+10/-1)

            Why would it be any different? If it’s about sexual gratification by their chosen media then I’d imagine it wouldn’t matter what the subject was, but obviously it’s always necessary to get actual data to know for sure.

            • Baahb@lemmy.world · 10 months ago (+7/-3)

              I think you’re making assumptions that aren’t fair but maybe aren’t obvious either. Honestly I’m only thinking about this because I just watched the contrapoints video on twilight, and so I’ll use her example, though she’s talking about a slightly different topic. Gonna paraphrase like a mofo:

              Weird Power dynamics between partners in a fantasy, like twilight, or say porn since we are being obvious here, is normal because self image often requires women to present one way while hiding their desires for sex. It’s absolution of a sort, and is ostensibly healthy to explore in this way. That said… Some examples such as race play in fantasies may dehumanize the “other” in super not cool ways and reinforce negative racial stereotypes.

              If we take that and extend it to pedophiles, normalization of the thought process leading to that sort of dysfunction/disorder seems like a not-great thing. But yeah, we’d need a study to learn more, and that seems both difficult and likely undesirable for the researchers.

            • gapbetweenus@feddit.de · 10 months ago (+6/-5)

              Why would it be any different?

              Because pedophiles display pathological deviation when it comes to sexual attraction.

      • HonoraryMancunian@lemmy.world · 10 months ago (+5/-1)

        Another question is, how will the authorities know the difference? An actual CSAM-haver can just claim it’s AI.

    • yamanii@lemmy.world · 10 months ago (+5)

      What do you mean “soon”? Local models from Civitai have been able to generate CSAM for at least 2 years. I don’t think it’s possible to stop it unless the model creator does something to prevent it from generating naked people in general, like the neutered SDXL.

      • ocassionallyaduck@lemmy.world · 10 months ago (+1)

        True. For obvious reasons I haven’t looked too deeply down that rabbit hole, because RIP my search history, but I kind of assumed it would be soon. I’m thinking more specifically about models like Sora, though, where you could feed it enough input, then type a sentence to get video content. That is going to be a different level of darkness.

    • Zorque@kbin.social · 10 months ago (+1)

      Are… we looking at the same article? This isn’t about AI generated CSAM, it’s about redirecting those who are searching for CSAM to support services.

      • ocassionallyaduck@lemmy.world · 10 months ago (+1)

        Yes, but this is more about mitigating the spread of CSAM, and my feeling is it’s going to become somewhat impossible soon. AI-generated porn is starting to flood the market, and this chatbot is also one of those “smart” attempts to mitigate this behavior. I’m saying that very soon it will be something users don’t have to go anywhere to get, if the model can just fabricate it out of thin air, so the chatbot mitigation is only temporary, and the dark web of actual CSAM material will become overwhelmed and swamped in artificially generated new tidal waves of artificial CP. So it’s an alarming ethical dilemma we are on the horizon of that we need to think about.

      • ocassionallyaduck@lemmy.world · 10 months ago (+1)

        So your takeaway is I’m… against AI generative images and thus I “protest too much”?

        I can’t tell if you’re pro AI and dislike me, or pro loli hentai and thus dislike me.

        Dude, AI images and AI video are inevitable. To pretend that won’t have huge effects on society is stupid. It’s going to reshape all news media, very quickly. If Reddit is 99% AI-generated bot spam garbage with no verification of what is authentic, Reddit is functionally dead, and we are on a train with no brakes in that direction for most public forums.

          • ocassionallyaduck@lemmy.world · 10 months ago (+1)

            You should probably research the phrase “protest too much” and the word “schtick” then.

            I’m not trying to clutch pearls here, as another poster here commented this isn’t a theoretical concern.

  • Kairos@lemmy.today · 10 months ago (+22)

    Oh, it’s just an experiment. The headline made me think someone was suing over this.

    • Gabu@lemmy.world · 10 months ago (+18)

      Not since the wipe, AFAIK. Still, at the bottom of the page you can (or at least could, haven’t used their services in a while) see a list of recent searches from all users, and you’d often find some disturbing shit.

    • BowtiesAreCool@lemmy.world · 10 months ago (+19/-3)

      If you read the paragraph that’s literally right there, it says it triggered when certain terms were searched by the user.

      • KrankyKong@lemmy.world · 10 months ago (+8)

        …That paragraph doesn’t say anything about whether or not the material is on the site though. I had the same reaction as the other person, and I didn’t misread the paragraph that’s literally right there.

  • Kusimulkku@lemm.ee · 10 months ago (+15/-1)

    I was wondering what sort of phrases get that notification, but mentioning them might be a bit counterproductive.

    • Thorny_Insight@lemm.ee · 10 months ago (+8)

      I’m not sure if it’s related but as a life-long miniskirt lover I’ve noticed that many sites no longer return results for the term “schoolgirl” and instead you need to search for a “student”

    • Squire1039@lemm.ee (OP) · 10 months ago (+4/-1)

      ML models have been shown to be extraordinarily good at statistically guessing your words. The words covered are probably comprehensive.

      • Kusimulkku@lemm.ee · 10 months ago (+13)

        I think the other article talks about it being a manually curated list, because while ML can get correct words, it also gets random stuff, so you need to check it isn’t making spurious connections. It’s pretty interesting how it all works.

    • Bgugi@lemmy.world · 10 months ago (+2)

      “Aylo maintains a list of more than 28,000 banned terms in multiple languages, which is constantly being updated.”

      I’d be very curious what these terms are, but I wouldn’t be surprised if “pizza guy” or “school uniform” would trigger a response.
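As a toy illustration of why a flat term list generates false positives, here is a minimal sketch. The phrases are hypothetical stand-ins borrowed from anecdotes elsewhere in this thread (“young gymnast”, “slutty schoolgirl costume”), since the real 28,000-term list is not public; the point is that keyword matching has no notion of intent:

```python
# Toy demonstration of false positives from a flat keyword list.
# BANNED_PHRASES is a hypothetical stand-in; Aylo's real list is not public.
BANNED_PHRASES = {"schoolgirl", "young gymnast"}

def is_flagged(query: str) -> bool:
    """Flag any query containing a banned phrase, regardless of context."""
    q = query.lower()
    return any(phrase in q for phrase in BANNED_PHRASES)

# A costume shopper and an actual offender produce the same signal:
for q in ["slutty schoolgirl costume", "young gymnast floor routine"]:
    print(q, "->", is_flagged(q))  # both print True
```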

  • n3uroh4lt · 10 months ago (+14/-1)

    The original report from the researchers can be found here: https://www.iwf.org.uk/about-us/why-we-exist/our-research/rethink-chatbot-evaluation/. The researchers said:

    The chatbot was displayed 2.8 million times between March 2022 and August 2023, resulting in 1,656 requests for more information and Stop It Now services; and 490 click-throughs to the Stop It Now website.

    So from 4.4 million banned queries, the chatbot was displayed only 2.8 million times (within the date interval in the quote above), and there were only 490 clicks to seek help. Ngl, kinda underwhelming. And I also think, given the amount of extremely edgy content already on Pornhub, this is kinda sus.
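For scale, a quick back-of-the-envelope calculation of that funnel, using only the figures quoted above:

```python
# Conversion funnel computed from the report figures quoted above.
banned_queries   = 4_400_000  # searches that tripped the banned-term list
chatbot_displays = 2_800_000  # times the chatbot was shown (Mar 2022 - Aug 2023)
info_requests    = 1_656      # requests for more info / Stop It Now services
click_throughs   = 490        # clicks through to the Stop It Now website

print(f"chatbot shown per banned query: {chatbot_displays / banned_queries:.1%}")  # ~63.6%
print(f"info requests per display:      {info_requests / chatbot_displays:.4%}")   # ~0.0591%
print(f"click-throughs per display:     {click_throughs / chatbot_displays:.4%}")  # ~0.0175%
```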

    • laughterlaughter@lemmy.world · 10 months ago (+2)

      It’s not really that underwhelming. Disclaimer: I don’t condone child abuse. I find it abhorrent, and I will never justify it.

      People have fantasies, though. If a dude searches for “burglar breaks in and has sex with milf,” does that mean that he wants to do this in real life? Of course not (or god I hope not!) So, some people may have searched for “dad has sex with young babysitter” and bam! Bot! Some people have a fetish for diapers - there are tons of porn of adults wearing diapers and having sex. Not my thing, but who am I to judge? So again, someone searches “sex with diapers” and bam! Bot!

      Let’s not forget that as much as Pornhub displays a sign saying “Hey, are you 18?”, a lot of people will lie. And those young folks will also search for stupid things.

      So I don’t think that aaaaaall 1+ million searches were done by people with actual pedophilia.

      The fact that 1,600 people decided to click and inform themselves, in the UK alone, well, that’s a lot, in my opinion, and it should be something to commend, not to just say “eh. Underwhelming.”

  • Socsa@sh.itjust.works · 10 months ago (+1)

    Google does this too, my wife was searching for “slutty schoolgirl” costumes and Google was like “have a seat ma’am”

    • prole@sh.itjust.works · 10 months ago (+1)

      Google now gives you links to rehabs and addiction recovery centers when searching for harm reduction information about non-addictive drugs.

        • gapbetweenus@feddit.de · 10 months ago (+1)

          Sexuality is tightly connected to societal taboos. As long as everyone involved is a consenting adult, it’s no one else’s business. There is no need for, or benefit in, moralizing people’s sexuality.

          • r3df0x ✡️✝☪️@7.62x54r.ru · 10 months ago (+0/-1)

            It’s still weird to sexualize children. It’s less weird when it’s teenagers and everyone is of age but it’s a weird thing to engage in constantly.

  • Gakomi@lemmy.world · 10 months ago (+1)

    To be fair, people are dumb as fuck. Don’t search for illegal things on Google or any well-known site, because that’s how you end up on some watch list.