• Pooptimist@lemmy.world (+126/−28, 8 months ago)

    Hear me out on this one:

    If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn’t it be better for these people to get their fix from AI-created media than from the real thing?

    IMO no harm was done to any kid in the creation of this, and it would be better to give these people the fix they need, or at least desperately desire, in this way before they advance to more desperate and harmful measures.

    • DrPop@lemmy.one (+80/−2, 8 months ago)

      You have a point, but in at least one of these cases the images used were of girls around them, and they even tried extorting one of them. Issues like this should be handled on a case-by-case basis.

    • Queen HawlSera@lemm.ee (+34/−5, 8 months ago)

      That’s basically how I feel. I’d much rather these kinds of people jack it to drawings and AI-generated images if the alternative is that they’re going to go after real children.

        • Black_Gulaman@lemmy.dbzer0.com (+11/−57, 8 months ago)

        At some point the fake images won’t do it for them anymore, and then they’ll turn their attention to real kids. We don’t want to wait for that to happen.

        It’s like using a drug whose threshold increases each time you use it: there will come a time when your old limit no longer has any effect on your satisfaction level.

          • MBM@lemmings.world (+1/−1, 8 months ago)

            None of us are specialists here, so people saying it’s harmless and people saying it isn’t are both speculating.

          • datavoid (+9/−10, 8 months ago)

            Seems like speculation, but personally I’d be amazed if it were completely incorrect.

            If people who are attracted to children are constantly looking at CP, they are inevitably going to become more comfortable with it. Same with any other type of porn: do you think people who watch tons of torture porn don’t become increasingly unaffected by it? It’s the same for any other illegal or shocking content. I spent enough time on 4chan 10+ years ago to vouch for this personally.

            I’m not saying that everyone who looks at these AI images will act on their desire, but some people will absolutely end up wanting more after having open access to pictures of naked children.

            Honestly it’s a bit concerning how people are voting this down, why do we value the sexual gratification of pedos higher than the potential safety of children?

            • foggenbooty@lemmy.world (+24/−2, 8 months ago)

              This is the War on Drugs argument all over again, except using porn instead of marijuana as a “gateway”.

              You’re correct that there can be some crossover and some unstable people could have an addiction that gets out of control, but I don’t think there’s any proof that happens in high enough numbers.

        • Neil (+39/−5, 8 months ago)

          deleted by creator

        • papertowels@lemmy.one (+30/−2, edited, 8 months ago)

          By your logic, does everyone who’s into bdsm have a sex dungeon in their bedroom?

          Your comment reduces everyone to their base fetishes, as if that were the only thing enacting pressure on an individual to act, and I don’t believe that’s the case.

          • Queen HawlSera@lemm.ee (+8, 8 months ago)

            I’ll come right out and say it, I’m into inflation.

            The number of times I’ve gone out, bought a helium tank, and shoved a tube up anyone’s ass is just about equal to the number of times I’ve been the Republican candidate for the US presidency… and I’m not even 35 yet.

            I think we all have weird kinks, it’s a part of the human experience.

            Heck imagine if we thought this way for EVERY sexual desire someone had.

            “Porn for people who prefer blondes? I dunno, what if they get carried away and start dyeing random brown-haired people? The consequences are too great!”

            Sounds fucking ridiculous when you think of it that way.

        • Queen HawlSera@lemm.ee (+13/−1, edited, 8 months ago)

          Do you know how much porn there is of the My Little Pony characters? Tons.

          Do you know how much of an epidemic there is of cartoon watchers going out and fucking ponies? Somewhere between null and zilch… maybe one or two extreme cases, but that’s around the same number of people who watch superhero movies and then try to jump off the roof in order to fly.

          This is a slippery slope fallacy if I ever saw it.

          Heck, if anything we’ve seen that restrictions on porn actually lead to increased instances of sexual assault, in the same way a crackdown on drugs just leads to more deaths from overdoses.

          If letting some sicko have fake images of pretend children saves even one real child from being viciously exploited, I think it’s worth it.

          It’s not ideal, and yeah, it makes the skin of any sane person crawl… Ideally we’d be out curing pedophiles of their sexual urges entirely, but since we don’t have a way to do that, why let perfect be the enemy of good? I mean, what other ideas do we have? Because “To Catch a Predator” may have been good television, but even that had ethical concerns, ending in lost lawsuits and suicides, and castrating everyone convicted isn’t exactly 8th Amendment friendly… and even then, that prevents repeat offenses, not initial offenses. (Prevention > Cure)

          Now all this aside, we do need to look at this on a case by case basis. If real children are being used to model for the AI or fake images are used as a form of blackmail (Think “Revenge Porn”, but way, way worse), then cuffs need to be slapped on people.

    • random65837@lemmy.world (+16/−2, 8 months ago)

      In the US we ignore mental illness, make excuses for it, and then patiently wait until something terrible happens.

      • Queen HawlSera@lemm.ee (+4, 8 months ago)

        And when it does, we can’t do anything about it. “While making sure this never happens again is a noble goal, let’s not politicize this tragedy.”

        Or as they say over in Europe “Apparently the Americans say there’s no way to prevent that problem that literally doesn’t happen anywhere else in the world.”

    • hoshikarakitaridia@sh.itjust.works (+1, 8 months ago)

      Some of the comments here are so stupid: “either they unpedophile themselves or we just kill them for their thoughts”

      Ok, so let me think this through. Sexual preferences of any kind are pretty normal, and they don’t go away. Actually, if you try to ignore them they become stronger. Also, being a pedophile is not currently a crime; it’s the acting on it. So what happens right now is that people bottle it up until it gets too much, and then they act on it in gruesome ways, because “if I go to prison I might as well make sure it was worth it”. Kids get hurt.

      “But we could make thinking about it illegal!” No, we can’t. Say that’s a law; what now? If you don’t like someone, they’re a “pedophile”. Yay, more false imprisonment. And what happens to real pedophiles? They start committing more acts, because there’s punishment even for restraint. And the truth is, a lot of ppl have pedophilic tendencies. You will not catch all of them. Things will just get worse.

      So why AI? Well, as the commenter above me already said, if there’s no victim, there’s no problem. While that doesn’t make extortion legal (that’s obviously a different law), it could help ppl with those urges exercise more restraint. We could even limit it to specific sites and make it non-shareable. We’d have more control over it.

      I know ppl still want the easy solution which evidently doesn’t work, but imo this is a perfect solution.

      • Queen HawlSera@lemm.ee (+1, 8 months ago)

        I pretty much agree. While we should never treat pedophilia as “just another perfectly valid sexuality, let’s throw a parade, it’s nothing to be ashamed of” (having the urge to prey on children is ABSOLUTELY something to be ashamed of, even if you can’t control it), we need to face facts… It isn’t someone waking up one day and saying “Wouldn’t it be funny if I took little Billy out back and filled him full of cock?”

        It’s something going on in their head, something chemical, some misfiring of the neurons, just the way their endocrine system is built.

        As much as I’d love to wave a magic wand over these people I reluctantly call people and cure them of their desires, we don’t have the power to do that. No amount of therapy in the world can change someone’s sexual tastes.

        So in lieu of an ideal solution, finding ways to prevent pedophiles from seeking victims in the first place is the next best thing.

        It’s not dissimilar to how, when we set up centers where drug-addicted people can get small doses of what they’re addicted to so they can fight withdrawal symptoms, crime and death rates go down. When you enact things like universal basic income and SNAP, people have less reason to rob banks and gas stations, so we see fewer robberies.

        It’s not enough to punish people who do something wrong, we need to find out why they’re doing it and eliminate the underlying cause.

      • captain_spork@startrek.website (+61/−2, 8 months ago)

        That is not required. The larger models especially, like DALL-E 3, can combine concepts even without being directly trained on the combination. One example in the DALL-E 2 showcase was a chair shaped like an avocado: it knows what a chair is and it knows what an avocado is, so it can combine them. Likewise it can know “this is what a naked human looks like” and “this is what a human child looks like” and could combine them without ever having seen CSAM.

          • lolcatnip@reddthat.com (+23/−1, 8 months ago)

            “I don’t personally know what’s in the data set, so it must include CP” is a breathtakingly pathetic argument. Shame on you.

          • JokeDeity@lemm.ee (+8/−1, 8 months ago)

            See, this is the problem with this entire thread. You guys are, rightfully, upset about what you perceive, but then you take that and use that energy to spread false claims because you think they’re true, or you think the lie will help bolster your side of the argument. Instead it just makes you look disingenuous and paints a bad look for everyone. There’s plenty of accurate points to be made for why this stuff isn’t okay, but you guys are making none of them, very matter-of-factly. Do better. Be better. Make valid arguments, based on fact, or do what people used to do and sit back and let the people who know what they’re talking about make the arguments.

    • eatthecake@lemmy.world (+6/−20, 8 months ago)

      I don’t believe it’s a sickness. Humans vary in innumerable ways, and defining natural variations as sickness is a social decision, not a medical one. If you look at the DSM you will find that social problems are sometimes given as a reason for defining something as an illness. This is just the medicalisation of everything.
      Even if you grant that it’s a sickness, how does it follow that the sickness should therefore be treated with AI? I see no argument or logic here. Do you think harm would be done if the paedophile knows the child? If the child finds out they are the object of rape fantasies? If you find you are married to a person who gets off on raping children? Your children?
      Do you allow for disgust and horror at sadistic desires, or are we ‘not allowed to kink shame’?

            • PolarisFx@lemmy.dbzer0.com (+6/−1, 8 months ago)

              As someone who’s spent a couple of weeks down a Stable Diffusion rabbit hole, I can attest that models don’t need to be trained on CP to generate CP content. Using some very popular checkpoints, I inadvertently created some images I found questionable enough to immediately delete, and I wasn’t even using prompts meant to generate young girls. With the right prompts, I can easily see some of the more popular checkpoints pumping out CP.

            • SuddenlyBlowGreen@lemmy.world (+2/−14, 8 months ago)

              I think if it becomes widespread, like you want it to be, models that generate CSAM will be trained on such material, yes.

      • lolcatnip@reddthat.com (+17/−1, 8 months ago)

        Not child porn. AI produces images all the time of things that aren’t in its training set. That’s kind of the point of it.

        • SuddenlyBlowGreen@lemmy.world (+2/−17, edited, 8 months ago)

          AI produces images all the time of things that aren’t in its training set.

          AI models learn statistical connections from the data they’re provided. They’re going to see connections we can’t, but they’re not going to create things that aren’t connected to their training data. The closer the connection, the better the result.

          It’s a pretty easy conclusion from that that CSAM will be used to train such models, since training requires lots of data, and new data is needed to create different and better models…

          • BetaDoggo_@lemmy.world (+3/−1, 8 months ago)

            Real material is being used to train some models, but suggesting that this will encourage the creation of more “data” is silly. The amount required to fine-tune a model is tiny compared to the amount already known to exist. Just like how regular models haven’t driven people to create even more data to train on.

            • SuddenlyBlowGreen@lemmy.world (+2/−1, 8 months ago)

              Just like how regular models haven’t driven people to create even more data to train on.

              It has driven companies to try to get access to more data people generate to train the models on.

              Like chatGPT on copyrighted books, or google on emails, docs, etc.

              • BetaDoggo_@lemmy.world (+2, 8 months ago)

                And what does that have to do with the production of csam? In the example given the data already existed, they’ve just been more aggressive about collecting it.

                • SuddenlyBlowGreen@lemmy.world (+1/−1, 8 months ago)

                  Well, now in addition to regular pedos consuming CSAM, there are additional consumers who use huge datasets of it to train models.

                  If there is an increase in demand, the supply will increase as well.

    • treefrog@lemm.ee (+9/−27, 8 months ago)

      Sex offenders aren’t allowed to watch porn at all in my state.

      Because science suggests that watching porn, and getting your fix through porn, as you put it, encourages the behavior.

      Watching child porn teaches the mind to go to children to fulfill sexual urges. Mindfulness practice has been shown to be effective in curbing urges in all forms of addiction.

      So, no. Just no to your whole post.

      There’s effective treatment for addictions, whether sexual or otherwise, whether the addiction feeds on children or heroin. And we don’t need to see if fake child porn helps: evidence already suggests it doesn’t, and we already have effective treatments that don’t put children at risk and don’t encourage the behavior.

      • Lowlee Kun@feddit.de (+17, 8 months ago)

        Not judging/voting on your comment, but do you have the data at hand? Just out of interest.

        Some input, though: you are not making a distinction between offenders and non-offenders, and I doubt there is even good data on non-offenders to begin with.

      • Forbo (+12/−1, 8 months ago)

        As mentioned on another one of your comments, I am having a hard time finding the science you reference.

      • Pooptimist@lemmy.world (+5/−1, 8 months ago)

        This isn’t about addiction, it’s about sexuality. And you can’t just curb your whole sexuality away. These people have a disorder that makes them sexually attracted to children. At this point there is no harm done yet. They just are doomed to live a very unfulfilling life, because the people with whom they want to engage in sexual practices can’t give their consent, which is morally and legally required, no question about that. And most of them don’t give in to these urges and seek the help they need.

        But still, you can’t just meditate your whole sexuality away. I don’t want to assume, but I bet you also masturbate or pleasure yourself in one way or another; I know I do. And when I was young, fantasy was all I needed, but then I saw my first nude and watched my first porno and it progressed from there, and I’m sure fantasy won’t be enough for these people either. So when they get to the stage where they want to consume media, I’d prefer it to be AI-created images or some drawn hentai of a naked young girl or whatever, and not real abused children.

      • photonic_sorcerer@lemmy.dbzer0.com (+49/−3, 8 months ago)

        Bro I’ve watched a lot of regular porn and never once have I gone out and thought “why yes I’d sure like to rape that person”

        • Llamatron@lemmy.world (+3/−19, 8 months ago)

          But you will at least have an outlet if you get yourself a partner or hire an escort. There’s the prospect of sex in real life. You’re not forever limited to porn.

          • Lowlee Kun@feddit.de (+20/−1, 8 months ago)

            I haven’t had sex in years, yet luckily nobody thinks I’m a danger to women. It’s almost as if people don’t suddenly feel the need to rape someone just because they don’t have sex.

            • JokeDeity@lemm.ee (+7/−1, 8 months ago)

              Slippery slope arguments almost exclusively come from the only people they seem to affect. You see the same worrying mentality in religious people who tell you that without God they would be committing serious crimes. Most people have an inherent morality that these people seem to lack without strict legal or religious guidelines.

              • Lowlee Kun@feddit.de (+7, 8 months ago)

                I mean, that makes sense, I guess. I hate these “arguments” because they kill the debate. On the other hand, I’m totally not used to seeing any debate on this topic that doesn’t derail into people calling each other pedos. So props to most people in here.

      • Sir_Kevin@lemmy.dbzer0.com (+45/−7, 8 months ago)

        By that logic, almost everyone in Hollywood should be in prison for depicting violence, murder, rape, etc. in movies and shows. This argument was put to rest back in the ’90s.

      • khalic@lemmy.world (+4/−6, 8 months ago)

        They have their little secret forums and brigade a lot, don’t trust the up/down votes

    • snownyte@kbin.social (+15/−48, 8 months ago)

      That’d be like giving an alcoholic a pint at the end of the week as a reward for the very behavior they want out of.

      That’d be like giving money to a gambling addict as they promise to ‘pay you back’ for the loan you’ve given them.

      My point is, enabling people’s worst habits is always a bad idea.

      And how can you guarantee for certain that, after a while of this AI-generated CP crap, they eventually wouldn’t want the real thing and therefore attempt crimes?

      Your solution is just dumb altogether.

      • AnonTwo@kbin.social (+35/−3, 8 months ago)

        …Aren’t drug patches already a thing for more extreme drugs? I feel like you just gave bad examples when there’s actual examples that exist…

        • snownyte@kbin.social (+1, 8 months ago)

          I don’t claim to be an expert either, but it’s kind of a no-brainer to see what addiction is and what it does to people. Really simple stuff.

        • snownyte@kbin.social (+2, 8 months ago)

          You really like spamming that “slippery slope” term, don’t you? It’s like your ultimate go-to for feeling superior. Just wait until you use it in a context where it doesn’t fit; one of these days you’ll look like a dumbass.

          • Lowlee Kun@feddit.de (+1, 8 months ago)

            If I use such an “argument” and someone calls me out on it, I hope I take the critique to heart and think of an actual argument. Everyone looks like a dumbass from time to time, and so will I.

      • Kxpqzt@lemmy.world (+1, 8 months ago)

        No, it’s like flooding the rhino horn market with fake rhino horn. Literally.

    • PM_Your_Nudes_Please@lemmy.world (+4/−45, 8 months ago)

      While I don’t disagree with the initial premise, image AI requires training images.

      I suppose technically you could use hyper-realistic CGI CSAM, and then it could potentially be a “victimless” crime. But the chances of an AI being trained solely on CGI are basically non-existent. Photorealistic CGI is tough and takes a lot of time and skill to create from scratch. There are people whose entire careers are built upon photorealism, and their services aren’t cheap. And you’d probably need a team of artists (not just one, because the AI would inevitably end up learning whatever their “style” is and nothing more) who are both capable and willing to create said images. The chances of all of those pieces falling into place are damned near zero.

      Maybe you could supplement the CGI with young-looking pornstar material? There are plenty of pornstars whose entire schtick is looking young. But they definitely don’t look like children, because the proportions are obviously all wrong; children have larger heads relative to their bodies, for example. That’s not something an adult actress can emulate simply by being flat-chested. So these supplemental images could just as easily end up polluting (for lack of a better word) your AI’s training, because it would just learn to spit out images of flat-chested adult women.

      • db0@lemmy.dbzer0.com (+33/−2, 8 months ago)

        Generative AI is perfectly capable of combining concepts. Teach it how to do photorealistic images of minors and photorealistic porn, and it can combine them to make CSAM without ever being trained on actual CSAM.

      • 3laws@lemmy.world (+30/−1, 8 months ago)

        My autism can also be cured by d*ing… my ADHD can be fixed forever by the same thing. They come with intrusive thoughts; do you also want the final penalty for people like me?

        I’m not apologizing for people’s crimes or criminal intentions at all, but your argument is complete bonkers if you want societies to behave like that.

          • jinarched@lemm.ee (+33/−1, 8 months ago)

            Pedophilia is a psychiatric disorder. Sexually abusing a child is a terrible crime.

            Lots of pedophiles will never touch a child. Alternatively, some people who are not pedophiles will sexually assault children (sometimes, it can be just a power dynamic thing for example).

            I think you are confusing the terms. Nobody here is defending people who assault children.

      • Lowlee Kun@feddit.de (+26, 8 months ago)

        Cool solution, killing people who never offended. You sound like a real humanist. Do you by any chance run a for-profit prison?

        • Mango@lemmy.world (+5/−1, 8 months ago)

          Wrongthink. You are no longer allowed to feel this way under penalty of satisfying our bloodlust. /s

  • Surreal@programming.dev (+50/−1, 8 months ago)

    If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?

    • sudo22@lemmy.world (+52/−1, 8 months ago)

      My guess would be he wasn’t self hosting the AI network so the requests were going through a website.

        • sudo22@lemmy.world (+5, 8 months ago)

          ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.

        • BreakDecks (+2, 8 months ago)

          Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.

          I suppose they could get some kind of sex-plus-children detector going for all generated images, but you’d have to train that model on something, so now it’s a chicken-and-egg problem.

    • photonic_sorcerer@lemmy.dbzer0.com (+14/−6, edited, 8 months ago)

      He was found extorting little girls with nude pics he generated of them.

      Edit: So I guess he just generated them. In that case, how’d they become public? I guess this is the problem if you don’t read the article.

      • Missjdub@lemmy.world (+35/−1, 8 months ago)

        Earlier this month, police in Spain launched an investigation after images of underage girls were altered with AI to remove their clothing and sent around town. In one case, a boy had tried to extort one of the girls using a manipulated image of her naked, the girl’s mother told the television channel Canal Extremadura.

        That was another case in Spain. Not the guy in Korea. The person in Korea didn’t distribute the images.

        • Mango@lemmy.world (+14/−1, 8 months ago)

          I really gotta wonder what the difference is between prosecuting someone for their thoughts and prosecuting them for jerking it to their own artwork/generative whatever that they kept entirely to themselves. The only bad thing I see here is someone having their privacy invaded by someone bigger than them and being put on display for it. Sound familiar?

        • Lowlee Kun@feddit.de
          link
          fedilink
          English
          arrow-up
          17
          ·
          8 months ago

          Because that was another case. Extortion and blackmail are already illegal (and in this case it would count as production of CP, as it would if you drew from a real child). In this case we simply don't have enough information.

  • papertowels@lemmy.one
    link
    fedilink
    English
    arrow-up
    40
    arrow-down
    4
    ·
    edit-2
    8 months ago

    So this does bring up an interesting point that I haven’t thought about - is it the depiction that matters, or is it the actual potential for victims that matters?

    Consider the Catholic schoolgirl trope - if someone of legal age is depicted as being much younger, should that be treated in the same way as this case? This case is arguing that the depiction is what matters, instead of who is actually harmed.

    • ilmagico@lemmy.world
      link
      fedilink
      English
      arrow-up
      25
      arrow-down
      4
      ·
      8 months ago

      Every country has different rules, according to Wikipedia.

      Personally, I feel that if making completely fictitious depictions of child porn, where no one is harmed (think AI-generated, or by consenting adults depicting minors) was legal, it might actually prevent the real, harmful ones from being made, thus preventing harm.

      • theangryseal@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        arrow-down
        5
        ·
        8 months ago

        At the same time, an argument could be made that increasing the availability of such a thing could land it in the eyes of a person who otherwise wouldn’t have seen it in the first place and problems could develop.

        It could normalize something absurd and create more risks.

        I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.

        I just know how (anecdotally) pornography desensitizes a person until it makes more extreme things less bizarre and unnatural. I can’t help but imagine a teenager who would have otherwise developed a more healthy sexuality stumbling on images like that and becoming desensitized.

        It’s definitely something that needs some serious thought.

        • Jaxom_of_Ruatha@lemmy.world
          link
          fedilink
          English
          arrow-up
          7
          ·
          8 months ago

          “I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.”

          One of the big problems with addressing this problem is that NOBODY thoroughly understands these behaviors. They are so stigmatized that essentially nobody voluntarily admits to having pedophilic urges and scientists can only study those who actually act on them and harm children. They are almost certainly not a representative sample of the entire population of pedophiles, and this severely limits our ability to study the psychology of the population as a whole and what differentiates the rapists among them from the non-rapists.

          • BreakDecks
            link
            fedilink
            English
            arrow-up
            3
            ·
            8 months ago

            I think Japan would make a really good case study. Childlike aesthetics and behaviors are strongly sexualized in Japan. They also produce the most simulated CSAM per capita with few laws restricting production. Actual child pornography wasn’t made illegal until 1999. They still sell photo books of tweens in swimsuits and stuff in Japan. That, and lolicon, which is basically hentai with kids in it.

            There isn’t the same stigma against attraction to children, and we see that some 15-20% of the Japanese male population holds some aesthetic preferences that most westerners would consider pedophilic.

            I think we’d probably see similar numbers in America if we could cut though the stigma, which some people would panic over, but if anything we should be relieved that despite such numbers, actual sexual abuse of children is very rare.

            I mean, the writing is on the wall already. Nothing in the West is more sexualized than youth, we just like to pretend that 18 is some magical age where you looked completely different the day before your birthday or something, and ignore that puberty comes a lot earlier than that.

            What really matters is the social norms surrounding these things. We shouldn’t care if a 40 year old man thinks a 15 year old girl is attractive, we should care if he tries to do anything about that attraction, because the latter is a conscious choice that does harm, while the former is more complex matter of human sexual response.

        • BreakDecks
          link
          fedilink
          English
          arrow-up
          4
          ·
          8 months ago

          Most of what you’re repeating about porn “normalizing” things and “desensitizing” viewers is straight out of the puritan handbook. There is evidence that men who overconsume porn and don’t have a healthy sex life can fall into self-destructive patterns, but porn consumption doesn’t work like a drug. It’s not like the more you consume the more hardcore of content you desire, or that being exposed to certain types of porn will create new preferences that you wouldn’t otherwise have had. This is just long-standing anti-sex-work propaganda that tries to liken pornography to narcotics.

          People who consume CSAM are already into that kind of thing. Seeing CSAM isn’t going to turn anyone into a pedophile just as playing GTA isn’t going to turn anyone into a hardened street criminal. The goal should be to protect children, not to censor any content that sexualizes youth, because that really is a slippery slope. More on that here: https://nypost.com/2010/04/24/a-trial-star-is-porn/

        • ilmagico@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          8 months ago

          Yeah, valid points, but it’s not gonna be easy to tell, in practice. Doing a proper scientific test is likely going to be unethical for obvious reasons, so we’re left to wonder if the cons outweigh the pros or not.

      • papertowels@lemmy.one
        link
        fedilink
        English
        arrow-up
        3
        ·
        8 months ago

        Thanks for sharing that link. I hated reading through it, but it answered the question haha…

        I don’t really have strong feelings about it but I do think I lean towards agreeing with you.

    • BreakDecks
      link
      fedilink
      English
      arrow-up
      5
      ·
      8 months ago

      In America at least, people often confuse child pornography laws with obscenity laws, and they do end up missing the point. Obscenity laws are a violation of free speech, but that’s not what a CSAM ban is about. It’s about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can’t consent, so even the distribution or basic retention of this content violates a child’s rights.

      Which is why the courts have thrown out lolicon bans on First Amendment grounds every time it's attempted. Simulated CSAM lacks a child whose rights could be violated, and generally meets all the definitions of art, which would be protected expression no matter how offensive.

      It’s a sensitive subject that most people don’t see nuance in. It’s hard to admit that pedophilia isn’t a criminal act by itself, but only when an actual child is made a victim, or a conspiracy to victimize children is uncovered.

      With that said, we don’t have much of a description of the South Korean man’s offenses, and South Korea iirc has similar laws to the US on this matter. It is very possible that he was modifying real pictures of children with infill or training models using pictures of a real child to generate fake porn of a real child. This would introduce a real child as victim, so it’s my theory on what this guy was doing. Probably on a public image generator service that flagged his uploads.

    • eatthecake@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      17
      ·
      8 months ago

      The intent is to get off on fucking children; how you make that happen shouldn't matter.

      • BreakDecks
        link
        fedilink
        English
        arrow-up
        1
        ·
        8 months ago

        If we decide that nothing else matters but protecting children, then protecting children will be the only thing that matters anymore. That’s not a reasonable outcome.

  • JokeDeity@lemm.ee
    link
    fedilink
    English
    arrow-up
    29
    ·
    8 months ago

    Considering every other aspect of this is being argued in this thread to exhaustion, I just want to say it’s wild they caught him since it says he didn’t distribute it.

      • uis@lemmy.world
        link
        fedilink
        English
        arrow-up
        10
        arrow-down
        2
        ·
        edit-2
        8 months ago

        Is it Child Sexual Abuse Material if there are no children involved?

        “Anime should also be banned.” “All anime characters should show a passport with their date of birth.”

          • BreakDecks
            link
            fedilink
            English
            arrow-up
            4
            ·
            8 months ago

            I think you responded to the wrong comment, because while it is true that there are 2.4B people under the age of 18 alive today, it doesn’t appear to have any relevance to what you were replying to.

            • JoBo@feddit.uk
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              8 months ago

              Only if you assume that the only children harmed by CSAM are those used to produce CSAM.

              Consumers of CSAM are (actual or potential) perpetrators of abuse. Normalising it is not an option.

  • 0ddysseus@lemmy.world
    link
    fedilink
    English
    arrow-up
    13
    arrow-down
    9
    ·
    8 months ago

    (Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)

    The really fucked part is that Google at least has scraped a whole lot of CSAM, as well as things like ISIS execution vids, and they have all this stuff stored and use it to do things like train the algorithms for AIs. They refuse to delete this material, claiming that they just find the stuff and aren't responsible for what it is.

    Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

    • diffuselight@lemmy.world
      link
      fedilink
      English
      arrow-up
      33
      arrow-down
      2
      ·
      edit-2
      8 months ago

      That’s a fundamental misunderstanding of how diffusion models work. These models extract concepts and can effortlessly combine them to new images.

      If it learns woman + crown = queen

      and queen - woman + man = king

      it is able to combine any such concept together
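      The vector-arithmetic analogy above can be sketched in a few lines of Python. Note this is a toy illustration with hand-made three-dimensional "concept" vectors; real diffusion models learn high-dimensional embeddings, so the vector names and values here are invented for demonstration only:

      ```python
      # Toy "concept" vectors -- purely illustrative, not real model embeddings.
      woman   = (1.0, 0.0, 0.0)
      man     = (0.0, 1.0, 0.0)
      royalty = (0.0, 0.0, 1.0)

      def add(a, b):
          return tuple(x + y for x, y in zip(a, b))

      def sub(a, b):
          return tuple(x - y for x, y in zip(a, b))

      queen = add(woman, royalty)          # woman + crown -> queen
      king  = sub(add(queen, man), woman)  # queen - woman + man -> king

      print(king == add(man, royalty))     # True: "king" is just man + royalty
      ```

      The point of the analogy is that once concepts live in a shared vector space like this, any two of them can be composed, whether or not that combination ever appeared in the training data.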

      As Stability has noted. any model that has the concept of naked and the concept of child in it can be used like this. They tried to remove naked for Stable Diffusion 2 and nobody used it.

      Nobody trained these models on CSAM, and the problem is a dilemma in the same way a knife is a dilemma. We all know a malicious person can use a knife for murder, including of children. Yet society has decided that knives have sufficient other uses that we still allow their sale pretty much everywhere.

      • grepe@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        ·
        8 months ago

        “This can be used by pedophiles” is used as an argument to ban cryptography… I wonder if someone will apply that to generative AI.

        • piecat@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          8 months ago

          Depends how profitable it is.

          If it can replace workers no, if it threatens the big players like Disney yes.

      • 0ddysseus@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        5
        ·
        edit-2
        5 months ago

        Editing this reply to say that I was in fact right and did not have any fundamental misunderstanding of anything. The database in question here is called LAION and contains 6 billion images scraped from the web, including CSAM images.

        Thanks for that. As I said, I'm not big into how AI works, so not surprised I got that wrong. The databases of everything that has come across the clear web are still there though and are available for use by people with access.

    • mcgravier@kbin.social
      link
      fedilink
      arrow-up
      2
      ·
      8 months ago

      Getting an AI image generator to produce CSAM means it knows what to show

      Not necessarily. Part of AI is blending different concepts. An AI trained on images of regular children and nude adults should, in principle, be able to produce underage nudity. This is a side effect of the intelligence in the AI.

  • Jessica@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    50
    arrow-down
    47
    ·
    edit-2
    8 months ago

    My god there are way too many comments in here trying to normalize pedophilia. Disgusting. Pathetic.

    These are people that need serious psychiatric care, not acceptance or to be included in the LGBTQ+ community. There is absolutely nothing to compare between them and any group within the LGBTQ+ community. Nothing.

    Combatting CP is a hard enough task for the poor bastards that have to do it. There does not need to be AI produced images in the mix.

    Lemmy, do better.

    • dsemy@lemm.ee
      link
      fedilink
      English
      arrow-up
      46
      arrow-down
      8
      ·
      edit-2
      8 months ago

      I think pedophiles should be treated with compassion, as being a pedophile doesn’t make someone a sexual predator.

      IMO the stigma against pedophiles worsens their mental state and could push them to become sexual predators. This is just a guess though.

      However, I do think “treatment” of pedophilia with generated CP should only be tried after conducting proper research into the actual effectiveness of it (maybe with general sex offenders and regular porn). In the end I think the top priority should be to minimize the amount of pedophiles who are also predators.

      • Ataraxia@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        8
        ·
        8 months ago

        The stigma against racism and sexism I guess a are also making people want to hurt these groups?

        • dsemy@lemm.ee
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          2
          ·
          8 months ago

          I believe racism and sexism are choices, while I think most pedophiles would prefer not to be pedophiles. If a pedophile doesn’t hurt anyone, why should people want to hurt him?

      • treefrog@lemm.ee
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        28
        ·
        8 months ago

        Sex offenders aren’t allowed to watch porn because the evidence suggests it doesn’t treat the behavior, but encourages it.

        • Forbo
          link
          fedilink
          English
          arrow-up
          14
          ·
          8 months ago

          Having a hard time finding the evidence you mention, got a citation? First few articles I saw were actually advising against blanket pornography bans.

          • treefrog@lemm.ee
            link
            fedilink
            English
            arrow-up
            5
            arrow-down
            4
            ·
            8 months ago

            For example, Vega and Malamuth (2007) explored the role of pornography in the context of risk factors associated with sexual aggression within a group of male university students. They found that excessive pornography consumption added significantly to the prediction of sexual aggression.

            This review was unable to demonstrate that there was not a relationship between early exposure to pornography and sexual offending. It also consistently appears that men who sexually offend report less exposure to pornography and that exposure to pornography does not result in more harm being caused to the victim. The review suggests that there is not a consistent relationship between exposure to pornography and offending shortly after exposure.

            So a recent meta-analysis has not found anything conclusive one way or the other. Operant conditioning does suggest a correlation (watching naked children while masturbating reinforces the neural pathways that link sexual arousal to kids).

            I did time for a drug offense and met a lot of sexual offenders. In my state, they’re not allowed to watch porn if they’re on parole.

            Anyway, the jury is out on whether there is enough correlation between the two. But there's definitely no evidence that I could find that letting pedophiles masturbate to pictures of children is helpful, as you suggest. Whether those images are simulated or not.

            https://www.sciencedirect.com/science/article/abs/pii/S1359178918302404

            • Forbo
              link
              fedilink
              English
              arrow-up
              10
              arrow-down
              1
              ·
              8 months ago

              I didn’t suggest shit, so please don’t put words in my mouth. Thanks for the citations though.

              • treefrog@lemm.ee
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                10
                ·
                edit-2
                8 months ago

                However, I do think “treatment” of pedophilia with generated CP should only be tried after conducting proper research into the actual effectiveness of it (maybe with general sex offenders and regular porn).

                Okay. Well this research has been tried with general sex offenders and it's inconclusive whether it's helpful or harmful.

                Sorry for reading your suggestion that we try treating pedophiles with AI-generated CP as a suggestion that CP would be helpful for pedophiles.

                • dsemy@lemm.ee
                  link
                  fedilink
                  English
                  arrow-up
                  8
                  ·
                  8 months ago

                  Maybe I could’ve written it more clearly, but I thought it was pretty obvious we shouldn’t try treating pedophiles this way if research shows it doesn’t work.

                • Forbo
                  link
                  fedilink
                  English
                  arrow-up
                  6
                  ·
                  8 months ago

                  Not my post, dude. Look at the people you are replying to.

    • captain_spork@startrek.website
      link
      fedilink
      English
      arrow-up
      42
      arrow-down
      14
      ·
      8 months ago

      Not that I think they should be included in LGBTQ+, but as someone who is bisexual I feel they're not as far from us as you seemingly believe. Why wouldn't we compare them? Both are sexual attractions that deviate from the norm. A pedophile didn't choose to be a pedophile any more than I chose to be bi.

      Growing up in a conservative household and town was a miserable experience for me. I hated myself, didn’t want to accept it, and felt utterly alone. Now think about how much worse it must be to realize you’re attracted to children. You have zero allies, you have zero people you can talk to, and a lot of people hate you merely for existing and/or want you dead. From where I stand their experience echoes my experience being LBGTQ+ quite heavily. Except over my lifetime LBGTQ+ acceptance grew quite rapidly and my husband was the light at the end of the tunnel. But pedophiles will never get that, probably ever. I feel nothing but sympathy for their situation.

      And what “serious psychiatric care” do you even think there is for it? Unless you also believe in gay conversion camps, we have nothing. We don’t even really know how sexuality actually works in the brain, we definitely aren’t anywhere close to being able to treat it.

      • cucumber_sandwich@lemmy.world
        link
        fedilink
        English
        arrow-up
        26
        arrow-down
        2
        ·
        8 months ago

        And what “serious psychiatric care” do you even think there is for it? Unless you also believe in gay conversion camps, we have nothing. We don’t even really know how sexuality actually works in the brain, we definitely aren’t anywhere close to being able to treat it.

        There’s programmes that focus on how to deal with it in a societally acceptable way, mainly on how not to become a predator. That’s a pretty good start.

      • Ataraxia@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        9
        ·
        8 months ago

        Excuse me what? I’m pansexual and fucking what? I’m nothing like a kiddy diddler. I don’t revel in the agony inflicted onto a child. These people get off on violence and destroying people. These victims are never the same again. That’s why parents catching someone doing this to a child will kill the perpetrator and nobody would fault them. Pedos are criminally insane if anything.

      • Jessica@lemmy.blahaj.zone
        link
        fedilink
        English
        arrow-up
        17
        arrow-down
        29
        ·
        8 months ago

        Why wouldn’t we compare them?

        Really? What part of your sexuality, or mine, involves raping children? Nothing, right? One step back, what part of you being bisexual or my being trans involves harming anyone? That’s right, nothing.

        I don’t have the answer of how to deal with those that are attracted to children. But to suggest psychiatric care for those who have serious pathology is akin to gay conversion camps is gross.

        This is not some philosophical debate. Stop playing into the hands of bigots who are actively trying to paint LGBTQ+ folks, especially trans people at the moment, as “groomers” and “pedos”.

        We are not associated or comparable with pedophiles in any way, shape or form—full stop.

        • hikaru755@feddit.de
          link
          fedilink
          English
          arrow-up
          19
          arrow-down
          2
          ·
          8 months ago

          Tf are you talking about, unless being gay involves raping men, being pedo also doesn’t involve raping children. Even as a cishet non-pedo you will often encounter situations where acting on some attraction you feel would be anywhere from morally questionable to straight up illegal, and most of us manage to deal with that just fine. Of course that’s going to be tougher for someone whose entire experience consists of that, rather than just part of it, but nothing about being pedo forces you to become a child-raping piece of shit.

          Of course psychiatric care is important, but the point the other commenter was making is that it’s currently impossible to change anyone’s attraction, so it’s not a pathology that can be “cured” in this way. Any psychiatric care currently has to be aimed at helping people deal with being pedo without acting on it and also not developing any other psychological afflictions because of suppressing their attraction. Trying to “cure” the attraction itself would indeed be akin to gay conversion therapy: there’s no scientific evidence it works, and it’s going to do more harm than good.

        • Critical_Insight@feddit.uk
          link
          fedilink
          English
          arrow-up
          12
          arrow-down
          2
          ·
          edit-2
          8 months ago

          Most people in jail for abusing children are not pedophiles, but ordinary rapists, and kids unfortunately just happen to be easy targets. Even most pedophiles have morals. They know what they like is wrong and they wouldn't want to hurt anyone. Just like most men aren't rapists despite being turned on by women.

          Just imagine being born as someone with these urges. What a shitty fucking hand you’ve been dealt and as if that’s not bad enough, people want to murder you just for coming out and asking for help.

          • Ataraxia@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            4
            ·
            8 months ago

            If people stopped blurring consent lines that'd be great. Either you consent or you don't. Fantasizing about rape legitimizes incels' attitude that women want to be raped. Nobody who is mentally healthy fantasizes about it. Therapy, not CNC.

        • eatthecake@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          20
          ·
          8 months ago

          I'm sorry, they make it make sense by calling it a disease. They can't just say paedophiles are bad because they don't want to believe in 'bad'. It is a philosophy debate though, evil versus sick. They'll agree you're not evil, but you'll get lumped into sick.

    • molave@reddthat.com
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      2
      ·
      edit-2
      8 months ago

      I’m trying to be better by not treating all pedophiles as child-abusers-in-waiting. Humans are capable of not acting on base immoral instincts.

    • thrawn@lemmy.world
      link
      fedilink
      English
      arrow-up
      9
      arrow-down
      21
      ·
      8 months ago

      Off the bat, I wholly disagree with the idea that this should have been legal. That filth, even if AI generated, should be illegal for a multitude of reasons, one of them being that it allows those… urges to be practiced. I’m not one for the slippery slope fallacy but in situations where it could escalate to real child abuse, there should be zero tolerance and indulgence. If it’s a mental illness, they don’t need to fulfill that urge.

      That said, I think the people suggesting otherwise here are just looking at it from a perspective of numbers and nothing else, with little consideration of the significant downsides. The stance also ignores that offenders are likely in it for the taboo more than actual interest in kids— it sure seems like Epstein’s friends were mostly doing it because they could, and it was a new level of depravity to try. If you ignore all of these, AI generated filth could indeed reduce actual child abuse. That’s a good thing and theoretically comes with no additional suffering, right?

      I see this as naivety. Rude to imply about others here but better than CSAM apologism. It’s about the best I can think of, and I try to assume the best in people these days.

      Also to make clear why I think the slippery slope is valid here, making some form of that awful “interest” legal dramatically lowers the bar of entry. And unlike violent films that are accused of increasing violence, that filth will never have wider societal acceptance, so a legal but taboo on-ramp is more likely to lead to illegal and taboo viewing, then perhaps onto the real thing. Society should never be willing to risk that by indulging in their mental illness.

    • eatthecake@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      25
      ·
      8 months ago

      The psychologists have tried to normalize everything, and sympathy for the devil is the greatest signal of one's virtuous compassion. There is no evil anymore, all characters are grey; just ask the Game of Thrones fans about all the sadistic psychopaths in that story, none are truly bad.
      One day soon someone will build a robot version of their own child to rape and abuse, and people will hail it as the perfect solution. And when that child finds out, they will be told to take a chill pill because there is no harm done.
      Paedophiles are not sick, they are part of the natural variation of the species. Sometimes those variations are harmful and that needs to be addressed. If someone died and made me god, I would murder-suicide everyone who isn't a card-carrying vegan pacifist. Yes, I'm a monster too. Failing that, I vote we name, shame and imprison the bad people. Yes, I believe in good and bad. No, I'm not religious.
      The left will never get anywhere with this moral nihilism.

          • surewhynotlem@lemmy.world
            link
            fedilink
            English
            arrow-up
            5
            ·
            8 months ago

            Blaming psychologists for random things. The disdain for empathy towards people you see as less than you. The generally nutty long-winded rant.

            If you’re not a scientologist, look them up. You’d fit in well.

            • eatthecake@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              arrow-down
              4
              ·
              8 months ago

              I don't see paedophiles as less than me, they are entirely human. What I don't buy is the notion that all humans are wonderful and good.

              • surewhynotlem@lemmy.world
                link
                fedilink
                English
                arrow-up
                2
                ·
                8 months ago

                Literally no one is saying pedos are wonderful or good. Literally no one. So if your point is that you agree with 100% of the planet, then yes, you are correct.

  • WuTang @lemmy.ninja
    link
    fedilink
    English
    arrow-up
    5
    arrow-down
    5
    ·
    8 months ago

    And by the way, kudos to fediverse instances, you do a crazy job. That's the only good thing about this AI tech: detecting such crap and obliterating it. I don't care about false positives; if there's a false positive, the OP can still try to defend their case if necessary.

  • I Cast Fist@programming.dev
    link
    fedilink
    English
    arrow-up
    0
    ·
    8 months ago

    One thing I have to ask for those that say pedos should seek psychological/psychiatric treatment: do you even know a professional that won’t immediately call the cops if you say “i have sexual desires for kids”?

    I wholly agree that this is something that should receive some form of treatment, but first the ones afflicted would have to know that they won’t be judged, labeled and exposed when they do so.

    • TheGalacticVoid@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      8 months ago

      In the US, they will call the cops if they know you did something illegal, so it does require some form of secrecy from the patient.

  • phoenixz@lemmy.ca
    link
    fedilink
    English
    arrow-up
    0
    ·
    8 months ago

    I’m very conflicted on this one.

    Child porn is one of those things that won’t go away if you prohibit it, like alcohol. It’ll just go underground and keep causing harm to real children.

    AI child pornography images, as disturbing as they might be, would serve a “need”, if you will, while not actually harming children. Since child pornography doesn’t appear to be one of those “try it and you’ll get addicted” things, I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

    • MrSqueezles@lemm.ee
      link
      fedilink
      English
      arrow-up
      1
      ·
      8 months ago

      I heard an anonymous interview with someone who was sickened by their own attraction to children. Hearing that person speak changed my perspective. This person had already decided never to marry or have kids and chose a career to that same end, low likelihood that kids would be around. Clearly, since the alternative was giving up on love and family forever, the attraction wasn’t a choice. Child porn that wasn’t made with children, comics I guess, was used to fantasize to prevent carrying through on those desires in real life.

      I don’t get it, why anyone would be attracted to kids. It’s gross and hurtful and stupid. If people suffering from this problem have an outlet, though, maybe fewer kids will be hurt.

    • clausetrophobic@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      1
      ·
      8 months ago

      Normalisation in culture has effects on how people behave in the real world. Look at Japan’s sexualization of women and minors, and how they have huge problems with sexual assault. It’s not about whether or not real children are getting hurt, it’s about whether it’s morally right or wrong. And as a society, we’ve decided that CP is very wrong as a moral concept.

      • PhlubbaDubba@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        8 months ago

        Here’s the thing, though: being too paranoid about normalization also makes the problem worse, because the truth is that these are people with severe mental problems, many of whom likely want to seek professional help.

        The problem is the subject is SO taboo that even a lot of mental health professionals will chase them off like rabid animals when the solution is developing an understanding that can lead to a clinical treatment plan for these cases.

        Doing that will also help the CSAM problem, since getting people out of the alleyways and into professional help will shrink the market significantly, both immediately and over time, reducing the amount of content that gets made and, as a result, the number of children victimized to make that content.

        The key point remains: we have to stop treating these people like inhuman monsters who deserve death and worse whenever they’re found. They’re sick souls who need robust mental health care and thought-management strategies.

  • datavoid
    link
    fedilink
    English
    arrow-up
    12
    arrow-down
    42
    ·
    8 months ago

    ITT - Lemmy supports the pedos

    • Forbo
      link
      fedilink
      English
      arrow-up
      25
      arrow-down
      4
      ·
      8 months ago

      Can’t have any nuanced discussion here! Glad to see people such as yourself engaging in reductionism and shutting down thinking, because all interactions online have to be boiled down to five words TL;DR pithy sound bites.

      Leave the shit on Twitter, we can do better here.

      • datavoid
        link
        fedilink
        English
        arrow-up
        14
        arrow-down
        10
        ·
        8 months ago

        I actually typed out a more lengthy response to someone here already, read more responses/viewed the vote counts, and then wrote this top level comment pointing out how backwards this community’s views are. No one is directly supporting assaulting children, but as I wrote elsewhere: “why do we value the sexual gratification of pedos higher than the potential safety of children?”

        • MikuNPC@lemm.ee
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          1
          ·
          edit-2
          8 months ago

          Who the heck is proposing we value that? Everyone is saying we value the safety of real children, which may entail keeping artificial CP legal.

          Also it’s a victimless crime so punishments dealt out are criticized heavily, and for good reason.

    • lolcatnip@reddthat.com
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      2
      ·
      edit-2
      8 months ago

      You have clearly chosen not to understand the arguments people are making in this thread. Either that, or you’re choosing to misrepresent them. Literally nobody is supporting the sexual assault of children or anyone else. But hey, don’t let that stop you from gloating about how morally superior you are.

    • SuddenlyBlowGreen@lemmy.world
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      22
      ·
      edit-2
      8 months ago

      Yeah, I thought the whole MAP bullshit had died out, but apparently it’s alive and well on Lemmy. It’s pretty sad.

    • Ismayil@lemmy.world
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      11
      ·
      edit-2
      8 months ago

      The comments in this thread make me laugh, especially the “it’s not pedophilia” parts. LOOOOOOOOOOOL