Today I posted a picture of a stamp with an animal in it and they said the picture contained nudity and made me take it down, but I reported a photo of a guy with a fully visible swastika tattoo and they said that’s fine.

I’d like to start a Lemmy community with photos of stuff that they refuse to remove called FacebookSaysItsFine.

    • dustyData@lemmy.world
      10 points · 1 year ago

      I would suggest the opposite. Perhaps a “Facebook doesn’t allow this” community. Too much risk of attracting trolls and monsters.

      That said, the FBI says that Facebook, Instagram, Twitter, et al. all contain a not-insignificant amount of CSAM at any given point in time. That fact just never gets reported by the press because the public treats them as normalized platforms. Only the fediverse gets that sort of negative attention in the press, because it’s the disruptive outsider platform, when by both proportion and volume almost all other platforms have a worse problem with awful content that regularly flies under the radar because they are big corporations.

    • ThePowerOfGeek@lemmy.world
      8 points · 1 year ago

      It would definitely require some very active moderation and clearly-defined community rules. But it sounds like a great idea for a Lemmy community, if you have the time.

      • thantik@lemmy.world
        6 points · 1 year ago

        Cloudflare has free CSAM scanning tools available - they really just need to implement it.
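
        Not Cloudflare’s actual API, just a rough sketch of where such a scan could sit in an instance’s upload path. The BLOCKED_HASHES set, is_blocked, and handle_upload names are all hypothetical, and real scanning tools match fuzzy/perceptual hashes against a vetted database rather than the exact SHA-256 used here for illustration.

        ```python
        # Hypothetical upload hook: hash every incoming image and refuse to
        # store anything that matches a known-bad hash. Real tools use fuzzy
        # hashing, not exact SHA-256; this only illustrates the control flow.
        import hashlib

        # Would be populated from whatever scanning service the instance uses.
        BLOCKED_HASHES: set[str] = set()

        def is_blocked(image_bytes: bytes) -> bool:
            """Return True if the image matches a known-bad hash."""
            return hashlib.sha256(image_bytes).hexdigest() in BLOCKED_HASHES

        def handle_upload(image_bytes: bytes) -> str:
            """Reject flagged uploads before they ever reach storage."""
            if is_blocked(image_bytes):
                raise ValueError("upload rejected by content scan")
            # ...store the image and return its URL (omitted here)...
            return "stored"
        ```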

        • Rai@lemmy.dbzer0.com
          1 point · 1 year ago

          When did “CP” become “CSAM”?

          If you want to change the acronym, wouldn’t “CR” make more sense?

          • Cracks_InTheWalls@sh.itjust.works
            3 points · 1 year ago (edited)

            ’cause porn is made with consenting adults. CSAM isn’t porn. CR is typically what’s depicted in CSAM (assuming that R stands for rape), but there are two (or more) separate though closely related crimes here. That, and SA (sexual assault) covers a wider breadth of activities, which is good if a person wants to quibble over the term rape when, regardless, something horrific happened to a kid and videos/images of said horrific thing are now getting shared among pedophiles.

            Will note I’ve only seen CSAM used since I started using Lemmy, so I’m not really sure when people started using the term over CP. I’m personally for it - it more accurately describes what it is, and while I haven’t seen it in the wild, SAM to describe video or images of non-consensual sex acts among adults is good too.

    • csgraves@lemmy.world
      6 points · 1 year ago

      I worry about this on fediverse stuff. I made the mistake of looking at the links from a person who commented on anti-trans legislation, and let me just say: yikes!

      The link was to something trying to legitimize the identity of “map.”

      NOPE.

      I deleted my comments and blocked the sick bastard.

    • Sylver@lemmy.world
      4 points · 1 year ago (edited)

      Try to stay apolitical and you won’t attract those trolls as early. Which I now realize may be difficult, considering many of the posts would be calling out Nazi scum…