• Dudewitbow@lemmy.zip
      9 months ago

      IIRC Meta on its own spends billions on content moderation, much more than other companies generally do. The problem with content moderation is that you only see the stuff they miss, not the stuff they already filtered out.

      On the topic of weeding out CSAM, an example of a company that gave up on it is, surprisingly, Nintendo. Flipnote (a 3DS application where you could send post-it-style notes to others) was used by predators in Japan to lure children. Nintendo deemed it unmoderatable and removed it, and no chat replacement has taken its place since.

      Moderation is super tough, and you can hear some really fucked up stories about what these moderators go through and how it affected their lives, especially the ones who have to review even more content (e.g. people who have to filter out content in China due to government surveillance).

      • BonesOfTheMoon@lemmy.worldOP
        9 months ago

        I’ve reported probably a thousand pictures of swastika tattoos and similar stuff they don’t remove, plus people calling others homophobic slurs. I don’t think anyone reviews those reports.

        • Dudewitbow@lemmy.zip
          9 months ago

          Because on the list of stuff they’re filtering out, that’s probably low priority compared to content like CSAM or actual murder, which gets them into legal trouble if that kind of content spreads unchecked.