Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • whatsarefoogee@lemmy.world (+152/−3) · 1 year ago

    Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.

    Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.

    I’m almost at a loss for words because of how obvious this should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”

    What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.
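    In practice, “automating it” usually means hash-matching: comparing each upload’s digest against a vetted list of known-material hashes maintained by organizations like NCMEC or the IWF. Real tools such as PhotoDNA use perceptual hashes; the sketch below uses plain SHA-256 and a made-up hash list purely to show the shape of the approach:

```python
import hashlib
from pathlib import Path

# Hypothetical hash list. In reality this comes from a vetted clearinghouse
# (e.g. NCMEC/IWF hash-sharing programs); you never assemble it yourself.
KNOWN_BAD_HASHES = {
    # sha256 of b"test", standing in for a real entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_file(path: Path) -> bool:
    """Return True if the file's SHA-256 digest is on the known-bad list."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

    An exact digest only catches byte-identical copies, which is why production systems use perceptual hashing that survives resizing and re-encoding.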

    • redcalcium@lemmy.institute (+41/−8) · 1 year ago

      I get what you’re saying, but due to the federated nature of the network, CSAM can easily spread to many instances without their admins noticing. Having even one piece of CSAM on your server is a huge risk for the server owner.

      • MinusPi (she/they)@pawb.social (+30) · 1 year ago

        I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?
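        For what it’s worth, once a report does come in, the block itself is scriptable: Mastodon’s Admin API exposes a domain-blocks endpoint. A rough sketch, assuming a 4.x instance, with the instance URL and token as placeholders:

```python
import json
import urllib.request

INSTANCE = "https://example.social"   # placeholder instance URL
TOKEN = "admin-scoped-access-token"   # placeholder admin-scoped token

def defederate(domain: str, opener=urllib.request.urlopen):
    """Suspend all federation with `domain` via the Admin API.

    `opener` is injectable so the call can be exercised without a network.
    """
    req = urllib.request.Request(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        data=json.dumps({"domain": domain, "severity": "suspend"}).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return opener(req)
```

        Wired up to an admin’s report queue, something like this would at least shrink the window between “we got a report” and “we stopped federating.”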

        • krimsonbun (+3/−8) · 1 year ago

          This could be a really big issue, though. People can create instances for really hateful and disgusting crap, but even if everyone defederates from them, it still gives them a platform: a tiny corner of the internet to talk about truly horrible topics.

          • andruid (+16) · 1 year ago

            Again, if illegal content is publicly available, officials can charge those site admins with the crime of hosting it. Everyone else just has a duty to defederate.

          • priapus@sh.itjust.works (+14) · 1 year ago

            Those corners will exist no matter what service they use and there is nothing Mastodon can do to stop this. There’s a reason there are public lists of instances to defederate. This content can only be prevented by domain providers and governments.

    • sugar_in_your_tea@sh.itjust.works (+0) · 1 year ago

      I’ve thought about building a truly decentralized app similar to Lemmy, but the question of how to prevent things like CSAM from ending up on unwitting users’ devices is the main thing stopping me.

      Lemmy has exactly the same problem, and the solution seems to be to defederate from instances that host that kind of content. That works, but it’s a lot of work for an admin, so we absolutely need better moderation tools to help detect unwanted content and block the source of it.

      I just wish people wouldn’t post such nonsense.
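      Even something minimal would help: a lot of admins already share defederation lists, so a tool could just diff a shared list against what an instance has blocked and surface the gap. A toy sketch, with made-up domain names:

```python
# Sketch: given a shared community blocklist and the set of domains an
# instance already blocks, report what still needs defederating.
# All domain names here are made up.

def pending_blocks(shared_list, already_blocked):
    """Return domains on the shared list not yet blocked locally, sorted."""
    return sorted(set(shared_list) - set(already_blocked))

shared = ["awful.example", "worse.example", "known-bad.example"]
local = ["worse.example"]
print(pending_blocks(shared, local))  # → ['awful.example', 'known-bad.example']
```

      The hard part isn’t the diff, of course; it’s curating a shared list admins actually trust.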

    • Dubious_Fart (+2/−3) · 1 year ago

      That’s a dumb argument, though.

      phpBB is not the host or the provider. It’s just software you download and install on your own server, with the actual service provider (you, the owner of the server and operator of the phpBB forum) being responsible for its content and curation.

      Mastodon/Twitter/social media platforms are the host/provider/moderator.