Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.
Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.
I’m kind of at a loss for words because of how obvious this should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”
What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.
I get what you’re saying, but due to the federated nature, that CSAM can easily spread to many instances without their admins noticing it. Having even one piece of CSAM on your server is a huge risk for the server owner.
I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?
This could be a really big issue, though. People can make instances for really hateful and disgusting crap, and even if everyone defederates from them, it still gives them a platform: a tiny, tiny corner of the internet to talk about truly horrible topics.
Again, if it’s illegal content that is publicly available, officials can charge those site admins with the crime of hosting it. Everyone just has a duty to defederate.
Those corners will exist no matter what service they use and there is nothing Mastodon can do to stop this. There’s a reason there are public lists of instances to defederate. This content can only be prevented by domain providers and governments.
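For what it’s worth, acting on those public lists can be scripted. Below is a minimal sketch, assuming you run your own Mastodon instance, hold an access token with the admin:write:domain_blocks scope, and keep the shared list as a hypothetical blocklist.csv with one domain per line; it pushes each entry to the admin domain-blocks endpoint. Treat it as an illustration of the idea, not a finished tool.

```python
# Minimal sketch: suspend every domain from a shared blocklist via Mastodon's admin API.
# Assumptions: your token has the admin:write:domain_blocks scope and blocklist.csv
# holds one domain per line. Not production-ready (no retries, no rate limiting).
import csv
import requests

INSTANCE = "https://example.social"   # hypothetical instance URL
TOKEN = "YOUR_ADMIN_ACCESS_TOKEN"     # placeholder; never hard-code a real token
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def block_domain(domain: str) -> None:
    """Create a domain block (defederation) for one remote instance."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers=HEADERS,
        data={
            "domain": domain,
            "severity": "suspend",       # fully defederate
            "reject_media": "true",      # don't cache their media locally
            "private_comment": "imported from shared blocklist",
        },
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    with open("blocklist.csv", newline="") as fh:
        for row in csv.reader(fh):
            domain = row[0].strip() if row else ""
            if domain and not domain.startswith("#"):
                block_domain(domain)
                print(f"suspended {domain}")
```

I use “suspend” rather than “silence” here because the whole point of the thread is content you never want cached on your hardware at all.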
I’ve thought about building a truly decentralized app similar to Lemmy, but the question of how to prevent things like CSAM from ending up on unwitting users’ devices is the main thing stopping me.
Lemmy has exactly the same problem, and the solution seems to be to defederate from instances that host that kind of content. That works, but it’s a lot of work for an admin, so we absolutely need better moderation tools to help detect unwanted content and block the source of it.
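To make “detect unwanted content” a bit more concrete: the standard building block is matching incoming media against hash lists of known material maintained by clearinghouses, and production systems use perceptual hashes like PhotoDNA or PDQ so re-encoded copies still match. Here is a deliberately simplified sketch that only does exact SHA-256 matching; the hash file, the media cache path, and the whole setup are assumptions, not how any existing tool actually works.

```python
# Simplified illustration of hash-list matching over locally cached media.
# Assumptions: known_bad_sha256.txt holds one hex digest per line, and
# MEDIA_DIR points at wherever your instance caches remote attachments.
# Real tooling uses perceptual hashes from clearinghouses, not plain SHA-256.
import hashlib
from pathlib import Path

HASH_LIST = Path("known_bad_sha256.txt")       # hypothetical hash list
MEDIA_DIR = Path("/srv/instance/media_cache")  # assumed cache location


def load_hashes(path: Path) -> set[str]:
    """Read one lowercase hex digest per line, skipping blanks."""
    return {line.strip().lower() for line in path.read_text().splitlines() if line.strip()}


def sha256_of(path: Path) -> str:
    """Stream the file in 1 MiB chunks so large videos don't load into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan(media_dir: Path, known_bad: set[str]) -> list[Path]:
    """Return cached files whose exact hash appears on the list."""
    return [f for f in media_dir.rglob("*") if f.is_file() and sha256_of(f) in known_bad]


if __name__ == "__main__":
    for match in scan(MEDIA_DIR, load_hashes(HASH_LIST)):
        # Surface matches to the admin's report queue; don't delete silently,
        # since reports to authorities usually need the evidence preserved.
        print(f"MATCH: {match}")
```

Exact hashing misses anything that has been resized or re-encoded, which is exactly why the real lists are distributed as perceptual hashes and why admins need these tools provided for them rather than hand-rolled.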
I just wish people wouldn’t post such nonsense.
That’s a dumb argument, though.
phpBB is not the host or the provider. It’s just something you download and install on your server, with the actual service provider (you, the owner of the server and operator of the phpBB forum) being responsible for its content and curation.
Mastodon/Twitter/social media is the host/provider/moderator.