Highlighting a recent report that users and admins are unable to delete images, and that Trust & Safety tooling is currently lacking.

  • Sean Tilley@lemmy.world (OP) · 10 months ago

While I think you’re correct about it ultimately being their project, and that users are in no position to demand or expect anything, this takes on a whole other dimension once a project is all about building a social platform. Particularly one where volunteers host part of the network themselves.

It’s one thing to look at some random demand to write everything in a P2P architecture because DNS is too centralized. When I worked on Diaspora, I literally saw people demand stuff like that, and laughed it off. I’m trying to build a platform that exists today, not some pixie dream bullshit composed of academic circle-jerking.

    But when it comes to basic table stakes for participating in a network that already exists, things change a bit. This is especially true when you’re connecting to a global network that has:

    • Hate Speech
    • Targeted Harassment Campaigns
    • Child Pornography
    • Extreme Gore and Violence

    Suddenly, it makes a lot of sense to say “you know what, admins are going to want to filter this shit out, maybe it’s reasonable for them to have some tools and fixtures that are part of core.”

    Unfortunately, these devs are the kind of people who scream angrily when someone says “Hey, this thing doesn’t actually respect local image deletes / GDPR stuff / content deletion on account deletion”. To me, that’s fucking insane.