More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism.
The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were found to have PTSD, generalised anxiety disorder (GAD) and major depressive disorder (MDD), by Dr Ian Kanyanya, the head of mental health services at Kenyatta National hospital in Nairobi.
The mass diagnoses have been made as part of a lawsuit being brought against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.
The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege.
Companies keep talking about replacing employees with AI yet they keep up this fuckery. Y’all’s AI models are either good enough to handle this shit or shouldn’t be used as a bad-faith bargaining chip. If there were ever a job that should be eliminated from human labor, NSFL content moderating seems like the perfect contender.
Rather a cynical take here, but perhaps that’s what’s coming and these jobs are going to be made redundant shortly, so they’re filing a claim while they still can.
I have heard that folks from African countries who are hired to train those AI models are also reporting abuses. So imo that’s not really a solution either.
Right, riiiiiight… I forgot about that part. Make AIs train each other. What could go wrong?!
I’m pretty sure this is actually referring to work done by humans long before the “ai” fad
I think you’re right. I thought this was a new story making the rounds