Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
First: I’m not in any way intending to cast any negative light on the horrible shit the people suing went through.
But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.
If you really were serious about suing to force change, you’ve literally got:
X, who has reinstated the accounts of people posting CSAM
Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content
Apple, at least, will take immediate action if you report a user to them. So, uh, maybe they should reconsider their best target, if their intent really is to remove content, and spend some time on all the other giant corpos that are either actively doing the wrong thing, doing nothing, or sitting there going ‘well, akshully’ at reports.
Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
I used to share an office with YouTube’s content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it’s worth, YT does take action on CSAM and other abusive material. The problem is that it’s just a numbers game. Those types of reports are human-reviewed. And for obvious reasons, it’s not exactly easy to keep a department like that staffed (turns out you really can’t pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions have different priority levels, and there’s pretty much always a consistent backlog of content to review.

While this article talks about Facebook specifically, it’s very similar to what I saw with YouTube’s team as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona
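To make the “numbers game” concrete, here’s a minimal sketch of that kind of triage: reports get bucketed by infraction type, a fixed pool of human reviewers drains the queue in priority order, and whatever they can’t get through becomes the backlog. The category names, priority levels, and capacity numbers below are made up purely for illustration; nothing here reflects YouTube’s actual tooling.

```python
import heapq
import itertools

# Hypothetical priority levels: lower number = reviewed sooner.
PRIORITY = {"csam": 0, "violent_threat": 1, "harassment": 2, "spam": 3}

class ReviewQueue:
    def __init__(self):
        self._heap = []
        self._order = itertools.count()  # tie-breaker: FIFO within the same priority

    def report(self, category: str, content_id: str) -> None:
        """File a user report; higher-severity categories jump the queue."""
        heapq.heappush(self._heap, (PRIORITY[category], next(self._order), content_id))

    def review_shift(self, capacity: int) -> list[str]:
        """Pop as many reports as the reviewers can actually get through this shift."""
        reviewed = []
        for _ in range(min(capacity, len(self._heap))):
            _, _, content_id = heapq.heappop(self._heap)
            reviewed.append(content_id)
        return reviewed

    def backlog(self) -> int:
        return len(self._heap)

q = ReviewQueue()
for i in range(50):                 # intake outpaces review capacity...
    q.report("spam", f"video-{i}")
q.report("csam", "video-urgent")

print(q.review_shift(capacity=10)[0])  # 'video-urgent' gets reviewed first
print(q.backlog())                     # ...while 41 lower-priority reports keep waiting
```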
I wonder what the package was, besides the salary. And the hiring requirements.

I don’t know all the details, but I know they had basically unlimited break time, as well as free therapy/counseling. The pay was also pretty decent, especially for a job that didn’t require physical labor or a specialized background.
They did have a pretty strict vetting process, because it was apparently not at all uncommon for people to apply either because they were eager to see abusive content directly, or because they had an agenda and might try to improperly influence what content got seen. Apparently they did social media deep dives that you had to consent to in order to apply.
For YouTube, I was very much talking specifically about how long it took them to act, and how little action they took, on the kids-doing-gymnastics videos, even when it became abundantly clear that the target market was pedophiles, and that the parents who kept posting these videos were, at the very least, complicit, if not explicitly pimping their children out.
(If you have not seen and/or read up on this, save yourself the misery and skip it: it’s gross.)
It took them a VERY long time to take any meaningful action, even after it was clear that neither the intent of the videos nor the audience they were being shown to had anything to do with gymnastics, and the content stayed up for literal years.

Like, I have done anti-CSAM work and have lots and lots of sympathy for the people who do it, because it’s fucking awful, but if you’ve got videos of children - clothed or not - where the comment section is entirely creeps and perverts and you just kinda do nothing, I have shockingly limited sympathy.
Seriously - the comment section should have been handed to the FBI to launch raids from, because I 100% guarantee you every single person in there has piles and piles of CSAM sitting around, and they were ignored just because the videos themselves weren’t explicit CSAM.

Just… gross, and poorly handled.
Yep. All the money being wasted on this lawsuit could be spent catching actual producers and distributors of child porn.
Always follow the money. It shows what people’s true intentions are.