Neural Hashes, Safety Vouchers and More Fun Terms Explained

Apple’s tools for flagging child pornography and identifying explicit photos in kids’ messages cau...
@lorabe

And this is, kids, what we call:

Damage control.

Jedrax

Federighi is a master of doublespeak.

@yxzi

Yeah sure, it might all seem harmless now, but that doesn’t change the fact that it’s a door-opener for Apple.

@nurkurz

Aren’t they already doing all of that, but just on their servers? I’d rather have them doing it on my device. There is and will be no way for us to know what they are doing with our data on their servers, but there will be people analysing whatever they do on our devices.

Same with Google’s FLoC: do it on my device, where I have a chance to turn it off.

The outcry really should have started when people started using devices that in reality are still owned by the company that made them.

@Metallinatus

If they are scanning their servers, you can just not send photos to their servers, but if they are scanning your phone or computer, that is a dangerous new line they just crossed.

@lorabe

This.

Jedrax

Aren’t they already doing all of that, but just on their servers?

Yes, they have been. That said, scanning does not occur if you have iCloud Photos turned off. I recommend turning off all iCloud services, such as backups and iMessage, if you really care about privacy.

@SrEstegosaurio

If you REALLY care about privacy you don’t use an iPhone.

Jedrax

Kind of a disingenuous statement. I really care about privacy, but I also care about functionality, so I use an iPhone. Does that mean I don’t care about privacy? No, not at all.

Love_Monkey

1. How do they train the AI?
2. If my 15-year-old daughter sends nudes to her boyfriend, how will the AI know she is 15? Facial recognition?
3. What about parents taking pics of their kids in the bath or pool?

@uthredii

They don’t use an AI for this. They compute a unique string (a hash) for each photo on your device and compare it to hashes of known child abuse images. It will only flag images that are already in a database, and there are safeguards to control what goes into that database.

This article explains it well: https://www.vox.com/recode/2021/8/10/22617196/apple-ios15-photo-messages-scanned

Edit: obviously there are still issues with this, as there are probably ways for them to scan for other images.
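The matching described above can be sketched in a few lines. This is a minimal illustration only: SHA-256 stands in for Apple’s NeuralHash (a perceptual hash designed so that resized or re-encoded copies of an image still match, which SHA-256 cannot do), and the database contents are made up:

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# The real database holds perceptual hashes, not cryptographic ones.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def scan(photo_bytes: bytes) -> bool:
    """Return True if this photo's hash matches the known-image database."""
    return hashlib.sha256(photo_bytes).hexdigest() in known_hashes

print(scan(b"known-image-bytes"))   # True: exact copy matches
print(scan(b"family-photo-bytes"))  # False: anything else does not
```

The key property is that scanning only answers “is this byte string’s hash in the database?” — it never classifies what a new photo depicts, which is why questions about training an AI or recognising ages don’t apply here.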
