Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.
Did you also read the difference in how Apple was trying to go about it and how literally everyone else was going about it?
Apple wanted to scan your files on your device, which is a huge privacy issue and a huge slippery slope (and a built-in backdoor).
The entire industry scans files once they have left your private device and are sitting on the companies’ own servers. So your privacy is protected there, and there’s no built-in backdoor.
Apple just had a fit and declared that if they can’t backdoor and scan your files on your own device, then they won’t try anything, not even the basics. They could simply follow the lead of everyone else and scan iCloud files, but they refused to do that. That was the difference.
There was no “huge privacy issue”.
First of all: you could turn off the local scanning by turning off iCloud sync - the same sync that would have sent the images to the cloud, where they’d be scanned anyway. That’s it, nothing else; nobody at Apple would’ve touched a single super-private file on your device.
The local scanning required MULTIPLE matches (where n > 3 - they didn’t say the exact number, for obvious reasons) to known, human-verified CSAM. The database of those hashes is the one that would’ve been loaded from iCloud if you had it turned on, and it’s the exact same database all cloud providers use for legal reasons. Some run other algorithms on top - at least Microsoft had an is_penis algorithm that shut down a German guy’s whole Live account because his kid’s pictures were on OneDrive.
Only after those MULTIPLE matches (you couldn’t get flagged by “accidentally” having one image on your phone, nor would pictures of your kids in the pool trigger anything) would a human checker have had enough data to decrypt just those images and see a reduced-resolution “visual derivative” (Apple’s term) of the offending photos. This is where all the brainpower spent on crafting false matches would’ve ended up: you would’ve had to create multiple collisions with known CP images that also look enough like actual CP for the human reviewer to make an erroneous call, multiple times, before anything was triggered.
If after that the human decided that yep, that’s some fucked up shit, the authorities would’ve been contacted.
Yes, a Bad Government could’ve forced Apple to add other stuff to the database. (They can do that right now for ALL major cloud storage providers, BTW.) But do you really think people wouldn’t have been watching for changes in the cloud-downloaded database and noticed anything suspicious immediately?
Also, according to the paper, the probability of falsely flagging a given account was 1 in 1 trillion per year - and this was not disputed even by the most hardcore activists, btw.
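To make the threshold idea concrete, here is a minimal sketch of “match against a known-hash list, only escalate after multiple hits, and pin the database so silent changes are visible”. This is not Apple’s actual protocol - that used NeuralHash, private set intersection, and threshold secret sharing so nothing is learnable below the threshold - and the threshold value, error rate, and function names below are made up for illustration.

```python
import hashlib
import math

MATCH_THRESHOLD = 30  # hypothetical value; the comment above only says "n > 3"

def count_matches(photo_hashes: list[str], known_csam_hashes: set[str]) -> int:
    """How many of the user's photo hashes appear in the known, human-verified database."""
    return sum(1 for h in photo_hashes if h in known_csam_hashes)

def needs_human_review(photo_hashes: list[str], known_csam_hashes: set[str]) -> bool:
    """Nothing reaches a human reviewer until the match count crosses the threshold;
    only then would reduced-resolution visual derivatives of the matched photos
    (and nothing else) be inspected."""
    return count_matches(photo_hashes, known_csam_hashes) >= MATCH_THRESHOLD

def p_false_flag(per_image_error: float, num_photos: int,
                 threshold: int = MATCH_THRESHOLD) -> float:
    """Binomial upper tail: probability that `threshold` or more photos false-match
    purely by chance, assuming independent per-image errors. Computed in log space
    so large photo libraries don't overflow."""
    def log_pmf(k: int) -> float:
        return (math.lgamma(num_photos + 1) - math.lgamma(k + 1)
                - math.lgamma(num_photos - k + 1)
                + k * math.log(per_image_error)
                + (num_photos - k) * math.log1p(-per_image_error))
    return sum(math.exp(log_pmf(k)) for k in range(threshold, num_photos + 1))

def database_fingerprint(db_bytes: bytes) -> str:
    """SHA-256 of the downloaded hash database - pin it and compare across updates,
    so a silently changed database would be noticed."""
    return hashlib.sha256(db_bytes).hexdigest()

# With an assumed 1-in-a-million per-image error rate and 10,000 photos, the expected
# number of false matches is 0.01, and p_false_flag(1e-6, 10_000) is on the order of
# 1e-93 - the same "many independent hits required" effect behind the per-account figure.
```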
tl;dr If you already upload your stuff to the cloud (like iOS does automatically) the only thing that would’ve changed is that nobody would’ve had a legit reason to peep at your photos in the cloud “for the children”. But if you’ve got cloud upload off anyway, nothing would’ve changed. So I still don’t understand the fervour people had over this - the only reason I can think of is not understanding how it worked.
Or that it was a built-in backdoor running on your device.
The difference is that what happens on your own device should be in your control; once data leaves your device, it isn’t. That’s where the entire issue was. It doesn’t matter whether I toggle a switch to allow upload or not - the fact that the scanning was happening on my device was the issue.
It’s not a very good back door if you have an explicit, easy-to-use switch to turn it off.
And even without this feature on your device, they don’t need a “back door”. They’ll just go through your front door, which is wide open and can’t be closed because of “the children”.
If you want to “own” your phone, there are manufacturers other than Apple that let you lock it down like Fort Knox (or whatever you deem secure).
Which one is it? This isn’t Schrödinger’s iPhone.
You either don’t understand or refuse to acknowledge that this is a back door into your device, and that Apple is actively scanning your files, meaning your device is now compromised.
Or are you shilling for anti-privacy?
My device, my files. I don’t want your scanning.
What’s so hard to grok about that unless you are anti-privacy?
The files WILL be scanned the second they leave your device for any major cloud.
If they don’t leave your device, then turning off iCloud (and thus the “back door”) wouldn’t have had any impact on you.
Just clearing up the argument.
There’s a difference here in principle, exemplified by the answer to this question: “Do you expect that things you store somewhere are kept private?” Where “private” means: no one looks at your things. And “no one” means: not a single person or machine.
This is the core argument. In the physical world, things stored somewhere are often still considered private (a safe deposit box, for example). People take this expectation into the cloud. Apple, Google, Microsoft, Box, Dropbox, etc. only made their scanning publicly known _after_ they were called out. They allowed their customers to _assume_ their files were private.
Second issue: does just a simple machine looking at your files count as unprivate? And what if we Pinky Promise to make the machine not really, really look at your files, only squinty-eyed? For many, yes, this also counts as unprivate. It’s the process that is problematic. There is a difference between living in a free society and one in which citizens have to produce papers when asked - a substantial difference. Having files unexamined and having them examined by an ‘innocuous’ machine are substantially different. The difference _is_ privacy: in one, you have a right to privacy; in the other, you don’t.
An aside…
In our small village, a team sweeps every house during the day while people are out at work. In the afternoon you are informed that the team found illegal paraphernalia in your house. You know you had none. What defense do you have?
There are services with E2E encryption, and for those that don’t offer it you can encrypt locally before uploading.
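For the “encrypt before uploading” route, a minimal sketch, assuming the third-party `cryptography` package and a hypothetical upload function (key management is left out):

```python
# Encrypt locally before handing files to a cloud that isn't end-to-end encrypted,
# using the third-party `cryptography` package (pip install cryptography).
# The upload call is a hypothetical placeholder - whatever provider SDK you use
# would only ever see ciphertext.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_for_upload(path: Path, key: bytes) -> bytes:
    """Encrypt the file's contents with a key that never leaves your machine."""
    return Fernet(key).encrypt(path.read_bytes())

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Recover the original bytes locally; the provider never had the key."""
    return Fernet(key).decrypt(ciphertext)

if __name__ == "__main__":
    key = Fernet.generate_key()                         # keep this somewhere safe, offline
    blob = encrypt_for_upload(Path("photo.jpg"), key)   # assumes a local photo.jpg exists
    # upload_to_cloud(blob)                             # hypothetical upload - cloud sees only ciphertext
    assert decrypt_after_download(blob, key) == Path("photo.jpg").read_bytes()
```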
Realistically speaking, if this were implemented, anybody with CSAM would simply not use iPhones, and all the scanning would be done on everyone else.
Then, once implemented and with less fanfare, some authoritarian regimes (I won’t name any, so as not to upset the tankies) could ask Apple to scan for other material too… And as it’s closed source, we wouldn’t even know that the models are different by country.
I’m amazed it’s taken so long… I think I’m on my third Android phone since they first announced it and I said “fuck no”.