• phillaholic@lemm.ee

    It was client-side scanning, and only if you chose to upload those files to iCloud. The equivalent of having your ID checked before you enter a club.

    • rikonium@discuss.tchncs.de

      Yes, however my chief concern (others may have other concerns; this is just off the top of my head) was that it broke a major barrier: explicitly user-hostile code would be running on the device itself, one I own. I’d say it’s more the equivalent of club employees entering your home to check your ID prior to, or during, your club visit, and using your restroom/eating a snack while they’re there (the scanning would use “your” device’s resources).

      There’s also the trivial nature of flipping the require_iCloud_photos=“true” value to “false”, whether by intention or by accident. I have an open ticket with Apple support where my Apple Maps saved locations, favorites, guides, Home, reports, and reviews ALL vanished without a trace. I just got a callback today saying that engineering is aware of the problem and that it’s expected to be resolved in the next iOS update. In the meantime, I’m SOL, so accidents and problems can and do happen, and Apple isn’t the police either.
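
      To make that concrete, here’s a minimal sketch in Python of how thin that barrier is, assuming a purely hypothetical flag (none of these names are Apple’s actual code):

      ```python
      # Hypothetical sketch: the "only scan what goes to iCloud" promise can
      # reduce to a single boolean. All names here are illustrative only.

      ONLY_SCAN_ICLOUD_UPLOADS = True  # the policy everyone is trusting

      def should_scan(queued_for_icloud: bool) -> bool:
          if ONLY_SCAN_ICLOUD_UPLOADS:
              return queued_for_icloud  # advertised behavior: uploads only
          return True                   # flag flipped: scan everything on device

      # Flipping one constant (by update, bug, or order) silently widens the
      # scan from "iCloud uploads only" to the whole photo library.
      ```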

      And on top of that, there are also concerns about upstream perversion of the CSAM database for other purposes. After all, who can audit it to ensure it’s used for CSAM exclusively, and who can add to it? Would those images from the device and database be pulled out for trials, or would it be a “trust the machine, the odds of false positives are x%” situation? (I believe those questions might have been answered already when the controversy was flying, but there are just a lot of cans of worms waiting to be opened with this, as well as Apple being pressured to scan for more things once the technology has been made.)

      • phillaholic@lemm.ee

        The CSAM database isn’t controlled by Apple. It’s already in use practically everywhere. Apple tried to compromise between allowing private encrypted image storage at scale and making sure they aren’t a hotbed for CSAM. Their competitors just keep it unencrypted and scan it for content, which, last time I checked, is worse 🤷‍♂️

        • Natanael@slrpnk.net

          But Apple still fetches that list of hashes, and they can be made to send an alternative list to scan for.

          • phillaholic@lemm.ee

            It’s not very useful for much else. It only finds known copies of existing CSAM; it doesn’t detect new ones. Governments could already force Apple to do whatever they want, so it’s a leap to say this is going to do much more.
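
            Roughly, hash-list matching behaves like this sketch (hypothetical; the real system used perceptual hashes rather than SHA-256, but the point is the same: only images already on the supplied list can ever match):

            ```python
            # Hypothetical sketch, not Apple's code: matching against a fetched
            # hash list can only flag images *already in that list*. Whoever
            # supplies the list decides what gets flagged; new images never match.

            import hashlib

            def fetch_hash_list() -> set[str]:
                # Stand-in for the opaque database shipped to devices.
                return {"hash-of-known-image-1", "hash-of-known-image-2"}

            def is_flagged(image_bytes: bytes, known_hashes: set[str]) -> bool:
                digest = hashlib.sha256(image_bytes).hexdigest()
                return digest in known_hashes  # exact membership test

            known = fetch_hash_list()
            print(is_flagged(b"a brand-new photo", known))  # False: not a known copy
            ```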

            • mahony@lemmy.world

              You go way out of your way to lick Apple’s boot here. By comparing hashes to whatever Apple wants, or is told, to scan for, you can profile everyone, find leaked material the government doesn’t want you to have, and so on. The fact that people just accept it, or even endorse it, is beyond me, but again, after the last 3 years I came to the conclusion that most people are scared to be free.

              • phillaholic@lemm.ee

                While scanning for leaked government documents is the first thing I’ve heard of that could be a problem for whistleblowers, I’ll point out that this scanning tech is already in use on major cloud platforms, and no government has forced anyone to do it. Having a database of all government documents like that wouldn’t be trivial to put together either. It’s just not practical to be used that way.

                I don’t care that it was Apple who did this; it presents a legitimate path to E2E encryption of data while cutting many government arguments off at the knees. Without an answer, we are closer to E2E being made illegal than we are to nothing happening.

                • mahony@lemmy.world

                  Yes, that’s why I don’t use the cloud and have a de-Googled Android. The problem is that this is a slippery slope. I can say I don’t mind because it doesn’t affect me, but step by step they outlaw everything else, even custom ROMs and alternative app stores. Either people are against it, or this will get much worse down the line.

                  • phillaholic@lemm.ee

                    I don’t think it’s a slippery slope. That ship sailed when we started putting our data on other people’s computers. Your situation is extremely niche; not many are going to go through that effort.

    • Uriel238 [all pronouns]@lemmy.blahaj.zone

      Let’s say my grandson came to a realization that he was actually my granddaughter. She grows her hair long. She practices with make-up and gets some cute dresses and skirts, and is totally into it.

      Now Apple knows.

      And any law-enforcement interests that think it’s wrong or abusive can, by fiat, force Apple to let them know.

      Same if my grandkid decides they are pagan and goes from wearing a cross to wearing a pentacle.

      Same if law enforcement notices that they are caramel-colored while mom is Germanic-pale and dad is dark brown.

      The US is a society in which neither law nor law enforcement is on our side, and either can at any time decide that arbitrary life shit is worthy of sending a SWAT team to collect us. And the GOP is determined to make it worse.

      • Kelsenellenelvial@lemmy.ca

        Not really. The plan that Apple backpedaled on was to compare hashes of photos on the device to hashes of known CSAM material. They wouldn’t see any user-generated photos unless there was a hash collision. Other companies have been known to report false positives on user-generated photos and delete accounts with no process to recover them.

        • phillaholic@lemm.ee

          They published a white paper on it. It would have taken many detected examples before they did anything about it. It’s not strictly a hash, as it’s not looking for exact copies but similar ones. Collisions have been proven, but afaik they are all reverse-engineered: just grey blobs of nonsense that match CSAM examples. I don’t recall hearing about someone’s randomly taken photo matching with anything, but correct me if I’m wrong.
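
          A toy sketch of that thresholding idea (the actual design used cryptographic threshold secret sharing, and 30 is just the widely reported figure, so treat both as illustrative):

          ```python
          # Toy sketch: nothing reaches human review until the match count
          # passes a threshold. The real system used threshold secret sharing;
          # MATCH_THRESHOLD = 30 is the widely reported figure, not confirmed.

          MATCH_THRESHOLD = 30

          def account_needs_review(match_events: list[bool]) -> bool:
              return sum(match_events) >= MATCH_THRESHOLD

          # One stray collision among thousands of photos stays far below the
          # threshold, which is why isolated false positives were not supposed
          # to surface to a reviewer at all.
          print(account_needs_review([True] + [False] * 5000))  # False
          ```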

          • Kelsenellenelvial@lemmy.ca

            True, it’s hash-like in that the comparison uses some mathematical representation of the source material. It was intended to be a little fuzzy so it would still catch minor alterations like cropping, watermarks, rendering to a new format, etc.
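
            For a feel of how a fuzzy image hash behaves, here’s a sketch assuming a dHash-style scheme (Apple’s NeuralHash used a neural network instead, but the comparison step is similar in spirit: hashes within a small Hamming distance count as a match):

            ```python
            # Sketch of a difference-hash ("dHash") style perceptual hash: small
            # pixel edits barely move the hash, so near-duplicates still match.
            # Illustrative only; NeuralHash works differently internally.

            def dhash(pixels: list[list[int]]) -> int:
                """Hash a grayscale image (rows of 0-255 ints) by comparing neighbors."""
                bits = 0
                for row in pixels:
                    for left, right in zip(row, row[1:]):
                        bits = (bits << 1) | (left < right)
                return bits

            def hamming(a: int, b: int) -> int:
                return bin(a ^ b).count("1")

            original = [[10, 20, 30], [40, 50, 60]]
            watermarked = [[12, 20, 30], [40, 50, 61]]  # slightly altered copy
            # The altered copy still lands within the match distance:
            print(hamming(dhash(original), dhash(watermarked)) <= 2)  # True
            ```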

            The example I heard of was someone using an app for a remote doctor’s appointment. The doctor requested photos of the issue, a rash in the genital area of a minor; supposedly one included an adult hand touching the area involved. That photo ended up in Google’s cloud service, where it was flagged and reported to law enforcement, and that user’s whole Google account was frozen. The investigation quickly confirmed the innocence of the photos and provided official documentation of such, but last I heard, Google would not release the account.

            • phillaholic@lemm.ee

              Google has unencrypted access to your files to do whatever they want with. Do we know this was the same CSAM system or one of Google’s internal ones? Google Photos does its face and object scanning in the cloud, whereas Apple does it on device.

        • Uriel238 [all pronouns]@lemmy.blahaj.zone

          This assumes the program stays that way. Much the way Google promised no human would look at (or be able to look at) the data set, we don’t have an external oversight entity watching over Apple.

          And then there’s the matter of mission creep, much the way the NSA PRISM program was supposed to only deal with foreign threats to national security (specifically Islamist terrorism) yet now it tells local precincts about large liquidatable assets that can be easily seized.

          Even if it only looks at hash codes, it means law enforcement can add its own catalog of hashes to isolate and secure, say, content that is embarrassing to law enforcement, like videos of police gunning down unarmed, unresisting suspects in cold blood, which are challenged only when the event is captured on a private smartphone.

    • regalia@literature.cafe

      You’re paying to reserve some space in their cloud to store your encrypted bits. If you exchange money for that space, then you’re entitled to have it encrypted and private.

      • phillaholic@lemm.ee

        Find me any place you don’t own where you can store your stuff with no restrictions on what you store there.

        • regalia@literature.cafe

          Something like Proton Drive, or a self-hosted Nextcloud instance. If it’s encrypted, it’s nobody’s business.

          • phillaholic@lemm.ee
            1 year ago

            Not according to their terms of service:

            You agree not to use your Account or the Services for any illegal or prohibited activities. Unauthorized activities include, but are not limited to: Disrupting the Company’s networks and Servers in your use of the Services; Accessing/sharing/downloading/uploading illegal content, including but not limited to Child Sexual Abuse Material (CSAM) or content related to CSAM;

            • regalia@literature.cafe

              It’s E2EE; that’s just for them to legally cover their ass. They have zero knowledge of what’s uploaded.
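
              The principle in a minimal sketch (using the third-party Python cryptography package; Proton’s real scheme is OpenPGP-based, so this only illustrates the idea):

              ```python
              # Why "zero knowledge" holds: the client encrypts before upload and
              # never sends the key. Requires `pip install cryptography`; Proton's
              # actual scheme is OpenPGP-based, so this is illustrative only.

              from cryptography.fernet import Fernet

              def upload_to_provider(blob: bytes) -> None:
                  # Stand-in for the network call; the server only sees this blob.
                  print(f"server stored {len(blob)} opaque bytes")

              key = Fernet.generate_key()              # stays on the user's device
              ciphertext = Fernet(key).encrypt(b"my photo bytes")
              upload_to_provider(ciphertext)           # provider cannot read it
              ```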

              • phillaholic@lemm.ee

                Proton hasn’t really gotten pushback yet because they’re small. If pedophiles start utilizing Proton for CSAM, I guarantee you things will change or they will shut down. Another full-E2E provider (I can’t recall the name at the moment) just ended up shutting their service down when governments started coming after them. They aren’t the guys from The Pirate Bay.

                • regalia@literature.cafe

                  That’s an attack on e2ee, not on any specific provider. CSAM is just one of the ways they use to criminalize encryption.

                  • phillaholic@lemm.ee

                    It’s also a real-world problem, and positioning yourself as a safe haven for it isn’t going to work. Apple was trying to let you have E2E while simultaneously destroying many governments’ main objection to it. Now we are back to square one, and if providers refuse to work with governments, governments will attack E2E encryption.