• t3rmit3@beehaw.org · 3 days ago

    It allows processing data without decrypting it, which is great for preventing someone else from snooping on it, but it doesn’t change the fact that Apple retains the ability to analyze the data’s content, which is the actual issue here.
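
    For anyone wondering how “processing without decrypting” can work at all: that property is what homomorphic encryption provides. Below is a toy sketch using textbook RSA’s multiplicative homomorphism. It is not the scheme Apple uses and it is not remotely secure (no padding, tiny primes); it only demonstrates computing on a ciphertext that the computing party cannot read.

    ```python
    # Toy demo of the "compute on encrypted data" property described above.
    # Textbook RSA is multiplicatively homomorphic: Enc(a) * Enc(b) == Enc(a * b) mod n.
    # NOT the scheme Apple uses and NOT secure; it only shows why a server can
    # operate on data it cannot read.

    p, q = 61, 53                       # toy primes (a real key is 2048+ bits)
    n = p * q                           # public modulus
    e = 17                              # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

    def enc(m: int) -> int:
        return pow(m, e, n)

    def dec(c: int) -> int:
        return pow(c, d, n)

    a, b = 7, 6
    ca, cb = enc(a), enc(b)

    # The "server" multiplies the ciphertexts without ever decrypting them...
    c_product = (ca * cb) % n

    # ...and only the key holder can recover the result.
    assert dec(c_product) == a * b      # 42
    ```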

    • Scrubbles@poptalk.scrubbles.tech · 3 days ago

      Reading between the lines, I guarantee they’re doing the same thing for CSAM protection. I think sex offenders caused this to happen: I believe Apple found out they were using Photos to host that horrid stuff, and Apple can’t just ignore it, so I think we have them to thank.

      • t3rmit3@beehaw.org · 3 days ago

        I would be interested to see what lines you read between, because “identifying landmarks and points of interest” doesn’t sound like anything capable of identifying CSAM. I think you’re giving a big corporation a bunch of credit there’s no reason to believe it’s owed, for an excuse they never professed.

        • Redjard@lemmy.dbzer0.com · 3 days ago

          They did this exact thing for CSAM detection a while back, and were made to stop due to public outcry.
          Back then the analysis may have happened locally, before encryption, but it still ran without the user’s consent and sent problematic results to Apple.

          It is very realistic that here they would have the device decrypt the photo, check the description against a database, and send the file and description off for reporting when a match is found.
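
          Purely to make that hypothetical concrete (this is not a documented Apple mechanism; every name, format, and threshold below is invented), the flow described above would amount to something like: derive a matchable descriptor on the device, compare it to a shipped database, and phone home only on a hit.

          ```python
          # Hypothetical sketch of the flow described above. Nothing here reflects a
          # documented Apple API; the blocklist, descriptor, and report hook are invented.

          import hashlib

          # Pretend database of known-bad content identifiers, shipped to the device.
          BLOCKLIST = {
              "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
          }

          def describe(photo_bytes: bytes) -> str:
              # Stand-in for whatever on-device analysis yields a matchable identifier.
              # A real system would use a perceptual/neural hash so near-duplicates still
              # match; SHA-256 is used here only to keep the sketch short.
              return hashlib.sha256(photo_bytes).hexdigest()

          def scan_and_maybe_report(photo_bytes: bytes, report) -> bool:
              # Check one decrypted photo against the local database; report on a match.
              descriptor = describe(photo_bytes)
              if descriptor in BLOCKLIST:
                  report(photo_bytes, descriptor)   # the step being objected to above
                  return True
              return False

          hits = []
          scan_and_maybe_report(b"test", lambda data, desc: hits.append(desc))
          print(hits)   # the toy blocklist contains sha256(b"test"), so this is a "match"
          ```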

        • Scrubbles@poptalk.scrubbles.tech · edited · 3 days ago

          Apple killed its last version in August 2023 because it didn’t respect privacy. Where there’s object detection there’s CSAM detection. Which, hey, I think is good, and I wouldn’t expect an announcement about it. I just see how they did this, and this is exactly how I’d roll out a privacy-focused CSAM detector if I were going to do it.

          From August 2023, when they killed the non-privacy-focused one: https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/

          • t3rmit3@beehaw.org · edited · 2 days ago

            > Where there’s object detection there’s CSAM detection.

            This is not true at all. A model has to be trained to detect specific things; it does not automatically inherit the ability to detect CSAM just because it can detect other objects. The method Apple previously used for CSAM image detection (perceptual hashing) was killed over its bad privacy implementation, and the article specifically notes that:

            > Tsai argues Apple’s approach is even less private than its abandoned CSAM scanning plan “because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes.”

            So even images that the local detection model doesn’t flag as CSAM would still have information about them uploaded to Apple’s servers.
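
            For context on what perceptual hashing actually is, as opposed to a trained object detector: it maps visually similar images to similar bit strings, and “detection” is just a distance check against a database of known hashes, so it can only flag material that is already in that database. A rough sketch using a generic difference hash (not NeuralHash, nothing Apple-specific, and the database value is a placeholder):

            ```python
            # Generic perceptual-hash (difference hash) sketch, to contrast hash matching
            # with a trained object detector. This is NOT NeuralHash; it is the simplest
            # perceptual hash there is, built on Pillow.

            from PIL import Image

            def dhash(img: Image.Image, size: int = 8) -> int:
                # Shrink to (size+1) x size grayscale, then record whether each pixel is
                # brighter than its right-hand neighbour. Similar images -> similar bits.
                small = img.convert("L").resize((size + 1, size))
                px = list(small.getdata())
                bits = 0
                for row in range(size):
                    for col in range(size):
                        left = px[row * (size + 1) + col]
                        right = px[row * (size + 1) + col + 1]
                        bits = (bits << 1) | (left > right)
                return bits

            def hamming(a: int, b: int) -> int:
                return bin(a ^ b).count("1")

            # A slightly brightened copy of the same image hashes almost identically...
            base = Image.effect_noise((64, 64), 32)            # random grayscale test image
            tweaked = base.point(lambda v: min(255, v + 10))   # same picture, a bit brighter
            print(hamming(dhash(base), dhash(tweaked)))        # small distance, usually 0

            # ...but "detection" is only ever a lookup against known hashes.
            KNOWN_BAD_HASHES = {0x3A6C6565499A6561}            # placeholder entry
            is_match = any(hamming(dhash(base), k) <= 5 for k in KNOWN_BAD_HASHES)
            ```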

            > Apple killed its last version in August 2023 because it didn’t respect privacy.

            It was also not that good.