New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”

  • Chozo@fedia.io
    2 months ago

    “Heather” also tested out Snapchat’s search tool, finding that “even though she used no sexually explicit language, the algorithm must have determined that she was looking for CSAM” when she searched for other teen users.

    But literally in just the previous paragraph:

    Posing as “Sexy14Heather,” the investigator swapped messages with adult accounts, including users who “sent inappropriate messages and explicit photos.”

    Gee, I wonder how the algorithm could’ve possibly suggested these users. What a mystery.

    I’m not defending Snapchat here - they’re a scumbag company with a scumbag product and they should be held responsible for enabling the sharing of CSAM on their platform - but it doesn’t just match you with random predators out of thin air. They went in with specific keywords in their username and a pattern of account engagement.

    • some_guy@lemmy.sdf.orgOP
      2 months ago

      That’s nuance I hadn’t considered, and I appreciate you pointing it out. I’m not on any of these sharing platforms, so I have no idea what they’re like, and that made it easy to overlook this detail, which is probably pretty relevant.