• HiddenLayer5
    1 year ago

    AI scams are already rampant in which the scammer pretends to be a loved one asking for help (read: “I’m in a bad situation right now, can you send me as much money as you can?”). Unsurprisingly, it’s disturbingly effective, especially on older people.

    Just a reminder that the tech companies absolutely do not see the above as an issue, BTW. In fact, all they seem to do is tacitly endorse it by advertising that you can use their service to clone people and “bring them to life” virtually and stuff. They’re still making money when you use the AI (not to mention they collect and retain the training data you give them, with or without the subject’s consent), and it’s not like it’s easy for investigators to tell which AI was responsible for a particular scam campaign, so there’s really no risk to their reputation at all.

    I’m serious when I say this: if you have elderly or otherwise less tech-inclined family members, and especially if your voice and/or photos are publicly available online, set up some kind of password that has to be given correctly before they send you money, absolutely no exceptions, no matter how distressed “you” look or sound. It can be as simple as a word or phrase, or pick a specific shared memory that people outside your family don’t know about and always mention it before asking for money.

    Do this in advance, and explain to them that AI can now convincingly replicate human speech and even photos and videos, and that if “you” don’t know the password, they should hang up or block the account immediately and not respond further. You might even want to practice with them if they might forget. The vast majority of these scammers are just scraping the internet for information and have no idea who either of you are, so even a simple check like this should significantly reduce the risk of being scammed.