Defend Yourself against AI Impostor Scams with a Safe Word
Briefly

First, swindlers fake familiarity or authority, perhaps by stealing the identity of a friend or relative or by claiming to be a bank representative or a federal agent. Then, in that guise, they call, text or e-mail you and attempt to take your money. And now artificial intelligence has larded these scams with an additional layer of duplicity: inexpensive voice-cloning services that an impersonator can easily abuse to make deceptive, and astonishingly convincing, phone calls in another person's voice.
If there were a golden rule to thwart AI-infused phone scams, it might be something like this: Online or on the phone, treat your family members and friends as though they were an e-mail log-in page. Make up a passcode, a safe word or private phrase, and share it with them in person. Memorize it. If they call you in alarm or under unusual pressure, especially if those concerns are connected to requests for money, ask for the code to verify who is on the other end of the line.
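To make the log-in-page analogy concrete, here is a minimal sketch, in Python, of the shared-secret pattern the advice borrows from: agree on a phrase in person, keep only a salted hash of it, and check whatever a caller says against that record. The function names and the example phrase are illustrative, not part of any real product or the article's recommendation.

```python
import hashlib
import hmac
import os

def enroll(safe_phrase: str) -> tuple[bytes, bytes]:
    """Agree on a phrase face to face; store only a salt and its hash."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", safe_phrase.encode(), salt, 100_000)
    return salt, digest

def verify(claimed_phrase: str, salt: bytes, digest: bytes) -> bool:
    """Return True only if the caller's phrase matches the enrolled one."""
    candidate = hashlib.pbkdf2_hmac("sha256", claimed_phrase.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# Hypothetical usage: the phrase was agreed on in person, never over the phone.
salt, digest = enroll("purple otters at noon")
print(verify("purple otters at noon", salt, digest))  # True  - caller knows the secret
print(verify("send the money now", salt, digest))     # False - cloned voice alone is not enough
```

The point of the sketch is the design choice: a cloned voice can reproduce how someone sounds, but not a secret that was never spoken on a channel an impostor could have observed.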
Adopting a computerlike countermeasure for a problem enabled by computer algorithms is admittedly an unnatural practice. It is a human impulse to trust a family member's voice, Jennifer DeStefano, the target of an attempted scam, told a Senate judiciary subcommittee last June. Perpetrators had called her phone, claiming...
Read at www.scientificamerican.com