Scammers are incorporating artificial intelligence into old schemes, putting a technological twist on so-called “grandparent scams.”
“With new technology coming up, there are many scammers who are changing their tactics and using A.I. as a way to further legitimize themselves to convince people to hand over money,” said Kristin Matthews of the Better Business Bureau’s division in Atlantic Canada.
Matthews said one emerging example was the use of A.I. to clone the voices of friends and family members in phony situations.
“[Scammers] are essentially taking these clips of people’s voices on social media,” said Matthews. “They’re using A.I. to manipulate these voices to make it seem like this person is calling you.”
The scammer then uses the cloned voice to make desperate pleas or aggressive demands for money.
Earlier this month, a mother in Arizona said she received a phone call in which she heard what she initially believed to be her daughter crying, followed by threatening ransom demands. Her daughter, in fact, was safe and had not been involved.
“These are known as emergency scams and prey on people’s willingness to send money to a friend or relative in need,” said Matthews, adding caller ID from these scam calls can sometimes appear as the impersonated contact, making it appear even more legitimate.
“We’ve gotten at least two [reports] a week for the last few weeks,” said Matthews. “We’re anticipating for this to become a larger scam. We would like for people to keep an eye out and know what the red flags are if they get this type of call.”
People should be skeptical whenever an individual or company pressures them to send money quickly under urgent conditions.
“Just resist the urge to act immediately because scammers really want to instill this fear in you and get you to pay up right away,” said Matthews. “Sometimes you really need to take a step back and think logically. Maybe you want to check out this story with other friends and family members.”
This article was originally sourced from www.ctvnews.ca