
Artificial Intelligence Giving Old Scam Dangerous Twist

Elliott Greenblott
AARP
This scam has been in circulation for years: “Hello, Grampa? It’s me, Melissa (or Jimmy or Amanda).” It can take a number of different paths: “I’m in jail and need money,” “I was in a terrible accident and I’m in the hospital” or “I’ve been kidnapped and they want money.” While it goes by the name “Grandparent Scam,” you can substitute a best friend, family member or practically anyone else as the supposed victim. This is a version of the imposter scam, and it is growing in frequency and sophistication.

In the past, criminals had difficulty replicating the supposed victim’s voice but were clever enough to call at strategic times of day, such as mealtime or bedtime, when people are distracted or tired. Often the caller’s voice was muffled or unclear, but the message was urgent and demanded immediate action.

Let’s turn the clock forward to 2023 and focus on how things are different. Previously, the criminal relied purely on the victim’s emotions and random information gathering; today the criminal’s new tool is Artificial Intelligence (AI). A basic definition of AI: computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making and translation between languages. In terms of fraud, AI can search the internet in a matter of minutes and collect vast amounts of information, including social media posts, photographs, voice recordings and videos from places such as YouTube and TikTok. AI can also harvest information from databases and from anyone posting content online. The concern about AI scams is so great that both the Federal Trade Commission and the FBI have issued warnings that “Artificial Intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now.”

AI scams rely on computers that comb through readily available information. To understand the process, try searching your own name by entering it in the address bar or search field of your browser. (Note: the more publicly active a person is, the more information is available.) For a criminal, this means access to everything you uncover, as well as anything posted on social media, video-sharing sites and online databases.

The potential uses have staggering implications. An audio clip of less than 10 seconds can reveal enough of a person’s voice pattern to synthesize that voice, which makes personal uploads to social media and video-sharing services such as YouTube and TikTok valuable to criminals. Now the voice and image of a “kidnap” victim can be replicated for use in a scam. Piecing together profiles from online data can give criminals a wealth of personal information with which to commit scams, and online photos and videos can be used to create synthetic visuals for extortion. That is exactly what happened to at least eight senior citizens, according to law enforcement officials in Canada: more than $200,000 was stolen in three days by a criminal using AI-generated voices drawn from social media. AI adds a new dimension to the phrase “seeing (or hearing) is believing,” especially when it comes to fake news and fabricated social media posts.

Addressing AI scams is a game of catch-up; we can only guess how criminals will use AI next. Legitimate software companies, security firms and government agencies are working aggressively to combat the threats, but for now there are steps you can take in self-defense. Update your device’s operating system and any anti-malware software you use to the most recent version. With audio messages, listen for abnormal voice tones and patterns or unclear speech. Criminals will tug at emotions and use urgency to achieve their goals, so slow down. Tell the person you will call back, then use the time to verify the caller’s number. Look for inconsistent details in photos or videos, such as incomplete or blurred images; for example, does the person cast a shadow?

If you sense a scam, report it to the FTC (reportfraud.ftc.gov), the FBI (IC3.gov) or your state Attorney General.

— Elliott Greenblott is a retired educator and coordinator of the AARP Vermont Fraud Watch Network. He hosts a CATV program, Mr. Scammer, distributed by GNAT-TV in Sunderland, VT – www.gnat-tv.org.

