Artificial intelligence (AI) is the latest shiny trend on the technology front. People are talking about using it everywhere, from writing to customer service. That includes scammers, who have devised ways to use AI to trick people out of their hard-earned cash.
It’s called deepfake.
A deepfake is synthetic media, such as a photo, audio clip or video, that has been created or digitally manipulated by AI to replace one person's likeness with another's. Fraudsters can use deepfakes in a number of ways to make people believe they are getting a message from, or are talking to, a legitimate institution, a loved one or even a romantic interest.
"It's important to stay vigilant and understand what is out there to prevent ourselves from falling victim," said Scott Edwards, director of fraud risk management and financial crimes at BOK Financial®. "People are starting to be aware of phishing emails and texts and robocalls. Fraudsters see we're on to that, and they've moved on. Voice scams are the latest iteration of the scam revolution."
“Scammers only need three seconds of your voice to imitate it. It's really important to try to limit your biometrics that are out there.”- Scott Edwards, director, fraud risk management, financial crimes at BOK Financial
Here are four voice scams to be aware of, Edwards advises:
- Family member requests for money: Using deepfake voice technology, scammers can impersonate family members to call and say they're in trouble and need you to transfer them funds or disclose sensitive financial information.
What to do: Try to think clearly and ask yourself if the call feels legitimate. For instance, is the caller saying your family member is in Mexico, when you know they would never go to Mexico? Before responding, try to verify the situation by calling or texting the family member directly.
- Personal information theft: Deepfake voice technology can be used to gather sensitive data by pretending to be customer service representatives or conducting fraudulent surveys over the phone.
What to do: Avoid sharing personal or account details over the phone unless you called the institution yourself using a verified phone number.
- Phishing attacks: Scammers may leverage deepfake voices to create realistic voicemail messages instructing you to call back and provide confidential information or login credentials to your accounts.
What to do: Exercise caution before returning calls based solely on voicemail instructions. Verify the legitimacy of the request by calling the financial institution using the official contact information provided on the institution's website or mailed statements.
- Romance scams: Fraudsters can create a deepfake face and voice matching the preferences on a victim's dating app profile. They will engage the victim in conversation and try to trick them into saying something incriminating to blackmail them, or pull at their heartstrings by asking for money for a dire situation, such as unpaid rent or a loved one who needs surgery.
What to do: Stay alert to potential scams when engaging with someone you do not know. Be aware that the person on the other end of a phone call or video chat may not be who they appear or sound like. Never send money to someone you've only talked to through technology. If you plan to meet them in person to verify, do so in a public place.
Be mindful of your voice footprint
Your voice footprint is the set of quantifiable traits of your voice that can be measured mathematically and matched to you and no one else. From a security standpoint, much like a fingerprint, the uniqueness of a voiceprint makes it a good way to identify you as you.
"Scammers only need three seconds of your voice to imitate it," Edwards said. "It's really important to try to limit your biometrics that are out there." Today, recording videos for social media and appearing on podcasts are common and even encouraged, but it's often difficult to know who's listening on the other end.
"Know who you're connected to on social media," cautioned Edwards. "Even voicemails can be used. Try to use generic language, so it can't be emulated. Understand that the more your biometrics are out there, the higher the risk of you becoming a victim."
If you think you're being scammed
If you get a call that you think may be a scam, Edwards reiterated the importance of not reacting impulsively. If a loved one says they're in trouble, check on the loved one yourself. If you get a call from an institution, call the institution yourself with a verified phone number to see if the information you're being given is valid.
“Always take a breath and ask yourself if this makes sense. Criminals are always trying to target people that are naive to potential scams and make them feel rushed, so they react without thinking to try to fix the problem.”- Scott Edwards, director, fraud risk management, financial crimes
Once you've verified it's fraudulent, report the scam to the FTC so it can warn others about the scheme. You can also report the incident to your financial institution, local law enforcement and, where applicable, the FBI.
Scammers play a numbers game: they target many victims, hoping to find the ones who aren't aware of the current landscape of scam techniques. Your best protection is to stay informed, so you don't become one of their victims.
Learn more about BOK Financial's online security or call 844-517-3308 to report suspicious activity on BOK Financial-related accounts. The Cybersecurity and Infrastructure Security Agency also keeps an up-to-date list of current threats.