Scammers are always looking for new ways to defraud their victims, and with the increasing sophistication of technology, they now have more tools at their disposal than ever before. One of the latest techniques scammers are using to carry out emergency scams is voice cloning, which uses artificial intelligence (AI) to replicate a person’s voice. Scammers need only a few seconds of a person’s recorded speech to create a convincing clone and carry out these AI-powered emergency scams.
How do scammers use voice cloning to carry out emergency scams?
These scams typically involve a fraudster impersonating a loved one in distress, urging the victim to send money immediately. Here’s how scammers use voice cloning to carry out AI-powered emergency scams:
Impersonating a loved one
To carry out the impersonation, scammers typically gather information about their target’s family members through social media or other online sources. They may purchase personal information from data brokers or hack into email accounts to gain access to contact lists and other sensitive data.
Once they have enough information, scammers can use voice cloning technology to create a convincing fake recording. They may use the recording to make it seem like the loved one is in distress. For example, they might claim that they have been in a car accident, have been kidnapped, or are in jail and need bail money. They may ask for money to be wired or transferred to a specific account to help them out of the situation.
Creating fake recordings
Scammers can use AI tools to create fake recordings of a loved one’s voice. This process is known as voice synthesis, and it involves using artificial intelligence algorithms to analyze a short audio sample of a person speaking, learn the characteristics of their voice, such as pitch, accent, and cadence, and then generate new, realistic-sounding speech in that voice.
Scammers can obtain a short clip of a person speaking by searching for publicly available videos on social media platforms such as TikTok or Instagram. Many voice synthesis tools are free or very low cost and require only a few seconds of audio to generate a convincing voice clone. This means that scammers can create a fake recording of a person’s voice with very little effort.
Manipulating voice messages
Scammers can also use AI tools to manipulate existing voice messages from a loved one. They can change the tone of the message to make it seem like the person is in distress or needs urgent assistance, such as bail money or medical expenses. By manipulating the victim’s emotions, scammers can convince them to act quickly without taking the time to verify the legitimacy of the message.
How can you avoid these AI-powered emergency scams?
By staying informed and taking precautions, you can avoid these scams. Follow these tips to protect yourself against AI-powered emergency scams:
Use a family code word
Create a family code word that only you and your loved ones know. This word can be used in an emergency to verify that the person on the other end of the line is who they say they are. Keep the code word private; never share it online or with anyone outside your family and close friends.
Be cautious of unsolicited calls or messages
If you receive an unexpected call or message from someone claiming to be a loved one, be cautious. Take the time to verify the person’s identity before sending any money. We recommend hanging up and calling that person back directly, using a number you already have, to confirm they actually contacted you.
Be careful with your personal information
Scammers can easily obtain a few seconds of your voice from videos you post to your social media accounts. Be careful with the content and information you share online and limit the amount of personal information you make public. Consider making your profiles private, so that only your friends and family can see your posts.
Key Takeaways
Voice cloning gives scammers a powerful new way to carry out AI-powered emergency scams. By staying cautious and using a family code word, you can protect yourself. Remember: if you receive an unexpected call or message asking for money, take the time to verify the person’s identity before sending anything.