By Content Writer | February 10, 2026 | 4 min read
Artificial intelligence (AI) is transforming the way we live—and unfortunately, the way scammers operate. Criminals are using AI to make old scams look new, more convincing, and harder to spot. They can clone voices, create fake videos, and impersonate people you trust. The goal? Trick you into sending money or sharing personal information.
Deepfakes are AI-generated images, videos, or audio that mimic real people. Scammers use them to impersonate celebrities, government officials, or even your friends and family. They might:
Post fake videos of famous people promoting fishy investments.
Call you using a cloned voice that sounds like someone you know.
Create fake profiles with AI-generated photos.
Fraudsters use AI to make their scams more personal and believable. Here are the most common tactics:
Text-Based Scams: Scammers can use AI to create personalized phishing emails, fake social media messages, or even pretend to be someone else in a chat.
Voice Cloning: Also known as "deepfake audio", voice cloning involves using AI to copy someone’s voice. Scammers use voice cloning in vishing (voice phishing) calls. For example, a scammer might call and pretend to be a relative in distress or a bank official.
Deepfake Videos and Images: Deepfakes are AI-generated images or videos that swap or mimic someone’s likeness. This can be a fake profile picture or a video of someone appearing to say things they never did.
Investment Scams: Scammers use AI to make fake videos of “financial experts” or celebrities telling people about a “can’t-miss” deal, often involving crypto or stocks. They also use AI to spread false information designed to drive up demand for a certain stock. As more people buy, the price rises; the scammers then sell their shares at a profit. Once they stop promoting the stock, the price drops sharply and the remaining investors lose money. This trick is called a “pump and dump” scheme.
Romance Scams: You might meet someone on a dating app or social media who seems charming, but their photos or even live video chats may be deepfakes. Scammers use AI face-swapping and voice effects to look and sound like real people. Using fake accounts, they build an online relationship over weeks or months. Eventually, they will ask for money or personal financial information.
Emergency Scams: Scammers can use AI to imitate the voice or video of a family member. In the so-called “grandparent scam,” a fraudster calls you using a voice clone of a close family member or friend and claims they’re in an emergency and need money immediately.
Celebrity Endorsement (Ad) Scams: If you see a video of a celebrity promoting a product or service that you’ve never heard them mention before, be extra careful. Scammers can make fake endorsement posts with the image and likeness of a famous person. Scam artists also use government officials in their deepfake videos to convince victims of investment opportunities or to convince them to give out sensitive information.
Identity Theft: AI deepfakes are making it easier for criminals to steal people’s identities. Thieves can use stolen personal information, along with fake photos made by AI, to pretend to be someone else and try to get access to their money. Deepfake technology can trick security systems that use facial recognition or video chats, showing a fake but very realistic face. Scammers can also record someone’s voice and use AI to copy it, letting them get past voice security checks or answer security questions as if they are that person.
AI gives scammers tools to make their lies look real. A deepfake video of a celebrity or government official can make a fake investment seem legitimate. A cloned voice can make an emergency call sound urgent and believable. Scammers use all of these methods hoping that you will do what they say without taking time to think.
Be cautious with unexpected calls or messages: If something feels off, hang up and verify through another method. A family “code word” can help confirm identity.
Don’t overshare online: Limit personal details on social media. Scammers use this information to make it look or sound like they know you.
Practice cyber safety: Only click links and download files from sources you trust.
Check registration before investing: In New Brunswick, trading platforms must be registered with the Commission, and advisors must be registered with the Canadian Securities Administrators (CSA).
Pause before reacting: Scammers rely on urgency. Take a moment to verify any request for money or sensitive information.
Stay informed: Learn about the latest Common Scams.