Technology is transforming the way financial scams operate, making them more sophisticated, automated, and harder to detect. From deepfake impersonations to cryptocurrency fraud and tech support scams, bad actors are leaving no stone unturned, leveraging every advanced tool at their disposal to manipulate victims and steal their assets.
This blog will look at how fraudsters are weaponizing artificial intelligence (AI), social engineering, and evolving digital tactics to exploit financial planning clients, and what can be done to combat these growing threats.
Helping Scammers Work Smarter, Not Harder
For starters, malicious actors are using AI to enhance the scale, sophistication, and effectiveness of their scams. AI-powered chatbots and deepfake technology allow them to create extremely convincing phishing emails, impersonate executives in business email compromise (BEC) scams, and even generate realistic voice recordings or videos to deceive victims.
For example, deepfake audio has been used in CEO fraud, where scammers mimic the voice of high-ranking executives and instruct employees to wire large sums of money. The scam may proceed solely via telephone or may employ a hybrid approach where a deepfake voicemail instructs the employee to check their email for the full payment instructions.
AI also facilitates automated social engineering, in which machine learning models analyze vast amounts of data from social media and the dark web to fashion hyper-personalized scam messages.
These technologies are being used for cryptocurrency fraud. AI-generated trading bots promise guaranteed profits but operate as Ponzi schemes, while generative AI tools help malefactors create fake whitepapers and websites for illegitimate crypto projects.
Unfortunately, as AI technology evolves, attackers will only hone their tactics further, making scams more difficult to pinpoint and prevent.
Seeing Is No Longer Believing
John Wilson, Senior Fellow, Threat Research at Fortra, has personally seen an instance where a scammer left a deepfake voicemail impersonating his company’s CEO. When asked if email scammers were also using AI, Wilson noted: “When we consider email scams, there is plenty of evidence to suggest the scammers may be using AI; however, it’s difficult to be certain. For example, the scam messages we used to see were almost always in English. As generative AI became more commonplace, we witnessed a corresponding increase in scams in other languages.”
Your Paycheck Took a Detour
He says that before 2024, payroll diversion scams, in which a scammer poses as an employee and attempts to socially engineer an HR staffer into modifying the real employee’s direct deposit account, generally used awkward phrasing and were clearly copied and pasted from a template.
“For instance, a pre-2024 payroll diversion email might read: ‘I wish to update my bank information before the next payroll is processed. What details do you need?’ Recently, we’ve started seeing greater variation in the message content; for example, just today, we saw the following: ‘I hope this message meets you well. I’m reaching out to let you know that I’ve recently changed banks, and I would like to request an update to my direct deposit information before the upcoming pay period is finalized.’ Our conclusion is that, yes, scammers are starting to use AI.”
The Blockchain of Broken Dreams
Cryptocurrency scams have surged, too, likely driven by the soaring popularity of digital assets, with fraudsters exploiting the anonymity and decentralized nature of blockchain technology. They use a slew of tactics, including Ponzi schemes, phishing attacks, and fake investment platforms, to deceive investors and steal their funds.
One common scam is called the “rug pull”, in which developers promote a new cryptocurrency project, attract substantial investments, and then suddenly abandon the project, draining all liquidity. A notorious example is the Squid Game token scam in 2021, which saw developers launch a cryptocurrency inspired by the popular Netflix show, only to vanish with over $3 million after preventing investors from selling their tokens.
If It’s Too Good to Be True
Another widespread crypto scam revolves around fake giveaways and celebrity impersonation on social media. Threat actors create fake posts or hijack verified accounts to claim that public figures like Elon Musk or Vitalik Buterin are giving away free Bitcoin or Ethereum in exchange for a small upfront “verification” payment.
In 2020, a major Twitter breach led to the hacking of accounts belonging to Musk, Bill Gates, and Apple, promoting a fake giveaway that stole over $118,000 in Bitcoin within hours. Phishing attacks, where scammers create fake websites aping legitimate crypto exchanges or wallets to trick users into entering their credentials, are also popular. Once compromised, victims find their accounts drained, and recovery is practically impossible due to the immutable nature of the blockchain.
A Ransom or Your Reputation
Wilson says cryptocurrency scams are increasingly common, and in addition to pig butchering scams, the industry is seeing a massive increase in blackmail scams in which the attacker claims to have hacked the victim’s computer and threatens to distribute compromising webcam video to all the victim’s contacts unless a cryptocurrency ransom is paid.
“The messages include personal details about the victim, such as the victim’s home address or phone number. We recently analyzed a few thousand of these attacks and discovered that 14% of the cryptocurrency wallets had transactions on the blockchain. This suggests that scammers are having a high rate of success with blackmail scams.”
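Wilson’s figure comes from checking whether the extortion wallets ever received funds. For readers curious how such a check looks in practice, here is a minimal Python sketch; the Blockstream Esplora endpoint and the placeholder addresses are assumptions made for illustration, not the tooling or data behind Fortra’s analysis.

```python
import requests

# Placeholder Bitcoin addresses of the kind extracted from blackmail emails.
# These are NOT real data from the analysis; replace with addresses you collect.
ransom_addresses = [
    "bc1q-placeholder-address-0001",
    "bc1q-placeholder-address-0002",
]

def has_transactions(address: str) -> bool:
    """Return True if the address shows any on-chain activity.

    Uses the public Blockstream Esplora API; any block explorer with an
    address-lookup endpoint would work just as well.
    """
    resp = requests.get(f"https://blockstream.info/api/address/{address}", timeout=10)
    resp.raise_for_status()
    return resp.json()["chain_stats"]["tx_count"] > 0

funded = sum(1 for addr in ransom_addresses if has_transactions(addr))
print(f"{funded} of {len(ransom_addresses)} wallets show on-chain activity "
      f"({funded / len(ransom_addresses):.0%})")
```

A wallet with incoming transactions does not prove a victim paid, but at scale it is a reasonable proxy for how often these extortion attempts succeed.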
Tech Support or Trick Support?
Tech support scams are becoming rife as well. These exploit people by impersonating legitimate IT services to harvest personal data or demand fraudulent payments. Malicious actors often pose as representatives from well-known companies such as Microsoft, Apple, or antivirus providers, contacting victims via phone calls, pop-ups, or phishing emails.
One common tactic involves fake security alerts warning users of malware infections, prompting them to call a bogus helpline. In 2023, the FBI’s Internet Crime Complaint Center (IC3) reported some 37,560 complaints related to tech support fraud, with losses of $924,512,658.
An example is the Tech Support Refund scam, where bad actors claim to offer refunds for expired or unsatisfactory services, but their real intention is to trick victims into providing personal information or payment details in order to steal their money. Another rampant scheme is remote access fraud, where fraudsters convince users to install remote desktop software, giving them control over the victim’s device to steal sensitive information or deploy ransomware.
A Hybrid Email/Phone Approach
Tech support scams are still common, says Wilson. However, scammers have had to alter their tactics in response to industry efforts to curb these scams.
“Because most mobile carriers now provide warnings or even block scam calls outright, scammers are now using a hybrid email-phone approach. The scammer sends the victim a message claiming that a subscription has been renewed for another year. The email includes a phone number to call to cancel the subscription.”
He explains that using email as the initial lure provides three advantages to the scammer. First, because the victim is placing an outbound call, the scammer circumvents the mobile phone provider’s inbound scam warnings. Second, the attacker can reach millions of potential victims and is not limited by the number of outbound calls they can place. Finally, the victims who call have already been tricked by the original email message and are presumably more likely to fall for the rest of the scam.
A Scam Exposed Is a Scam Weakened
When it comes to reporting scams, there is a centralized place to do so (ic3.gov), but the cruel joke, says Wilson, is that the people who know about the site are usually quite savvy about online scams, while those most susceptible to scams have no idea where to report them.
“No technology solution can prevent every one of these scams, so education is a key component of any large-scale defense. I would like to see popular television shows incorporate some education about scams into their plot lines. For example, a character could get caught up in a romance scam. Anyone who watched the episode would then know about romance scams and might be less likely to fall for one.”
Speaking of the steps he has personally taken to protect clients from technology-enabled scams, Wilson says: “I’ll give you an example from just this past weekend. A friend of mine sent me a screenshot of a phishing SMS she’d just received. I did a bit of analysis and was able to confirm that the phishing link led to a phishing site that had been registered earlier that day at a popular registrar, where I happen to have some excellent contacts. I packaged up the evidence and sent it to one of my contacts, who quickly knocked the site offline. The total time from when my friend received the SMS until the site was offline was 59 minutes.”
He says the reason this worked was because of interpersonal relationships. “My friend knew to forward the SMS to me. I knew how to analyze the link and gather the necessary evidence to send to my contact. My contact had the ability to act.”
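The first step of that triage, confirming that a suspicious domain was registered only hours earlier, is easy to script. Below is a minimal sketch assuming the python-whois package and a made-up domain name; it illustrates the idea rather than reproducing Wilson’s actual workflow.

```python
from datetime import datetime, timezone

import whois  # python-whois package (assumed installed: pip install python-whois)

def domain_age_days(domain: str) -> float:
    """Return the domain's age in days, based on its WHOIS creation date."""
    record = whois.whois(domain)
    created = record.creation_date
    if created is None:
        raise ValueError(f"No WHOIS creation date found for {domain}")
    # Some registrars return a list of creation dates; use the earliest.
    if isinstance(created, list):
        created = min(created)
    if created.tzinfo is None:
        created = created.replace(tzinfo=timezone.utc)
    return (datetime.now(timezone.utc) - created).total_seconds() / 86400

# Made-up domain standing in for the link in the phishing SMS.
suspect = "example-parcel-update.com"
age = domain_age_days(suspect)
print(f"{suspect} was registered {age:.1f} days ago")
if age < 1:
    print("Registered within the last 24 hours: a strong phishing indicator.")
```

The takedown itself, the part that actually pushed the site offline in under an hour, still depended on the kind of human contacts Wilson describes.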
Vigilance, Education, Proactive Security
Unfortunately, fraudsters will only continue to refine their tactics, using AI and digital tools to carry out their evil deeds. Staying ahead of these threats requires a mixture of vigilance, education, and proactive security measures.
While technology can be, and is being, used to deceive, it can also be used to root out and prevent fraud, meaning awareness and collaboration are key weapons in the fight against financial scams.