In 2023, an employee of multinational design and engineering company Arup thought he was just doing what he was told. During a video call, the chief financial officer directed him to transfer $25 million to a specific account.
Except it wasn’t really the company’s CFO. Instead, it was a deepfake created by hackers to trick the Arup employee. Not only did the hackers spoof the CFO’s image and voice; they also created digital replicas of fellow employees to commit this fraud.
While this is an extreme example, what happened at Arup does show that spear phishing attacks, now enhanced by AI, are a turbocharged problem.
In 2024, forensics and identity verification company Regula found that half of all businesses have experienced fraud involving audio and video deepfakes. It also found that respondent businesses across industries and around the globe have lost nearly $450,000 each, on average, to deepfakes.
“I don’t think community banks are more or less affected than anyone else,” says Scott Anchin, vice president of operational risk and payments policy for ICBA, “but it’s certainly a growing concern.”
AI adds fuel to the fire
Spear phishing is when a hacker takes someone’s personal information—usually willingly shared on social media—and tailors an email directly to them. Unlike blanket scams sent to millions with the hope that someone will click, these are designed with a specific person in mind.
Through these attacks, hackers are most likely trying to collect someone’s sensitive information, says Kinny Chan, chief commercial officer of the identity fintech Trust Stamp. They might get an email that looks like it’s from a bank or a coworker with a “link that looks identical to the bank’s website,” he says, “but in reality, it’s just a site that is seeking to capture your username and password.”
Once a hacker has that information, they can take over an account. If it’s a customer, they can drain their funds. If it’s a bank employee, they could seed a range of attacks, including collecting important employee information or directing fellow employees to move money to places it shouldn’t go. They can also use that entry into a community bank’s internal systems to launch a ransomware attack, Chan adds.
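The lookalike-link trick Chan describes exploits the gap between what an email displays and where its link actually points. A minimal sketch of the defensive idea, checking a link's real hostname against a trusted bank domain (the domain name here is hypothetical, and real phishing filters do far more than this):

```python
from urllib.parse import urlparse

# Hypothetical domain for illustration: the bank a message claims to be from.
TRUSTED_DOMAIN = "examplebank.com"

def is_suspicious_link(url: str, trusted_domain: str = TRUSTED_DOMAIN) -> bool:
    """Flag links whose actual hostname doesn't belong to the trusted domain.

    A phishing email can show friendly link text while the underlying URL
    points elsewhere; comparing the parsed hostname exposes that mismatch.
    """
    host = (urlparse(url).hostname or "").lower()
    # Accept the domain itself or a true subdomain; reject lookalikes
    # such as "examplebank.com.evil.net", which merely *starts* with
    # the trusted name but is registered under a different domain.
    return not (host == trusted_domain or host.endswith("." + trusted_domain))

print(is_suspicious_link("https://examplebank.com.evil.net/login"))  # True
print(is_suspicious_link("https://online.examplebank.com/login"))    # False
```

The point of the sketch is the principle behind the scam: the text a victim reads and the destination the browser visits are two different things, and only the parsed hostname tells the truth.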
While spear phishing itself isn’t new, AI is making the hacker’s job easier. “Now with AI, [fraudsters] don’t have to spend hours and hours of time trudging through social media posts to find out where your kids go to school, what times you do certain things, when you go on vacation,” says David Brauchler, technical director and head of AI and machine learning security at IT security company NCC Group. AI can do that, and do it quickly. “It can go through everything a person has posted online in seconds.”
Community banks may not seem like ripe targets, but that hardly matters anymore, Brauchler says: AI has made attacks so much easier to carry out at scale that almost anyone is a target, “because hackers are operating at higher efficiency.”
Putting a stop to spear phishing
Quick Stat
50%
of all businesses have experienced fraud involving audio and video deepfakes.
Source: Regula
Education is one way community banks can help prevent spear phishing, says Anchin. That includes training employees on the threats that are prevalent right now and what they might look like in practice. Community banks should also inform customers about the potential attacks, disguised as communication from their bank, that might come their way.
“A lot of community banks are doing proactive outreach to educate [clients] about fraud and scams, which is a really positive step, because it spurs people to think critically when they’re approached with a request,” says Anchin.
Community banks should also be “prepared for the eventuality that something will slip through the cracks,” he says. He adds that they should have an incident response plan in place in case a phishing attack—or any other kind of breach—is successful.
Another way to curb the damage from a successful spear phishing attack is to have multiple checkpoints for big transactions, says Brauchler. In the case of the fraud at Arup, “that employee shouldn’t have had that power without going through several people.” Getting extra approvals for transferring funds might seem annoying, but it can also catch fraud as it’s being pushed through the system.
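The multi-checkpoint control Brauchler describes can be sketched in a few lines. This is an illustrative toy, not a real banking system: the threshold, approver count, and names are all assumptions, but the core rule, that no single employee can push a large transfer alone, is the one that would have stopped the Arup fraud:

```python
from dataclasses import dataclass, field

# Assumed policy values for illustration only.
APPROVAL_THRESHOLD = 10_000   # dollars; transfers at or above this need sign-off
REQUIRED_APPROVERS = 2        # distinct people, not counting the requester

@dataclass
class TransferRequest:
    requester: str
    amount: float
    destination: str
    approvals: set = field(default_factory=set)

    def approve(self, approver: str) -> None:
        # The person asking for the money can never be one of its approvers.
        if approver == self.requester:
            raise ValueError("Requester cannot approve their own transfer.")
        self.approvals.add(approver)

    def can_execute(self) -> bool:
        # Small transfers go through; large ones need independent sign-off.
        if self.amount < APPROVAL_THRESHOLD:
            return True
        return len(self.approvals) >= REQUIRED_APPROVERS

req = TransferRequest("employee_a", 25_000_000, "acct-123")
print(req.can_execute())   # False: no approvals yet
req.approve("manager_b")
req.approve("controller_c")
print(req.can_execute())   # True: two independent approvals collected
```

Even a deepfaked "CFO" on a video call cannot satisfy this check, because the approvals must come from separate, authenticated accounts, each a fresh chance for someone to notice the fraud.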
Community banks may also want to evaluate the risks of the biometrics that they use to identify customers, Brauchler says, especially voice recognition. “With five seconds of audio, [fraudsters] can make an identical clone of anyone in the world,” he says.
But all is not lost, Anchin adds. The sheer prevalence of scams like this means people are talking about them. In 2024, there was even a movie, “Thelma,” about a 93-year-old woman who gave $10,000 to a fraudster posing as her grandson.
“Community banks are being very proactive about educating their customers about these scams,” says Anchin, “and that’s very positive.”