Deepfake Audio Nabs $35M in Corporate Heist
A combination of business email compromise and deepfake audio led a branch manager to transfer millions to scammers, in a case that serves as a warning to organizations.
October 20, 2021
A group of fraudsters made off with $35 million after using forged email messages and deepfake audio to convince an employee of a United Arab Emirates company that a director had requested the money as part of the acquisition of another organization, according to a request filed in US federal court last week.
The attack targeted a branch manager with emails that appeared to come from the director and from a US-based lawyer whom the emails designated as coordinator of the acquisition. The attack is the latest known case to use synthetic audio, created with machine-learning models known as neural networks, to mimic the voice of a person familiar to the targeted employee.
Because a variety of open source tools now allow anyone to create deepfakes, both video and audio, synthesized voices will likely become a standard part of cybercriminals' techniques, says Etay Maor, senior director of security strategy at network security firm Cato Networks.
"If there is money to be made, you can be sure that attackers will adopt new techniques," Maor says. "It's not super-sophisticated to use such tools. When it comes to a voice, it is even easier."
The corporate heist is the second publicly known attack to use deepfake audio. In 2019, the manager of a UK subsidiary of a German company received a call from what sounded like his Germany-based CEO, whom he had previously met. At the fake CEO's request, he transferred €220,000 to a supposed vendor. The manager did not become suspicious until the same person posing as the CEO called again two days later to ask for another €100,000, and he noticed the call came from an Austrian phone number, not a German one.
The success of these attacks comes down to trust, Maor says. A call from someone you know asking for money is very different from an email claiming to be from a Nigerian prince, and an employee speaking with a person they believe is their CEO is far more likely to transfer the money.
The solution for most companies will have to go back to "never trust, always verify," he says.
"We are going to have to adopt some of the principles of zero trust into this world of relationships," he says. "It does not have to be a technological solution. A process of verifying may be enough."
The US Department of Justice filing contains few details of the United Arab Emirates investigation. It alleges that a US-based lawyer had been designated to oversee the acquisition and that Emirati investigators tracked two transfers totaling $415,000 deposited into accounts at Centennial Bank in the United States.
"In January 2020, funds were transferred from the Victim Company to several bank accounts in other countries in a complex scheme involving at least 17 known and unknown defendants," stated the request to the US District Court for the District of Columbia. "Emirati authorities traced the movement of the money through numerous accounts and identified two transactions to the United States."
The request asked the court to designate a DoJ lawyer as the point of contact in the US for the investigation.
The technology to create realistic fake audio and video of people using generative adversarial networks (GANs) has fueled fears of deepfakes wreaking havoc in political campaigns, and of wrongdoers dismissing genuine evidence as machine-generated. So far, however, most examples have been proofs of concept, outside of an underground market for fake celebrity pornography and revenge pornography.
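At their core, GANs pit two neural networks against each other: a generator learns to produce samples that a discriminator cannot distinguish from real data. The minimal PyTorch sketch below illustrates that adversarial loop on toy one-dimensional data standing in for audio; the network sizes, learning rates, and Gaussian target are illustrative assumptions, not any particular deepfake tool.

```python
# A minimal sketch of adversarial training, assuming PyTorch. Toy 1-D
# samples stand in for audio or video; all hyperparameters are illustrative.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n):
    # "Real" data: a Gaussian distribution the generator must learn to imitate.
    return torch.randn(n, 1) * 1.5 + 4.0

generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator step: label real samples 1 and generated samples 0.
    real = real_batch(64)
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
              + loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator step: try to fool the discriminator into labeling fakes 1.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

# After training, generated samples should cluster near the real mean (4.0).
print(generator(torch.randn(5, 8)).detach().flatten())
```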
Yet the technical requirements are no longer a hurdle for anyone who wants to create deepfakes. Maor estimates that less than five minutes of sampled audio is enough to create a convincing synthesized voice, though other estimates put the necessary raw audio at two to three hours of samples; lower-quality synthesis requires far less. For many business executives, attackers can pull the necessary audio straight from the Internet.
Companies do not need special technology to defeat deepfake-fueled business process compromises. Instead, they need to add verification steps to their accounting processes, says Maor.
"If you have the right processes in place, you can weed out these issues," he says. "At the end of the day, a simple phone call to verify the request could have prevented this."