But the risks from deepfakes extend beyond employees receiving them. As in the previously mentioned scenario, a threat actor could create a deepfake of a corporate executive saying or doing something detrimental to the organization's success.
"Amateur deepfake videos have improved significantly in the past few years on commodity sub-$500 video-gaming GPU hardware," says Chris Clements, vice president of solutions architecture at Cerberus Sentinel. "Audio can be even more convincing."
He cautions, though, that even amateur deepfake videos require substantial training data to look realistic. Organizations whose executives frequently speak in public therefore carry a much higher risk profile in this scenario.
"If a large amount of high-quality video and audio data of an executive does exist, say from giving multiple public talks, it can be used to create convincing deepfakes," Clements says.
There are a few "tells" of deepfake videos today, he says.
"These include a noticeable lack of blinking by the subject, as well as a smearing effect around the edges of the face or hair. Shadows looking 'off' are another common shortfall of current deepfake technology," Clements explains. "These tells are going to get harder to spot as the technology and compute power of the hardware improves, however."
Many examples of "nonconsensual images" in which celebrity faces have been used in pornographic videos have already surfaced. It's no stretch to believe that corporate figures could find themselves in other sorts of nonconsensual images that could be just as damaging, if perhaps less graphic.
The ultimate solution may be the application of techniques taken from cryptography, Fernick says.
"I think if we have robust ways of authenticating that a given video stream or audio stream has come from a specific device or been uploaded at a certain time by a certain user, that may be helpful in disambiguating some of this and in offering journalists and news organizations ways of ensuring some level of quality or robustness or integrity of the content that they would be sharing with viewers," she says.
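The kind of content authentication Fernick describes can be sketched, in highly simplified form, as a signature check over the media bytes. Production systems (such as those following the C2PA content-provenance standard) use asymmetric signatures tied to device certificates; the symmetric HMAC version below, with hypothetical function names, only illustrates the core idea that any edit to the content invalidates the tag.

```python
import hashlib
import hmac
import os

def sign_clip(device_key: bytes, clip_bytes: bytes) -> bytes:
    """Return an HMAC-SHA256 tag binding the clip to the device's key."""
    return hmac.new(device_key, clip_bytes, hashlib.sha256).digest()

def verify_clip(device_key: bytes, clip_bytes: bytes, tag: bytes) -> bool:
    """Constant-time check that the clip still matches its recorded tag."""
    return hmac.compare_digest(sign_clip(device_key, clip_bytes), tag)

# Hypothetical usage: in practice the key would be provisioned per
# capture device, not generated ad hoc like this.
key = os.urandom(32)
clip = b"...raw video bytes..."
tag = sign_clip(key, clip)

print(verify_clip(key, clip, tag))           # the untouched clip verifies
print(verify_clip(key, clip + b"x", tag))    # any modification breaks the tag
```

Even a one-byte change to the clip produces a completely different tag, which is what would let a newsroom reject tampered or unattributed footage.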
And doing this sort of authentication at scale will almost certainly give rise to a new form of cloud service offering, Fernick suggests.
"If there's anything that we can learn from cryptography, we always say, 'Don't roll your own crypto,'" she says. "[Ultimately] it becomes a really complex technical question with a few layers. But I imagine that it's something that could be well-served within the open source community and perhaps by a company that scales it up."