Deepfakes Didn't Disrupt the Presidential Election, But What's Next?

(Graphic by Lee Ferran / Code and Dagger)

In the months leading up to the 2020 presidential election, there was an acute fear among some technologically minded national security experts: that a deepfake video would trick enough people to sway the election.

The specific definition of deepfakes has been evolving in recent years, but the term essentially refers to videos made with the help of machine learning that very convincingly make it appear as if someone said or did something they did not.

Perhaps the most famous example was a public service announcement from BuzzFeed in which President Barack Obama -- really comedian/producer Jordan Peele -- warned about the dangers of the technology and the need for Americans to stay vigilant about what they consume online. (In the end, Obama-not-Obama deadpans to the camera, “Thank you. Stay woke, b*****s.”)

Manipulated media is nothing new, of course, but the technology to make convincing deepfakes has become increasingly easy to obtain and use. The technology’s first, and still most urgent, danger relates to pornography and the exploitation of unwitting women and even children.

MORE: The Most Urgent Threat of Deepfakes Isn’t Politics. It’s Porn. (Vox)

But for close observers of national and election security, deepfakes presented a particular concern. Nearly anyone could create a deepfake video of a candidate they opposed doing something outrageous, and at best the candidate would be left denying what appeared to be video evidence – think of a fake version of Mitt Romney’s controversial “47 percent” remark in 2012.

“It doesn’t have to be massively complex… but it hits the crucial nerve,” Ben Nimmo, a disinformation expert, told ABC News in March 2019. “There are people out there who will want to believe it and people out there who will want to share it because it suits their political beliefs.”

So, in an election cycle saturated with mis- and disinformation, and with a president more than willing to traffic in both, the 2020 election seemed like a ripe target for some deepfake trickery.

But then… nothing really happened.

“I think the fear was present,” former CIA analyst Matthew Ferraro told Code and Dagger, “and I think to many people’s relief that didn’t happen.”

Ferraro, now an attorney at the D.C.-based firm WilmerHale who specializes in national security and cybersecurity, noted that there were minor instances of deepfake-like videos – including one retweeted by the president in which now-President-Elect Joe Biden’s tongue appeared to be wagging out of his mouth, and others clearly labeled as deepfakes, such as PSAs – but certainly nothing that came close to the worst-case fears harbored before the election.

A recent report from WIRED suggested that perhaps deepfakes weren’t necessary to spread disinformation, since plenty of it was getting around anyway – or that deepfakes still aren’t quite convincing enough to be effective.

The threat isn’t going anywhere, however, and Ferraro said the government has a lot of catching up to do when it comes to the use of deepfakes in pornography and child exploitation.

But in the national security space, he said recent legislation and developments in deepfake-identifying technology by major companies might have put the U.S. government in an unusual place: a little ahead of the problem.

Most notably, deepfake legislation was included in this year’s defense spending bill, the National Defense Authorization Act (NDAA) – the second time that’s been done.

The legislation calls for the director of national intelligence to include in reporting about foreign interference in U.S. elections any information about “foreign malign influence campaigns, including machine-manipulated media…”

Like last year’s, this year’s legislation requires the Secretary of Homeland Security to write annual reports on the state of deepfake technology.

Ferraro said this year’s version broadens the definitions and the areas in which the government will investigate – including uses of deepfakes that could disrupt businesses or violate civil rights.

Here, he said, when the bill discusses deepfakes’ ability to “harm, harass, coerce, or silence vulnerable groups or individuals,” the federal government could be making up ground on the non-consensual pornography issue – in a defense spending bill.

“Those provisions, very clearly, to me are meant to address the scourge of non-consensual deepfake pornography,” he said.

The bill also discusses a surprising flip side to the deepfake threat: deepfakes for the greater good.

“We usually don’t talk about those, but they exist, you know,” he said, suggesting deepfake audio could be used to artificially restore the voice of someone who has lost theirs. He also pointed to the time machine learning technology was used to make soccer star David Beckham “speak” nine languages in an anti-malaria campaign.

Overall, Ferraro said that while the danger of deepfakes has hardly receded – and continues to be an urgent problem in the realm of pornography, where laws are struggling to catch up – the presence of deepfake legislation in the NDAA and several other federal bills, as well as in many state legislatures, means lawmakers appear to be aware of a technologically advanced problem.

The reports required by the NDAA, for example, could become the basis for more specific, aggressive legislation.

“We often criticize the government for being ossified and for moving slowly. I think that one could look at the past 18 months of lawmaking in this space and be quite satisfied that the government is moving with some degree of alacrity to address an emerging threat,” he said. “And I do think they’re doing the right thing.”

Now, it only remains to be seen whether the NDAA will be vetoed by President Donald Trump.

