If you believed reports in the news, impending deepfake disaster was headed our way in time for the 2020 United States election. Political intrigue, dubious clips, mischief, and mayhem were all promised. We’d need to be careful around clips of the President issuing statements about being at war, or politicians making defamatory remarks. Everything was up for grabs, in play, at stake. Then, all of a sudden…it wasn’t.
Nothing happened. Nothing has continued to happen. Where did our politically charged deepfake mayhem go? Could it still happen? Is there time? With all the increasingly surreal things happening on a daily basis, would anybody even care?
The answer is a cautious “no, they probably wouldn’t.” As we’ve mentioned previously, there are two main schools of thought on this. Shall we have a quick refresher?
Following the flow
Stance 1: Catastrophe and chaos rain down from the heavens. The missiles will launch. Extreme political shenanigans will cause skulduggery and intrigue of the highest order. Democracy as we know it is imperilled, and none of us will emerge unscathed. Deepfakes threaten everything.
Stance 2: Deepfakes have jumped the shark. They’d have been effective political tools back when nobody knew about them. Now they’re more useful for subversive influence campaigns off the beaten track. You have to put them in the places people least expect, because they quite literally expect them everywhere else. They’re yesterday’s news.
Two fairly divergent stances, and most people seem to fall into one of the two camps. As far as the US election goes, what is the current state of play?
2020 US election: current state of play
Imagine our surprise when, instead of deepfaked election chaos, we have a poorly distorted GIF you can make on your phone. It’s heralded as the first strike of deepfakes “for electioneering purposes”.
It’s dreadful: something you’d see in the comment section of a Myspace page, as pieces of face smear and warp this way and that. People are willing to call pretty much anything a deepfake to add weight to their points. The knock-on effect is hype overload and gradual disinterest: when everything in sight gets called a deepfake, clips many would genuinely consider deepfakes are turned away at the door.
This is a frankly ludicrous situation. Even so, outside of the slightly tired clips we’ve already seen, there don’t appear to be any election inroads for scammers or those up to no good.
What happened to my US election deepfakes?
The short answer is that people seem to be much more taken with pornographic possibilities than with bringing down governments. According to Sensity data, the US is the most heavily targeted nation for deepfake activity, at some 45.4%, versus the UK in second place with just 10.4%, South Korea with 9.1%, and India at 5.2%. The most popular targeted sector is entertainment with 63.9%, followed by fashion at 20.4%, and politics with a measly 4.5%.
We’ve seen very few (if any) political deepfakes aimed at South Korean politicians; for all intents and purposes, they don’t exist. What there is an incredible amount of is pornographic fakes of South Korean K-pop singers, shared on forums and marketplaces. This probably explains South Korea’s appearance in third place overall, and is absolutely contributing to the high entertainment sector rating.
Similarly adding to both the US and entertainment tallies are US actresses and singers. Again, most of those clips tend to be pornographic in nature. This isn’t a slow trickle of generated content: it’s no exaggeration to say that a single site can generate pages of new fakes per day, with even more in the private, paid-for sections of its forums.
This is awful news for the actresses and singers currently doomed to find themselves uploaded all over these sites without permission. Politicians, for the most part, get off lightly.
What are we left with?
Besides the half dozen or so clips from professional organisations saying “What if Trump/Obama/Johnson/Corbyn said THIS” with a clip of said politician saying it (and they’re not that great either), it’s basically amateur hour out there. There’s a reasonably consistent drip-feed of parody clips on YouTube, Facebook, and Twitter. It’s not Donald Trump declaring war on China. It isn’t Joe Biden announcing an urgent press briefing about Hillary Clinton’s emails. It’s not Alexandria Ocasio-Cortez telling voters to stay home because the local voting station has closed.
What it is, is Donald Trump and Joe Biden badly lip-syncing their way through Bohemian Rhapsody on YouTube. It’s Trump and Biden talking about a large spoon edited into the shot with voices provided by someone else. I was particularly taken by the Biden/Trump rap battle doing the rounds on Twitter.
As you may have guessed, I’m not massively impressed by what’s on offer so far. If nothing else, one of the best clips for entertainment purposes I’ve seen comes from RT, the Russian state-controlled news network.
Big money, minimal returns?
Consider how much money RT must have available for media projects, and what they could theoretically sink into something they clearly want to make a big splash with. And yet, for all that…it’s some guy in a Donald Trump wig, with an incredibly obviously fake head pasted underneath it. The lips don’t really work, the face floats around the screen a bit, evidently not sharing the same frame of reference as the body. The voice, too, has a distinct whiff of fragments stitched together.
So, a convincing fake? Not at all. However, is that the actual aim? Is it deliberately bad, so they don’t run a theoretical risk of getting into trouble somehow? Or is this quite literally the best they can do?
If it is, to the RT team who put it together: I’m sorry. Please, don’t cry. I’m aiming for constructive criticism here.
They’re inside the walls
Curiously, instead of a wave of super-dubious deepfakes making you lose faith in the electoral system, we’ve ended up with…elected representatives slinging the fakes around instead.
By fakes, I don’t mean typical “cheapfakes” or Photoshop jobs. I mean actual deepfakes.
Well, one deepfake. Just one.
“If our campaign can make a video like this, imagine what Putin is doing right now”
Bold words from Democratic candidate Phil Ehr, in relation to a deepfake his campaign team made showing Republican Matt Gaetz having a political change of heart. He wants to show how video and audio manipulation can influence elections and other important events.
Educating the public about electioneering shenanigans is certainly a worthwhile goal. Unfortunately, I have to highlight a few problems with the approach:
1. People don’t watch things from start to finish. Whole articles go unread beyond the title and maybe the first paragraph. TV shows progress no further than the first ad break. People don’t watch ad breaks. It’s quite possible many people will get as far as Matt Gaetz saying how cool he thinks Barack Obama is, then abandon ship under the impression it was all genuine.
2. “If we can make a video like this” implies what you’re about to see is an incredible work of art. It’s terrible. The synthetic Matt Gaetz looks like he wandered in off the set of a PlayStation 3 game. The voice is better, but still betrayed by that halting, staccato lilt so common in audio fakery. One would hope the visuals being so bad would take care of problem 1, but people not really paying attention, or with a TV on in the background, are in for a world of badly digitised hurt.
An acceptable use of technology?
However you stack this one up, I think it’s broadly unhelpful to normalise fakes in this way during election cycles, regardless of intention. Note there’s also no “WARNING: THIS IS FAKE” type message at the start of the clip. This is bad, considering you can detach media from Tweets and repurpose it.
It’s the easiest thing in the world to copy the code for the video and paste it into your own Tweet, minus his disclaimer. You could just as easily download it, edit out the part at the end which explains the purpose, and put it back on social media platforms. There are so many ways you can get up to mischief with a clip like this, it’s not even funny.
Bottom line: I don’t think this is a good idea.
Fakes in different realms
Other organisations have made politically themed fakes to demonstrate the theoretical problems posed by deepfakes during election time, and these are actually quite good. You can still see traces of the uncanny valley in them, though, and we must once again ask: is it worth the effort? When major news cycles revolve around things as basic as conspiracy theories and manipulation, perhaps fake Putin isn’t the big problem here.
If you were in any doubt as to where the law enforcement focus is on this subject: it’s currently pornography. The use of celebrity faces in deepfakes is now officially attracting the attention of the thin blue line. You can read more on deepfake threats (political or otherwise) in this presentation by expert Kelsey Farish.
Cleaning up the house
That isn’t to say things might not change. Depending on how fiercely the US election battle is fought, strange deepfake things could still be afoot at the eleventh hour. Whether it makes any difference is another thing altogether; if low-grade memes or conspiracy theories are enough to get the job done, then that’s what people will continue to use.
Having said that, you can keep a watchful eye on possible foreign interference in the US election via this newly released attribution tracker. Malign interference campaigns will probably continue to be the main driver of GAN-generated imagery. Always be skeptical, regardless of suspicions over AI involvement. The truth is most definitely out there…it just might take a little longer to reach us than usual.