It started as a joke about a film star, but now it's mired in involuntary pornography and political machination – and it will forever change the way we consume media...
What's the problem?
In a world accustomed to Photoshopped images, the old phrase "the camera never lies" has long since lost its legitimacy with still photographs – but the same suspicion doesn't always apply to moving video. That could soon change, due to "deepfakes."
Deepfakes use machine learning – a form of artificial intelligence that improves itself from examples, usually with human guidance – to superimpose images onto existing video, usually replacing or manipulating human faces. At its most benign, this can merely be amusing: one of its most popular early applications was to artificially insert Nicolas Cage – whose character in the 1997 movie Face/Off changed his identity via a facial transplant – into famous scenes from films including Indiana Jones and Forrest Gump.
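The article does not go into the mechanics, but the technique most commonly associated with early deepfake tools is an autoencoder with one shared encoder and a separate decoder per person: the encoder learns pose and expression, each decoder learns one face, and swapping decoders at playback time swaps the face. The sketch below is a minimal, illustrative PyTorch version of that idea under those assumptions – the class names, layer sizes and training loop are hypothetical, not taken from any particular app.

```python
# Illustrative sketch of the shared-encoder / two-decoder autoencoder idea.
# Hypothetical names and layer sizes; trained on 64x64 face crops of two people.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a face crop to a compact code capturing pose and expression."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, 256),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Rebuilds one specific person's face from the shared code."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, code):
        return self.net(self.fc(code).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity
params = list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters())
optimizer = torch.optim.Adam(params, lr=5e-5)

def train_step(faces_a, faces_b):
    """Each decoder learns to reconstruct its own person from the shared code."""
    loss = nn.functional.l1_loss(decoder_a(encoder(faces_a)), faces_a) + \
           nn.functional.l1_loss(decoder_b(encoder(faces_b)), faces_b)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# The "swap": encode a frame of person A, then decode it with person B's decoder,
# producing person B's face with person A's pose and expression.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)      # stand-in for a cropped video frame
    swapped = decoder_b(encoder(frame_a))
```

In practice, tools also have to align the swapped crop and blend it back into each full video frame, which is where much of the realism comes from.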
However, the technology has also been put to much darker use, usually covertly. By late 2017, the faces of several female celebrities had been grafted into pornographic videos without their consent. By early 2018, consumer deepfake apps had appeared, and the technology was suddenly available to a mass market.
"Nothing can stop someone from cutting and pasting my image or anyone else's on to a different body and making it look as eerily realistic as desired," said actress Scarlett Johansson, a frequent deepfake victim who says she is helpless to stem the tide. "Trying to protect yourself from the internet and its depravity is basically a lost cause ... The internet is a vast wormhole of darkness that eats itself."
Scarlett Johansson has suffered from deepfake porn (Credit: Evan Agostini/Invision/AP)
What's the worst that could happen?
Deepfakes have been used for "revenge porn" and blackmail – and as the technology improves, fakers can build convincing videos from nothing more than your publicly available digital footprint. One forum user wrote: "I made a pretty good vid of a girl I went to high school with using only 380 pics scraped from insta & FB," referring to Instagram and Facebook. Arwa Mahdawi has written in the UK's Guardian newspaper that deepfake pornographers are motivated by a desire "to control and humiliate women."
There have also been political applications. Again, some were obvious, if offensive – such as replacing Argentine president Mauricio Macri's face with Adolf Hitler's, or German chancellor Angela Merkel's face with that of US president Donald Trump. In April 2018, BuzzFeed CEO Jonah Peretti and actor Jordan Peele released a "Public Service Announcement" in which a digitally manipulated Barack Obama (actually voiced by Peele) appeared to deliver a string of offensive statements before warning about the dangers of deepfakes.
Not every political forgery will be as explicit – in either sense of the word. Now that it is relatively easy to put words in the mouths of leaders, from subtle misquotes to outrageous falsehoods, video footage can no longer be automatically regarded as trustworthy – yet it is also easier than ever, via the echo chambers of social media, to spread a malicious misrepresentation.
Barack Obama starred in a deepfake Public Service Announcement (Credit: AP Photo/Pablo Martinez Monsivais)
What do the experts say?
"Right now, it's not so easy that anyone can create a really well done deepfake that's going to fool a lot of people," says Shuman Ghosemajumder, Shape Security's chief technology officer and former click-fraud czar at Google. But deepfakes can still be propagated by social-media bot accounts – between October 2018 and March 2019, Facebook removed more than three billion fake accounts and estimated that it still has 120 million fake active monthly users.
Ghosemajumder thinks deepfake technology will soon be added to social media platforms "for amusement, much the same way that Snapchat filters exist." In the long run, it will be "an AI vs AI arms race," Ghosemajumder says. "Once they reach a level of sophistication that they're going to be able to fool most human eyes, the only way to detect them is also going to be machine-learning based."
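Ghosemajumder does not spell out what a machine-learning detector would look like, but one plausible reading is a classifier trained on known real and known faked face crops. The sketch below, in PyTorch/torchvision, is a minimal illustration under that assumption; the backbone, function names and input size are hypothetical, and a real detector would need far more careful data and evaluation.

```python
# Illustrative sketch: fine-tune a standard image classifier to label face crops
# as real (0) or fake (1). Model choice, names and sizes are assumptions.
import torch
import torch.nn as nn
from torchvision import models

detector = models.resnet18(weights=None)             # in practice, start from pretrained weights
detector.fc = nn.Linear(detector.fc.in_features, 2)  # two-class head: real vs. fake

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(detector.parameters(), lr=1e-4)

def train_step(face_batch, labels):
    """face_batch: (N, 3, 224, 224) face crops; labels: 0 = real, 1 = fake."""
    detector.train()
    loss = loss_fn(detector(face_batch), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def fake_probability(face_crop):
    """Return the model's estimated probability that a single face crop is fake."""
    detector.eval()
    probs = torch.softmax(detector(face_crop.unsqueeze(0)), dim=1)
    return probs[0, 1].item()
```

The arms race Ghosemajumder describes follows from this setup: as detectors of this kind improve, fakers can train against them, and vice versa.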
In the meantime, Ghosemajumder urges constant skepticism. "If a celebrity or a politician is in a video doing things that they claim not to have done, that they claim is a fake video, how do we actually verify whether or not their claim is true or if the video is true? I think that's something that we're still figuring out."