In An Era Of Fake News, Advancing Face-Swap Apps Blur More Lines

Most people familiar with "face-swapping" know it as an innocuous social media feature. An algorithm captures a person's face and pastes it onto someone else's. The result is rarely seamless and often funny.

But as it grows more sophisticated, that technology has taken a sinister turn: it has become easier to superimpose the faces of celebrities onto those of actors in pornographic films, resulting in highly realistic fake videos.

The technique, known as deepfakes, takes its name from the Redditor "deepfakes," the first person known to have created these fake porn videos. Celebrities Daisy Ridley, Gal Gadot and Taylor Swift are among its early victims.

Samantha Cole, an editor at Motherboard who first reported on the trend, tells NPR's Scott Simon that the videos are created using a machine-learning algorithm, which is trained by processing hundreds of photos of an individual's face.

"Someone takes a dataset of one person's face — and a lot of pictures of that person's face — and then a video that they want to put it on," Cole says. "And they run a machine-learning algorithm, train it on these two images, and after a few hours, gives you the result, which is, these very realistic, fake porn videos."

So, while well-photographed actors and actresses are easy targets, people with not-so-famous faces worry about where theirs might show up online as the technology quickly becomes more advanced and accessible.

That's the talk of Reddit threads right now, Cole says: "Whether this can be done with people that they know or scraped from Facebook images or Instagram. It's definitely possible, if you have enough images of someone."

In fact, a new, user-friendly tool called FakeApp democratizes the technology, allowing anyone to generate fake videos with their own datasets. Deepfakes enthusiasts have been inserting the Internet's favorite face, Nicolas Cage, into movies.

This quickly advancing technology has been outpacing the law, though. "It's all very hazy right now," Cole says. "Celebrities could sue for misappropriation of their images, like when you use a celebrity's face for an ad without their permission. But the average person has little recourse. Revenge porn laws don't include the right kind of language to cover this kind of situation."

Similar technologies have already stirred fears. Last year, journalist Nick Bilton considered the implications of adding two new manipulative mediums to the mix. He pointed to a video demonstrating technology, developed by researchers, that allowed them to manipulate the facial expressions of world leaders, including Presidents Donald Trump and Vladimir Putin.

If fabricated text-based stories can snowball into events like PizzaGate, he suggested, the consequences could play out on an international stage once bad actors latch onto this technology.

"Celebrities and porn performers are two groups of people that have lots of images of themselves publicly so they're easy targets for this, but so are politicians," Motherboard's Samantha Cole says. "It's going to be difficult trying to suss out all of this in an era of fake news."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Lawrence Wu