
Deepfake porn is ruining women’s lives. Now the law may finally ban it.

After years of activists fighting to protect victims of image-based sexual violence, deepfakes are finally forcing lawmakers to pay attention.

Illustration by Franziska Barczyk
February 12, 2021

Helen Mort couldn't believe what she was hearing. There were naked photos of her plastered on a porn site, an acquaintance told her. But never in her life had she taken or shared intimate photos. Surely there must be some mistake? When she finally mustered up the courage to look, she felt frightened and humiliated.

Mort, a poet and broadcaster in Sheffield, UK, was the victim of a fake pornography campaign. What shocked her most was that the images were based on photos, dated between 2017 and 2019, that had been taken from her private social media accounts, including a Facebook profile she’d deleted.

The perpetrator had uploaded these non-intimate images—holiday and pregnancy photos and even pictures of her as a teenager—and encouraged other users to edit her face into violent pornographic photos. While some were shoddily Photoshopped, others were chillingly realistic. When she began researching what had happened, she learned a new term: deepfakes, referring to media generated or manipulated by AI.

Helen Mort (courtesy photo)

“It really makes you feel powerless, like you’re being put in your place,” she says. “Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”

The revelations would lead her on a frustrating quest for recourse. She called the police, but the officer said there was nothing they could do. She considered getting off the web entirely, but it’s crucial for her work.

She also had no idea who would have done this. She was terrified that it was someone she considered close. She began to doubt everyone, but most painfully, she began to doubt her ex-husband. They’re good friends, but the abuser had used his first name as a pseudonym. “It’s not him—absolutely not. But it’s really sad,” she says. “The fact that I was even thinking that was a sign of how you start doubting your whole reality.”

While deepfakes have received enormous attention for their potential political dangers, the vast majority of them are used to target women. Sensity AI, a research company that has tracked online deepfake videos since December 2018, has consistently found that between 90% and 95% of them are nonconsensual porn. About 90% of that targets women. “This is a violence-against-women issue,” says Adam Dodge, the founder of EndTAB, a nonprofit that educates people about technology-enabled abuse.

In its consequences, this type of violation can be as devastating as revenge porn—real intimate photos released without consent—which takes a well-documented toll on victims. In some cases, they’ve had to change their names. In others, they’ve had to remove themselves from the internet entirely. They live in constant fear of being retraumatized, because at any moment the images could resurface and once again ruin their lives.

Fortunately, parallel movements in the US and UK are gaining momentum to ban nonconsensual deepfake porn. The attention could also help ban other forms of image-based sexual violence, which have previously been neglected. After years of activists’ efforts to alert lawmakers to these egregious legal gaps, deepfakes are finally forcing them to pay attention.

“We’re just waiting for a big wave”

Deepfakes started with pornography. In December 2017, Samantha Cole, a reporter at Motherboard, discovered that a Reddit user with the screen name “deepfakes” was using techniques developed and open-sourced by AI researchers to swap female celebrities’ faces into porn videos. Cole tried to warn readers: other women would be next.

While the issue gained some public attention, it was mostly for the technology’s novelty. After all, fake celebrity porn had been around the internet for years. But for advocates who work closely with domestic violence victims, the development was immediate cause for alarm. “What a perfect tool for somebody seeking to exert power and control over a victim,” says Dodge.

It has become far too easy to make deepfake nudes of any woman. Apps built for this express purpose have emerged repeatedly, only to be quickly banned: DeepNude in 2019, for example, and a Telegram bot in 2020. The underlying code for “stripping” the clothes off photos of women continues to circulate in open-source repositories.

As a result, the scope of the abuse has grown: now targets are not just celebrities and Instagram influencers but private individuals, says Giorgio Patrini, Sensity’s CEO and chief scientist. In the case of the Telegram bot, Sensity found there had been at least 100,000 victims, including underage girls.

"What a perfect tool for somebody seeking to exert power and control over a victim."

Adam Dodge

Advocates also worry about popular deepfake apps that are made for seemingly harmless purposes like face-swapping. “It’s not a big leap of the imagination to go from ‘I can put my face onto a star’s face in a clip from a film’ to ‘I can put somebody else’s face on something pornographic,’” says Sophie Mortimer, who manages the UK nonprofit Revenge Porn Helpline.

In the context of the pandemic, this trend is even more worrying. Mortimer says the helpline’s caseload has nearly doubled since the start of lockdown. Existing abusive relationships have worsened, and digital abuse has seen an uptick as people have grown increasingly isolated and spent more time online.

While she’s come across only a few cases of Photoshopped revenge porn, she knows it’s just a matter of time before their deepfake equivalents arrive. “People have had more time to learn how to use some of this technology,” she says. “It’s like we’re holding our breath, and we’re just waiting for a big wave to crash.”

“80% have no idea what a deepfake is”

Today there are few legal options for victims of nonconsensual deepfake porn. In the US, 46 states have some form of ban on revenge porn, but only Virginia’s and California’s cover faked and deepfaked media. In the UK, revenge porn is banned, but the law doesn’t encompass anything that’s been faked. Beyond that, no country bans fake nonconsensual porn at a national level, says Karolina Mania, a legal scholar who has written about the issue.

This leaves only a smattering of existing civil and criminal laws that may apply in very specific situations. If a victim’s face is pulled from a copyrighted photo, it’s possible to use IP law. And if the victim can prove the perpetrator’s intent to harm, it’s possible to use harassment law. But gathering such evidence is often impossible, says Mania, leaving no legal remedies for the vast majority of cases.

This was true for Mort. The abuser, who hadn’t created the pornographic images personally and didn’t use Mort’s real name, had walked a careful line to avoid any actions deemed illegal under UK harassment law. The posts had also stopped a year before she learned about them. “Anything that might have made it possible to say this was targeted harassment meant to humiliate me, they just about avoided,” she says.

There are myriad reasons why such abuses fall through the cracks of existing law. For one, deepfakes are still not a well-known technology. Dodge regularly runs training sessions for judges, mental-health professionals, law enforcement officials, educators, and anyone else who might encounter and support victims of nonconsensual porn. “Regardless of the audience,” he says, “I would say 80% have no idea what a deepfake is.”

For another, few victims have come forward, owing to the shame and harassment that can follow. Mort has already been trolled since sharing her experience publicly. “Speaking about this stuff opens the door for more abuse,” she says. “Also, every time you do it, you have to relive the thing over again.”

"Every time you do it, you have to relive the thing over again."

Helen Mort

Noelle Martin, who became an activist after discovering at 18 that she’d been victimized in a fake porn campaign, was subsequently targeted with a more elaborate deepfake porn campaign. And the fact that faked and deepfake porn is inherently false does little to quiet the victim blaming.

This makes it challenging for politicians to understand the scope of the issue. Charlotte Laws, a longtime advocate who successfully pushed for legislation to ban revenge porn in California (the second state to do so), says victims’ stories are crucial to generating political will. When revenge porn was considered a non-issue, she’d bring files “two inches thick” with cases of victims, including her own teenage daughter, who’d suffered tangible harm to their careers and personal lives. When another teenager, Audrie Pott, killed herself in Northern California after nude pictures of her were posted without her consent, California legislators finally mobilized, setting off a wave of state laws across the country. “Those stories need to come out, because that’s what touches people,” Laws says. “That’s what makes people act.”

The technology is difficult to regulate, however, in part because there are many legitimate uses of deepfakes in entertainment, satire, and whistleblower protection. Already, previous deepfake bills introduced in the US Congress have received significant pushback for being too broad.

“It’s about reclaiming power”

Here’s the good news: the tide seems to be turning. The UK Law Commission, an academic body that reviews laws and recommends reforms when needed, is currently scrutinizing those related to online abuse. It plans to publish draft recommendations within the next few weeks for public consultation. Activists are hopeful this will finally expand the ban on revenge porn to include all forms of faked intimate images and videos. “I think it’s been a really thorough exercise,” says Mortimer, who has been consulting with the commission to share victims’ stories anonymously. “I’m cautiously optimistic.”

If the UK moves forward with the ban, it would become the first country to do so, greasing the wheels for the US to follow suit. The US and UK often mirror each other because they have a similar common law structure, says Mania. And if the US takes action, then the EU will likely do so too.

Of course, there will still be major hurdles. A key difference between the US and UK is the First Amendment: one of the biggest obstacles to passing a federal revenge porn ban is that it’s been perceived to infringe on freedom of speech, says Rebecca Delfino, a law professor at Loyola Marymount University. Charlotte Laws echoes this assessment. She has worked with members of the US Congress three times to introduce a bill banning revenge porn, but each effort petered out amid First Amendment concerns.

But deepfakes also represent an interesting legislative opportunity because lawmakers are so concerned about the technology’s capacity to interfere with elections. In 2019, Representative Yvette Clarke introduced the Deepfakes Accountability Act with this in mind. She bundled together punishments for election interference and recourse for individuals who suffer personal harms, like nonconsensual porn. The bill stalled, but she says she’s preparing to reintroduce a revised version within a few weeks. “The rapid adoption of technology, the use of social media, during this pandemic, makes the conditions ripe for actually passing some meaningful deepfake legislation,” she says.

Vice President Kamala Harris has also long been a champion of a federal ban on revenge porn, which could mobilize further support. “We’re in a new Congress,” Clarke says. “There are members in the Congress, both on the Senate and House side, who recognize what this threat is to our way of life, and how it has already been used to abuse women.”

As for Mort, she says seeing this momentum has made coming forward worth it. She’s now speaking with her local member of Parliament, sharing her experience, and helping map out what can be done. “I’m feeling part of a movement. That’s really important to me,” she says.

A few days after posting a petition on Change.org, she also posted a new video. She recited a poem she’d written, born from her trauma. It was cathartic, she says, to turn this ugliness into art: “It’s about reclaiming power.”
