Dear Taylor Swift, we’re sorry about those explicit deepfakes
You have a platform and the power to convince lawmakers across the board that rules to combat these sorts of deepfakes are a necessity.
This story originally appeared in The Algorithm, our weekly newsletter on AI.
Hi, Taylor.
I can only imagine how you must be feeling after sexually explicit deepfake videos of you went viral on X. Disgusted. Distressed, perhaps. Humiliated, even.
I’m really sorry this is happening to you. Nobody deserves to have their image exploited like that. But if you aren’t already, I’m asking you to be furious.
Furious that this is happening to you and so many other women and marginalized people around the world. Furious that our current laws are woefully inadequate to protect us from violations like this. Furious that men (because let’s face it, it’s mostly men doing this) can violate us in such an intimate way and walk away unscathed and unidentified. Furious that the companies that enable this material to be created and shared widely face no consequences either, and can profit off such a horrendous use of their technology.
Deepfake porn has been around for years, but its latest incarnation is the worst yet. Generative AI has made it ridiculously easy and cheap to create realistic deepfakes. And nearly all deepfakes are made for porn. A single image plucked off social media is enough to generate something passable. Anyone who has ever posted a photo online, or had one published of them, is a sitting duck.
First, the bad news. At the moment, we have no good ways to fight this. I just published a story looking at three ways we can combat nonconsensual deepfake porn, which include watermarks and data-poisoning tools. But the reality is that there is no neat technical fix for this problem. The fixes we do have are still experimental and haven’t been adopted widely by the tech sector, which limits their power.
The tech sector has thus far been unwilling or unmotivated to make changes that would prevent such material from being created with their tools or shared on their platforms. That is why we need regulation.
People with power, like yourself, can fight with money and lawyers. But low-income women, women of color, women fleeing abusive partners, women journalists, and even children are all seeing their likeness stolen and pornified, with no way to seek justice or support. Any one of your fans could be hurt by this development.
The good news is that because this happened to you, politicians in the US are listening. You have a rare opportunity, and momentum, to push through real, actionable change.
I know you fight for what is right and aren’t afraid to speak up when you see injustice. There will be intense lobbying against any rules that would affect tech companies. But you have a platform and the power to convince lawmakers across the board that rules to combat these sorts of deepfakes are a necessity. Tech companies and politicians need to know that the days of dithering are over. The people creating these deepfakes need to be held accountable.
You once caused an actual earthquake. Winning the fight against nonconsensual deepfakes would have an even more earth-shaking impact.