
The Deepfakes Of Taylor Swift Prove Yet Again How Laws Fail Women

Content warning: This article includes references to nonconsensual image abuse. 
This week, X (formerly known as Twitter) was inundated with disturbing AI-generated images of Taylor Swift. At the time of writing, the origins of the images aren’t entirely clear, but what is clear is that the creation of deepfake pornography is psychologically damaging, a form of abuse and control directed at women. And it’s time the law caught up.
This isn’t the first time something like this has happened, either. Only a few weeks ago, Xochitl Gomez, the reigning Dancing With The Stars champ who plays America Chavez in the Marvel universe, spoke out after discovering sexually explicit deepfake images of herself on Twitter, which she tried to have taken down, to no avail. She is just 17 years old. “This has nothing to do with me. And yet it’s on here with my face,” Gomez said on a podcast about the incident. “Why is it so hard to take down? That was my whole thought on it, was, ‘Why is this allowed?’”
The rise of deepfake technology is problematic, especially for women: One report from Sensity AI found that 96% of deepfakes were sexually explicit, and 99% of those featured women. The internet has always felt particularly loaded for women, from online sexual harassment to the distribution of revenge porn, but the deepfake arena seems to have been built for one purpose: to strip women of their sexual and bodily autonomy.
While much of the deepfake discourse has focused on “fake news”, hoaxes, or cheating at school, the reality is that the unregulated use of this technology is deeply concerning for women. It is essentially a virtual assault: women now realise that simply by uploading a photo of themselves to Instagram, they might be subjected to an online attack in which their own face is the weapon.
NBC News found that nonconsensual deepfake pornographic images were among the first shown when searching for celebrity names and the term “deepfakes”. But it’s not just celebrities who are victims. There have been multiple reports of men creating AI-generated pornographic images of women they know in real life and, in some instances, boys circulating AI-generated images of their female classmates. It’s a new way to deploy gender-based and sexual violence — this time, from people’s own homes.
Most US states have banned revenge porn and its distribution, but when it comes to AI-created pornographic images, victims are largely on their own. There are currently no federal laws that tackle deepfake porn, which means bringing criminal or civil charges is extremely difficult, especially given the breadth, reach, and anonymity of the internet. Despite the psychological and emotional toll the technology takes on women, nothing actually prohibits the act, and perpetrators remain free to generate and distribute whatever deepfakes they like.
X’s policies ban the sharing of “synthetic, manipulated or out-of-context media that may deceive or confuse people and lead to harm,” but the images of Swift were viewed millions of times before they were finally removed, and some might argue that X wouldn’t have removed them at all if it weren’t for one group.
While legislation and social media platform moderation have failed women, women have also proven that the only way to protect themselves from the spread of deepfakes is to band together. In the hours after the images began circulating on X, Swifties were quick to create their own campaign, #ProtectTaylorSwift, and flood the platform with positive images of the singer. They also began a mass-reporting campaign, and only then were the images taken down. In a statement posted on its platform, X said it was “actively removing all identified images and taking appropriate actions against the accounts responsible for posting them... We’re committed to maintaining a safe and respectful environment for all users.”
The incident has sparked renewed calls for legislation banning the creation and distribution of deepfake porn, and Swift is reportedly considering legal action. But these images make one thing clear: if one of the most powerful women in the world can be subjected to sexual harassment and abuse via deepfake technology, what hope do the rest of us have? Women’s faces can be manipulated in images and superimposed into porn videos, yet as the laws stand, women have little recourse. A photo of someone flashing their nipples on social media will be taken down immediately, but sadly, AI-generated pornographic images bearing real women’s faces won’t be, until real action takes place.
If you have experienced sexual violence and are in need of crisis support, please call the RAINN Sexual Assault Hotline at 1-800-656-HOPE (4673).
