
Deepfake Technology Has Been Weaponised Against Women — & Taylor Swift Is The Latest Victim

Content warning: This article includes references to nonconsensual image abuse. 
X (formerly known as Twitter) has been inundated with disturbing AI-generated images of Taylor Swift. At the time of writing, the exact origin of the images is unclear, but they appear to have come from a website notorious for publishing fake nude images of celebrities without their consent.
The sexually explicit deepfakes portray Swift nude and in sexual scenarios, with some set in a football stadium, an apparent reference to her partner Travis Kelce's NFL team. "This is horrible, where did you find this so I can avoid more," one person wrote in a thread of the explicit images on X. "She should use these photos on her Instagram," another said. What these commenters fail to realise (or perhaps realise and simply don't care about) is that creating deepfake pornography of women is deeply psychologically damaging, abusive, and controlling.
It's not the first time something like this has happened, either. Only a few weeks ago, Xochitl Gomez, who plays America Chavez in the Marvel universe, spoke out after discovering sexually explicit deepfake images of herself on Twitter, which she tried and failed to have taken down. She is just 17 years old. "It wasn't because I felt like it was invading my privacy, more just like it wasn't a good look for me," Gomez said on a podcast about the incident. "This has nothing to do with me. And yet it's on here with my face."
The rise of deepfake technology is a problem for everyone, but especially for women. One report from Sensity AI found that 96% of deepfakes were sexually explicit, and 99% of those featured women. The internet has always felt particularly loaded for women, from online sexual harassment to the distribution of 'revenge' porn, but the deepfake arena seems to have been built for one purpose: to strip women of their sexual and bodily autonomy.
While much of the deepfake discourse has focused on 'fake news', hoaxes, or cheating at school, the reality is that the unregulated use of the technology is deeply concerning, especially for women. It amounts to a virtual assault, and women now know that simply uploading a photo of themselves to Instagram can expose them to an online attack in which their own face is the weapon.
In Australia, the term 'deepfake porn' is Googled an alarming 14,800 times a month, while one of the largest deepfake porn websites sees about 14 million hits a month worldwide. And it's not just celebrities who are victims. There have been multiple reports of men creating AI-generated pornographic images of women they know in real life and, in some instances, boys circulating AI-generated images of their female classmates. It's a new way to deploy gender-based and sexual violence — this time, from people's own homes.
While Australia does officially prohibit the distribution of 'revenge' porn, making it illegal to share explicit or sexual photographs or videos of a person without their consent, victims of AI-generated pornographic images are seemingly on their own. In 2023, the Albanese government briefly considered a ban on "high-risk" uses of artificial intelligence, but no legislation has followed. Perpetrators remain free to generate and distribute whatever deepfakes they like: despite the psychological and emotional toll the technology takes on women, nothing actually prohibits the act.
Similarly, while X's policies ban the sharing of "synthetic, manipulated or out-of-context media that may deceive or confuse people and lead to harm," the images were viewed millions of times before they were finally removed, and some might argue the platform wouldn't have removed them at all if it weren't for one group.
While legislation and platform moderation have failed, women have proven that the only way to protect themselves from deepfake distribution is by uniting. In the hours after the images began circulating on X, fans launched #ProtectTaylorSwift, a mass-reporting campaign, and only then were the images taken down. It wasn't the government that removed the images. It wasn't even X. It was fans sitting at home.
There is speculation that Swift may consider legal action, but the message from these images is clear: if one of the most powerful women in the world can be subjected to sexual harassment via deepfake technology, what hope do the rest of us have? Women's faces can be manipulated in images and superimposed into porn videos, yet women have little recourse. A photo of someone flashing their nipples on social media will be taken down immediately, but sadly, disgusting AI-created images bearing real women's faces won't be.
If you or anyone you know has experienced sexual or domestic violence and is in need of support, please call 1800RESPECT (1800 737 732), the National Sexual Assault Domestic Family Violence Service.
If something is going wrong online and you need help, head to esafety.gov.au for advice on what to do next and how to stay safe. Young people can also call Kids Helpline anytime on 1800 55 1800 for support.