
There’s Not Much We Can Legally Do About Deepfake Porn — Yet

Content warning: This article includes references to nonconsensual image abuse. 
When streamer Stephanie Peloza hopped onto Twitter last week, she wasn’t surprised to see the news about a Twitch streamer caught up in a deepfake pornography scandal. That’s because she isn’t surprised by much on the internet anymore. “Basic respect and the basic rules we follow when we meet people in person, that stuff kind of goes out the window in these digital spaces,” Paidia’s head narrative designer and former YouTuber, who goes by the name Stef Sanjati online, tells Refinery29. Whether it’s women being verbally harassed while playing video games or viewers commenting on and criticizing streamers’ physical appearances, women always seem to be a target when they log on to spaces that haven’t historically been welcoming to them. And now, they’re having fake pornographic images of themselves created and blasted across the internet.
On January 26, Brandon Ewing (who goes by the name Atrioc online) accidentally revealed during a livestream that he had paid to subscribe to a website featuring deepfake pornography of top women streamers, including Pokimane, Maya Higa, SweetAnita, and QTCinderella, some of whom were his colleagues and friends. (While Ewing claimed in a chat comment on his apology stream that he wasn’t watching videos of any streamers he knows, according to Motherboard, his involvement initially drew far more attention to the site and spread the images further across the internet.) Ewing has since apologized in a tearful (and slightly bizarre) video and announced he’s stepping away from content creation and OFFBRAND, the content studio he cofounded with QTCinderella’s boyfriend Ludwig. [Refinery29 has reached out to Ewing for comment.] Motherboard reported that the images have been deleted by their creator, who has also since apologized. “The best course of action I have understood is to just wipe my part off the internet and help decrease the number of future videos of those involved. You will not see me pop up again,” the deepfake creator said. [OFFBRAND has not commented or released a statement on Ewing's departure; Ewing's photo is no longer on the creative studio's website.]
Deepfake porn can have a devastating, real-world impact on those affected. In response to the images, several of the streamers targeted emphasized the unnecessary and nonconsensual sexualization of their likenesses, and of women online as a whole. “This is what it looks like to feel violated,” a visibly upset QTCinderella said during a January 31 Twitch stream. “This is what it looks like to feel taken advantage of. This is what it looks like to see yourself naked against your will being spread all over the internet.” In a statement posted to her Twitter feed, Higa compared the incident to her 2018 sexual assault. “If anyone doesn’t think it’s a big deal that MY NAME is in headlines where thousands of people are commenting on the sexualization of MY BODY against MY WILL, you are the problem. This situation makes me feel disgusting, vulnerable, nauseous, and violated — and all of these feelings are far too familiar to me.” [Refinery29 has reached out to Higa, Pokimane, QTCinderella, and SweetAnita for comment. Higa referred us to her initial statement.]

“[People need to] understand that this does affect people's lives in a very direct way. [The internet is] not just an imaginary playground.”

Stephanie Peloza
If this is your first time hearing about deepfake porn, you’re probably not alone. Deepfake pornography is created when computer technology is used to map the face of one individual onto a sexually explicit video or image. (TL;DR: It can mean taking someone’s face and putting it on a digital body that isn’t theirs, creating a fake image that looks very, very real.) Since the term originated on Reddit in 2017, deepfake technology has been used for everything from the innocuous, like reimagining the OG Back to the Future with Tom Holland, to the genuinely dangerous, like spreading political misinformation. But the vast majority of its use has been to create sexually explicit, harmful images that victimize women. In 2019, AI firm Sensity found that 96% of deepfakes are pornographic and specifically target women.
For both Peloza and streamer Caroline Kwan, one of the biggest issues is people’s inability to understand where consent lies online, a problem exacerbated by the disconnect many viewers have in seeing online public figures, especially women, as real people. “Female streamers, especially the top female streamers, tend to be the biggest streamers on the platform. So people look at them in the same way that people regard deepfakes of A-list celebrities: as if they're not real people,” Kwan says. In the days following Atrioc’s stream, many (*ahem* male) streamers and viewers online minimized the impact of the deepfakes, belittling the women’s experiences, with some going so far as to make reaction videos laughing at the women as they tearfully shared what happened.
“[Viewers] almost think: Oh, well, when you're that famous, that comes with the territory, especially when you're a woman. And who cares? You're rich, you're famous, who gives a shit?” Kwan says.
“[People need to] understand that this does affect people's lives in a very direct way,” Peloza adds. “[The internet is] not just an imaginary playground. People make their livelihoods here, people socialize, they build culture, they work. … What you're doing may not be in the physical, real world, but it is still a real action that you're taking.”
And the scary thing is that as the AI technology behind deepfakes advances and becomes more accessible, deepfake porn has trickled down from A-list celebs to streamers and TikTok stars to any woman anywhere. In October 2020, further research from Sensity found that a new AI bot had created and shared fake nude images of more than 680,000 women without their knowledge, often requiring only a single photo of the person to generate the fake. “This is so concerning,” Kwan says of the evolution. “This is something that is affecting women, whether you are famous in Hollywood, or whether you are famous or not even that famous online, or if you're just somebody who is a woman existing in this world.”
Watch Caroline Ford and Stephanie Peloza chat more about the legalities surrounding deepfake porn and its impact on women. Refinery29 streams on Twitch Thursdays at 2 p.m. PT / 5 p.m. ET.

There Are Laws Against Deepfake Porn, But Not Everywhere

So, what exactly can be legally done about deepfake pornography? Unfortunately, not all that much right now.
“Part of the problem is that the technology is developing more quickly than the law can catch up to,” Caroline Ford, an Ohio-based attorney at Minc Law who specializes in defamation removal law, tells Refinery29. “We're really only now at a place where we have good laws in the books in a lot of places to address nonconsensual pornography [like revenge porn], but it took us 10 years, 20 years, 15 years to get here. … The law is just too slow. As soon as we catch up to one problem, another one emerges that we have to deal with, that we didn't think about before or didn't even contemplate when the original laws were written.”
While Ford and her colleagues, who have represented clients in revenge porn cases, haven’t yet handled a deepfake pornography case, it’s a growing area of concern that she says they’re keeping their eyes on. As it stands, only a handful of US states have laws specifically targeting deepfake pornography. In 2019, California Governor Gavin Newsom signed two bills targeting deepfakes, one of which specifically allows residents to sue anyone who uses deepfake tech to put their image in pornographic material. Virginia also has a law on the books specifically targeting deepfake pornography.
But despite some states taking steps forward, there is no federal law tackling deepfake porn, which means the ability to bring criminal or civil charges against an individual differs from state to state, and conduct that is illegal in one state may not be illegal in another.
And that just makes things incredibly complicated, especially given the breadth and reach of the internet. “While laws are delineated around state lines, the internet isn't,” says Honza Cervenka, a senior associate at McAllister Olivarius, who specializes in discrimination, harassment, and nonconsensual pornography. “If I'm in Virginia, I have access to pretty much the same content as if I'm in California or in North Dakota, never mind the Philippines, Japan, or Ireland.” The real issue, Cervenka says, is that the person creating the nonconsensual images might be out of state — or even outside the country — from those they’re targeting. “That suddenly becomes an almost impossible situation for a victim to get justice because a police force in a small town in California is going to have a hard, if not impossible, time trying to bring somebody who's in Ireland to justice over a crime such as this.”

“This is something that is affecting women, whether you are famous in Hollywood, or whether you are famous or not even that famous online, or if you're just somebody who is a woman existing in this world.”

Caroline Kwan
Both Cervenka and Ford agree that revenge porn and deepfake porn are forms of nonconsensual, image-based abuse, but the particulars of deepfakes mean cases often fall into a gray area, even in states where revenge porn is illegal. For example, to legally qualify as image-based sexual abuse, an image typically needs to depict the actual breast or genitals of the person, which often isn’t the case when it comes to deepfakes, Cervenka and Ford note. Regardless of the intent behind the deepfake, the law doesn’t treat the image as showing the victim’s body, because technically it isn’t. “If you take my face and superimpose it on somebody else's body, the totality of the image appears as if the body were mine, but it's not,” Cervenka says. “It sort of falls through the cracks of many of the laws that were written with the original revenge pornography, rather than this more sophisticated deepfake imagery, where a lot of the assumptions behind the legislation are unhelpful to victims."
Which isn’t to say that all hope is lost. In states without specific laws around deepfake pornography, there are other avenues people can pursue. Ford says those affected may be able to bring forward civil claims like misappropriation of likeness or copyright infringement (if the source photo was copyrighted). “You kind of have to get creative with those claims as it relates to deepfake pornography, but it's not impossible,” she says. “It's just that not a lot of the state laws have caught up and made it very clear cut what the cause of action is.” Despite this, both Ford and Cervenka are optimistic our laws *can* eventually catch up to the ever-evolving internet. It may just take time.

It Helps To Have People Share Their Experiences

Back online, it’s important for people affected by deepfake pornography to share their experiences and speak to the issue (if they feel safe and comfortable doing so), because it keeps this vital conversation going. “This situation with Atrioc and having this raging debate over fake porn is ultimately a good thing,” Kwan says. “Because it is drawing attention to this increasingly dangerous issue that women primarily are facing. And unfortunately, things don't happen unless people see real people being affected by it. So for women who are victims of this, sharing their stories is very powerful.”
For anyone on the other side of the screen who might be unsure about where the line of consent is when it comes to the increasingly emotionally removed world of the internet, Peloza offers a point for reflection: “If a person has a question about whether or not they should do something with another person's image, they really should consider whether they could even ask the person that question,” she says. “And if they can't, I think that's a pretty clear answer.” 
