
There’s Not Much We Can Legally Do About Sexual Assault In The Metaverse

Photo: Getty Images.
Content warning: This piece contains references to sexual assault and harassment.
Last December, Nina Jane Patel was sexually assaulted after several men surrounded her, touching and groping her body. She repeatedly asked them to stop and tried to move away, but they followed her, continuing their verbal assault and sexual advances. “They were laughing, they were aggressive, they were relentless,” she tells Refinery29. “It all happened so fast.” Before she could even think to put up a barrier, “I froze. It was surreal. It was a nightmare.”
After the assault, she didn’t report it to the police, and no suit was filed. She clearly saw her assailants’ faces, but she wouldn’t be able to identify them in a lineup. That’s because Patel’s assault didn’t occur in the physical world; it happened in the metaverse. Patel, a UK-based researcher whose doctorate looks at the psychological and physiological impact of the metaverse, was logged onto Horizon Venues, a VR app that allows users to build and interact with different worlds, when her avatar, which she designed to look like her, was assaulted. Her online attackers were controlled by men somewhere else in the real world (she could tell because the avatars had male voices). But the physical distance between Patel and her assailants didn’t matter. “They touched and groped my avatar while they took selfie-photos,” she says. Even as she started to remove herself from the situation by taking off her Oculus Quest 2 headset, she could still hear them. “[They were saying] ‘don’t pretend you didn’t love it,’ [and] ‘this is why you came here.’”
Patel’s encounter left her shaken. “It took me a few days to process the event in its entirety,” she says. While her immediate inclination was to try to shrug it off as “weirdos on the internet” (a minimizing defense mechanism so many women have used after being harassed online), this was different from someone typing out “bitch” or ranting over comms about women in gaming. “I was immersed,” she says. “I had a sense of presence within the room. As my avatar, and I, entered this room and my avatar [was attacked], I was attacked.”
Patel is not the only person who has experienced this kind of severe harassment online. Several other people who logged onto virtual reality for a fun, interactive gaming experience have reported being verbally and physically assaulted by other players, and there are also reports of racist and homophobic slurs being used. These incidents, and the responses from the games and platforms involved, highlight the difficulties of monitoring and prosecuting sexual assault in an increasingly online world, and how the line between reality and digital space is blurring in potentially dangerous ways. With the rapid growth of the VR space, that could mean more people will be exposed to these kinds of harms.

This was not a video game, and I certainly did not choose this.

Nina Jane Patel
Online harassment is far from new, and the same goes for virtual sexual assault. In 1993, The Village Voice described one of the first reported accounts of a virtual rape, in LambdaMOO, when a player under the name Mr. Bungle used a subprogram that let them describe sexual acts and force other players’ characters to perform them. A 2021 Pew Research Center study found that 41% of Americans have experienced some kind of online harassment, be it physical threats, sexual harassment, or name-calling, to name a few. (While the percentage is the same as in 2017, research indicates the intensity is greater.) The same study found that the share of women who reported being sexually harassed online has doubled since 2017, jumping from 8% to 16%. What’s still unclear, however, is how many people have been sexually assaulted online, which makes Patel’s and others’ experiences distinct, since virtual reality is designed to give users the feeling of real, lived experiences.
“What is different about virtual reality is what we call immersion and presence,” says Dr. Brenda Wiederhold, a cyberpsychologist and co-founder of the Virtual Reality Medical Center in La Jolla, California. Wiederhold uses VR immersion to help clients overcome real-life trauma and fears, from aerophobia to arachnophobia. When people are put into a virtual world, “it becomes their world,” she says, meaning they fill in the pieces based on their own mind and experience. The immersion is a two-way experience, Wiederhold adds. “If we're inhabiting an avatar, it becomes who we are for that moment. I could be an 80-year-old man, and if I'm in an avatar that’s a 20-year-old woman's body, part of my brain [psychologically] becomes that person.” So if someone is sexually assaulted in virtual reality, the trauma can be carried into the real world as well.

If you've had this happen to you in the metaverse, it doesn't end when you take off the headset.

Dr. Brenda Wiederhold, cyberpsychologist
There’s a sensory aspect too. In May, a researcher from the nonprofit organization SumOfUs reported being sexually assaulted while playing Meta’s Horizon Worlds. In the study documenting the assault, the researcher said she was lured into a room by several male avatars, who prompted her to turn off her personal boundary setting, a control that prevents avatars from touching you and is designed to help users avoid unwanted direct interactions in the metaverse. Once there, she was touched without her consent while they made lewd comments and passed around a bottle of alcohol, in an interaction that was filmed. When her avatar was touched in-game, she felt her handheld controllers vibrate, a feature meant to enhance the user experience by drawing players into the world and making them feel more embedded in it. (The researcher declined to comment for this story.)
“If you've had this happen to you in the metaverse, it doesn't end when you take off the headset,” Wiederhold says. In fact, according to Wiederhold, people who experience virtual sexual assault will most likely experience an increase in heart rate, a fight-or-flight response, just like they would if they’d encountered abuse IRL. “When you take that headset off, it's not like you can forget that. It's part of your memory now, and so you have to keep processing that.” While it may affect people differently, whether psychologically, physically, or even socially, some may develop anxiety, have panic attacks, or experience depression, Wiederhold says.
Many factors make the world of virtual reality vulnerable to this kind of behavior and treatment, and, as the SumOfUs study noted, it isn’t limited to Meta’s games. Players have reported sexual harassment, verbal abuse, and racial slurs in other VR games, like Rec Room, VRChat, and AltspaceVR. Chief among the reasons are gaming’s toxic culture and long history of harassment, and the protection and emboldening that come with internet anonymity. But the big issue is moderation. Or, more precisely, the lack of it. “It’s been pretty milquetoast, honestly,” says Rewan Al-Haddad, a campaign director at SumOfUs.
Meta is aware of the potential harms that can occur on its platforms. Last December, several advocacy groups, along with Meta investors, co-filed a motion demanding that Meta, formerly Facebook, publish a report examining any potential civil and human rights harms users could face in the metaverse. At a May shareholder meeting, the issue came up again when a proposal asked Meta to commission a third-party assessment of “potential psychological and civil and human rights harms to users that may be caused by the use and abuse of the platform” and “whether harms can be mitigated or avoided, or are unavoidable risks inherent in the technology.” The proposal was voted down.
In a May 18 blog post, Nick Clegg, Meta’s president of global affairs, wrote that rules and safety features for the metaverse will be different and more proactive than those currently in place on social media platforms. But, in the same post, he continued: “In the physical world, as well as the internet, people shout and swear and do all kinds of unpleasant things that aren't prohibited by law, and they harass and attack people in ways that are. The metaverse will be no different. People who want to misuse technologies will always find ways to do it.” In a statement to Refinery29, a Meta spokesperson said: “We want everyone to feel safe on our platforms and we don’t allow this behavior in Horizon Worlds. Personal Boundary is on by default at almost four feet for non-friends to make it easier to avoid unwanted interactions, and we don’t recommend turning off safety features for people you do not know. We want everyone using our products to have a good experience and easily find the tools that can help in situations like these, so we can investigate and take action.”
But for Al-Haddad and Patel, that response is just not enough. Keeping users safe is Meta’s responsibility, even more so when VR games like Horizon Worlds are rated for ages 18 and above. Age requirements are an incredibly low barrier to access, and the game is chock-full of kids, some as young as 7, who can find themselves playing alongside older gamers.

We are in an entirely new world, and the law has not caught up.

Carrie Goldberg, victims’ rights attorney
What needs to happen, Al-Haddad says, is government regulation around content moderation. In April, the European Union passed the Digital Services Act, one of the first pieces of legislation to address illegal and harmful content; it requires tech companies to monitor their platforms and rapidly take down hateful content or face a fine of up to 6% of global revenue (for Meta, based on 2021 figures, that could have amounted to as much as $7 billion). Currently, tech companies in the United States face no similar regulation or accountability, and, according to Carrie Goldberg of the victims’ rights law firm C.A. Goldberg, which specializes in online abuse, they are “practically immune” to legal liability, because the laws that would make them liable simply don’t exist. Whatever legal statutes are in place are sorely outdated. “[The Communications Decency Act was] created nearly 30 years ago to give burgeoning internet companies some protections against legal liability for content users post on their platforms,” Goldberg says. This is territory with no precedent and no laws in place to explicitly and specifically protect users against digital or virtual sexual assault. “Needless to say, we are in an entirely new world, and the law has not caught up.”
That extends to the accountability of individual users too. According to Goldberg, as our criminal justice system stands now, there are no specific laws around avatars. “Forcible touching and other sexual assault crimes would generally require some physical component, so we don’t expect law enforcement would, at this point, pursue a crime against an avatar,” she says. While it is possible that someone who experienced sexual violence in VR and wanted to sue for emotional distress could ask Meta to de-anonymize an avatar, Goldberg says it has never been done. She also questions whether a jury would see this conduct as extreme and outrageous enough to have caused severe emotional distress to the point where someone should be charged and convicted.
With the fervor and growing mainstream excitement about the metaverse, sexual assault is one of many problems that need to be addressed. For Patel, the possibility of this happening to someone else is what motivated her to publicly share her experience instead of shrugging it off and forgetting it happened. “The more I thought about it, I realized that this experience could have happened to anyone, it could have happened to my daughter,” she says.
Since revealing her virtual sexual assault, Patel says she has received death and rape threats, this time offline, as well as threats against her daughters, illustrating the real-world impact this kind of online abuse can bring. It’s a reality that Patel, like every gamer who logs on, didn't sign up for. “This was not a video game, and I certainly did not choose this.”
If you have experienced sexual violence and are in need of crisis support, please call the RAINN Sexual Assault Hotline at 1-800-656-HOPE (4673). 

