
It's Too Early To Praise Twitter For Enforcing New Anti-Abuse Rules

Photographed by Bianca Valle.
Azmina Dhrodia is a technology and human rights researcher at Amnesty International.
“After five years of online harassment coupled with offline harassment, I have basically reconciled myself to the fact that I’m prepared to die for the work that I do.”
This is what Pamela Merritt, a blogger and reproductive rights activist from the US, told me when I interviewed her as part of an investigation into online abuse against women. Pamela was remarkably sanguine as she described the barrage of rape and death threats she receives, but her story is chilling. It shows how brave women need to be to express themselves online, in a world where their opinions are so often met with violent threats.
I thought about Pamela today when Twitter announced that its new rules aimed at reducing “hateful conduct” have come into force. The company has garnered intense criticism over the years for its failure to tackle the abuse and violence that proliferates on its platform. Last month, things came to a head when criticisms by several celebrities about Twitter’s response to abuse led to the #WomenBoycottTwitter hashtag going viral.
In response, CEO Jack Dorsey promised a “more aggressive stance” on abuse and gave us these new rules. The main change is that Twitter is updating and expanding its rules to include unwanted sexual advances, intimate media, hateful imagery and display names, and violence.
However, Twitter’s rules have never really been the problem. When it comes to many types of abuse women suffer on the platform, it already has a fairly strong set of community guidelines that could be more effective if only they were properly enforced.
Over the past year, I’ve interviewed dozens of women about their experiences of online violence and abuse, and I’ve heard the same thing over and over: Women feel let down by Twitter; posts that clearly breach the company’s standards remain online; they no longer bother reporting abuse; Twitter doesn’t take online abuse seriously.
Pamela says that despite the horrendous abuse she receives, Twitter “rarely takes action” and has only ever taken one of her reports seriously.
Twitter, and other social media platforms, have undoubtedly provided women all over the world with a new space to express themselves, show solidarity, and network. But there’s a flipside: By failing to protect women from online violence and abuse, these same platforms are letting abusers silence women.
This problem is widespread. Last month Amnesty International commissioned a poll to find out about women’s experiences of online abuse and harassment. Nearly a quarter (23%) of women surveyed across eight countries said they had experienced online abuse or harassment at least once, and more than half of these women said they’d experienced stress, anxiety, or panic attacks as a result. Almost half (46%) of the women who experienced abuse or harassment said it made them fear for their physical safety. This is clearly not something that goes away when they log off.
This abuse is having a disturbing silencing effect. Around three-quarters (76%) of women who had experienced abuse or harassment on social media said it led them to make changes to the way they use the platforms. Most worryingly, around a third (32%) of women said they’d stopped posting content that expressed their opinion on certain issues.
Social media companies need to take this seriously. No woman should feel that sharing her opinion could put her in danger. Ensuring that everyone can participate freely online and without fear is vital to ensuring that the internet promotes freedom of expression equally.
While it’s great that Twitter’s new rules seem to at least in part be aimed at taking abuse against women more seriously, adding more rules is not the end of the story. Twitter needs to explain why content that is clearly in breach of its own policies on violence and abuse often remains online, even when reported.
Even with its new and expanded rules, there are still unanswered questions. We currently have no idea how many moderators the company employs to respond to reports of abuse, or how those moderators are trained to identify abuse or hateful conduct. (There have also been calls for a more diverse moderation team.) We don't know how many reports of abuse Twitter receives, nor do we have a clear picture of how it responds to them. These are all important questions: Increased transparency would give women a clearer idea of the steps they should take if they are targeted.
Twitter may get good PR for today’s development, but the true test will be whether women continue to censor themselves on the platform out of fear for their safety and wellbeing.
