
Leaked Facebook Documents Reveal Disturbing Inconsistencies In How The Site Handles Sensitive Issues

Violence, hate speech, and racism are all issues that Facebook has been contending with in recent years. While the social media platform has been fairly vocal about its efforts to combat these problems (including the recent announcement of its plans to hire an additional 3,000 moderators), there's been little insight into how it's fighting back. Until now.
Yesterday, The Guardian published a scathing exposé, “Facebook Files,” which includes leaked guidelines for those moderators: the people who monitor live video and written posts and decide what stays up and what gets taken down.
Instead of a consistent, thorough handbook, the guidelines are confusing and, at times, disturbing. In one slide published by The Guardian, moderators are told to remove violent posts that target protected categories, such as heads of state (for example, “someone shoot Trump”). Yet threats against women and children, such as “to snap a bitch’s neck, make sure to apply your pressure to the middle of her throat” and “let’s beat up fat kids,” are allowed to stand. The guidelines also fail to clearly explain what separates a “credible” violent threat from a non-credible one.
It’s also okay for someone to post photos of animal abuse (only “extremely disturbing” images should be “marked as disturbing”) and livestreams of self-harm, because the site “doesn’t want to censor or punish people in distress.”
The Guardian smartly sums up the guidelines in a quiz called “Ignore or delete: could you be a Facebook moderator?” The quiz asks readers which action they would take when looking at various photos, including swastikas, machine guns, and one particularly disturbing image of a furious wrestler with the words: “When you find out your daughter likes black guys.” For the majority of the images, the answer was “ignore.”
In the past year, Facebook has come under increasing scrutiny for how it handles sensitive content. Earlier this month, Mark Zuckerberg was forced to address the slew of violent videos showing up on the site, including one of a man who livestreamed himself hanging his daughter. Facebook said that over the next year it would hire additional moderators to police content, and once again emphasised the importance of fostering a safe community.
When policing sexual content and revenge porn, Facebook has an admittedly monstrous task. It's one the social network seems to address fairly well: according to the report, thousands of violating accounts are deactivated every month. Still, The Guardian reports that the phrase "Hello ladies, wanna suck my cock?" is permitted, while "How about I fuck you in the ass girl?” is not. Where is the line drawn there?
According to The Guardian, Monika Bickert, Facebook's head of Global Policy Management, said that there will always be “some grey areas” when moderating content. There are also issues of how to address free speech, which have troubled other social media platforms such as Twitter.
But the issues that Facebook seems to regard as grey areas (bullying, violence, and animal abuse among them) are ones it has previously taken strong stances on. Facebook proudly touts its partnerships with groups working directly on those issues, including anti-bullying organisations such as Bullying Intervention Experts and the Family Online Safety Institute. How can those partnerships hold up if Facebook is not clear about how it moderates bullying on its own platform?
In a statement emailed to Refinery29, Bickert said:
“Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously. Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly. In addition to investing in more people, we're also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”
The safe community that Bickert references relates back to this past January, when Zuckerberg posted an extensive manifesto. He said that in order to move Facebook forward, the social network would focus not only on “connecting friends and families” but also on developing a sense of community, and explained how Facebook would implement changes geared toward a safer, more inclusive environment. The post seemed spurred by the prevalence of fake news that plagued Facebook throughout the election cycle.
It’s easy to drink the Facebook Kool-Aid, especially when you see Zuckerberg’s well-meaning posts about his visits to juvenile justice centres and talks with recovering heroin addicts. The visits are part of his 2017 resolution to travel to states where he hasn't spent much time, to “learn about people's hopes and challenges, and how they're thinking about their work and communities.” But all of that messaging is hard to square with the internal documents uncovered by The Guardian.
Editor's note: This post has been edited and updated.