Like Our Society, Instagram Is Biased Against Women Of Colour

Illustrated by Vero Romero.
For the fifth time in a matter of weeks, Instagram has deleted one of my posts. It’s the same each time it happens: a notification flashes across my screen, letting me know that my latest post has been removed for "violating community guidelines". What, you might reasonably wonder, have I been sharing with my 65.9k followers to warrant such censorship? 
The most recent post to be removed included a picture of me in a bikini: innocuous enough and, arguably, exactly the kind of content Instagram favours (especially if you’re a white woman or, the next best thing, a light-skinned woman of colour whose complexion might be categorised as "an expensive and light Kardashianesque tan" rather than as anything to do with race). 
However, it was not the photo that got the post deleted. It was, apparently, my caption. I had written a detailed response to a question I get asked regularly: "Why do you hate men?" I don’t hate men but I do regularly discuss sexual assault, unpack patriarchal norms and highlight the intimate emotional terrorism that women, particularly women in relationships with men, experience. I do this because I, along with other women in my family, have been raped. I do this because I have found the men I have had relationships with to be cruel and even abusive. That’s my experience, my truth. Instagram, however, has decided that this is "hate speech" towards men. I have had stories removed because they contained the words "men are trash". I have had others deleted for saying that some "men are cripplingly mediocre and should never be celebrated for doing the bare minimum." 
I can understand that saying "I hate men" could be seen as hatred of a particular group of people. What I object to, however, is the notion that we’re not allowed to comment on male behaviour at all and, worse, are censored for doing so. 
If you look at the statistics, commenting on male behaviour in this way seems completely reasonable. For example, women are more likely than men to experience rape. According to the Crime Survey for England and Wales (CSEW), 20% of women and 4% of men have experienced some type of sexual assault since the age of 16 – that’s equivalent to 3.4 million female and 631,000 male victims. Women are also more likely to experience domestic abuse. According to the Office for National Statistics (ONS), for the year ending March 2019, it is estimated that 1.6 million women and 786,000 men aged 16 to 74 experienced domestic violence. That’s seven in 100 women and four in 100 men. 
And what do the vast majority of mass shooters and terrorists have in common? They’re men. Between 1982 and 2018, 97% of mass shooters in the US were men. Meanwhile, in the UK, between 2001/02 and 2016/17, men made up 91% of terrorism-related arrests, according to Home Office statistics. The data overwhelmingly shows us that male behaviour deserves to be scrutinised.
The minute we apply the intersections of race, disability and sexuality, censorship seems to increase and it gets harder and harder to exist on Instagram. Algorithms are key to content moderation. They are supposed to inform and protect us but, sadly, we know that hasn’t always worked in recent years (see the dissemination of fake news or far-right content online). For some time, there has been a feeling that Facebook policies (Facebook, of course, owns Instagram) which are intended to "remove and reduce" problematic content are actually doing more harm to marginalised groups. Sometimes this comes in the form of censorship – like that which I have experienced – and sometimes it comes in the form of "shadow banning". This is where images are not explicitly deleted from the platform but, instead, are hidden from users by the algorithm and never shown on Instagram’s explore page. In short, the platform has begun to feel actively biased against the accounts of women and, in particular, women who aren’t white, as well as plus-size, trans and queer accounts.
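To make the mechanics concrete, here is a minimal sketch of how shadow banning can emerge from ranking rather than deletion. It is purely illustrative: every score, threshold and function name below is my own invention, and Instagram’s real ranking systems are proprietary.

```python
# A hypothetical illustration of "shadow banning". Nothing here reflects
# Instagram's actual code; the scores, thresholds and names are invented.

EXPLORE_THRESHOLD = 0.5  # assumed minimum score to surface on the explore page

def visibility_score(engagement: float, borderline_penalty: float) -> float:
    """Downrank a post marked 'borderline' instead of removing it."""
    return engagement * (1.0 - borderline_penalty)

# Two posts with identical engagement; an upstream classifier has quietly
# marked the second one "borderline".
posts = {
    "ordinary post": visibility_score(engagement=0.8, borderline_penalty=0.0),
    "flagged post": visibility_score(engagement=0.8, borderline_penalty=0.7),
}

for name, score in posts.items():
    on_explore = score >= EXPLORE_THRESHOLD
    print(f"{name}: score={score:.2f}, surfaces on explore page: {on_explore}")

# The flagged post is never deleted and its owner is never notified;
# it simply stops being shown (0.8 * 0.3 = 0.24, below the threshold).
```

Because nothing is ever deleted in this scheme, nothing triggers a notification or an appeal, which is part of why shadow banning is so hard to prove.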
Could it really be that some accounts are being silenced by conscious or unconscious bias which has been built into the framework of the digital spaces we inhabit? You don’t have to look far to find examples of this; during the summer, Instagram removed pictures of Black plus-size model Nyome Nicholas-Williams for nudity, even though no genitals were displayed. In fact, Nyome’s pictures conceal more than your average Kardashian selfie or Playboy post, yet her skin and body type are too different, too other and, quite frankly, too Black to remain on the platform. Nyome herself argues that the "algorithms on Instagram are bias against women, more so Black women and minorities, especially fat Black women. When a body is not understood I think the algorithm goes against what it’s taught as the norm, which in the media is white, slim women as the ideal."
Salty is an independent, feminist, membership-supported newsletter which elevates the voices of women, trans and non-binary contributors from all over the world. Recently, they decided to try to assess the scale of this issue by collecting data on the bias they and their followers experience on Instagram. 
There hasn't been any large-scale research into this issue, so Salty's survey of 118 people warrants attention. Many of the respondents identified as LGBTQIA+, people of colour, plus-size, sex workers or educators. Salty found that all of these Instagram users "experienced friction" with the platform in some form. This included having their content taken down, their profiles disabled and/or their advertisements or brand partnerships rejected. 
In their subsequent report, "An Investigation into Algorithmic Bias in Content Policing on Instagram", Salty quotes social media scholar Tarleton Gillespie who explains that this could be happening because "state-of-the-art detection algorithms have a difficult time discerning offensive content or behavior even when they know precisely what they are looking for…automatic detection produces too many false positives; in light of this, some platforms and third parties are pairing automatic detection with editorial oversight." 
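Gillespie’s point about false positives is easy to see with a toy example. In the sketch below, every rule, caption and function name is invented for illustration (Instagram’s actual classifiers are proprietary and far more sophisticated): a crude pattern-matcher, having no sense of context, flags a critique of male behaviour and a man’s self-deprecating post identically, and both land in the human review queue.

```python
import re

# Hypothetical blocklist; real moderation rule sets are not public.
BLOCKED_PATTERNS = ["men are trash", "i hate men"]

def auto_flag(caption: str) -> bool:
    """Crudely flag a caption if it contains any blocked pattern."""
    # Lowercase and strip punctuation so "men, are trash" still matches.
    text = re.sub(r"[^\w\s]", "", caption.lower())
    return any(pattern in text for pattern in BLOCKED_PATTERNS)

# Posts the automatic detector sends on for "editorial oversight".
review_queue = [
    caption
    for caption in (
        "men are trash",                   # critique of male behaviour
        "my own gender, men, are trash",   # a man describing his own gender
        "holiday pictures from the beach", # unrelated content, passes through
    )
    if auto_flag(caption)
]

print(review_queue)
# ['men are trash', 'my own gender, men, are trash']
```

The pattern-matcher cannot tell the two flagged captions apart; only a human reviewer with context could, which is precisely the "editorial oversight" Gillespie describes.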
Salty says that, on Instagram, "BIPOC users, body positive advocates, women and queer folx feel Instagram is targeting them for their identity." Founder and director of Salty, Claire Fitzsimmons, argues that what’s happening online is a direct reflection of the offline world we inhabit, which is still governed by structural inequality. "Our digital world has been created for and by cis, straight, white men," she says. "When they write the algorithms, they embed all their prejudices, biases, and assumptions into the programs, and now we’re all living in the digital world they created for themselves. As the algorithms change and learn from the behaviour of their users, the patriarchy festers inside them, reinforcing and amplifying the sexist, racist status quo, click by click."
I wanted to know whether white women influencers feel the same, so I spoke to activist Gina Martin, a white, slim and beautiful woman, who confirmed the disparity: none of her content had ever been removed. She said she has "always been able to put up anything" and has "never had anything taken down", with the exception of when she posted about the censorship of Nyome.
Gina has posted numerous bikini pictures and semi-nude photographs and has shown as much skin as Nyome; the difference here is that her skin is white. Gina also told me that when she talks about race on her Instagram page, she notices that her stories "get capped at 1,000 views where normally over 10,000 view them". This is shadow banning in practice. 
I also did an experiment of my own. I asked my followers to send me examples of instances where they felt they had been shadow banned. I received thousands of replies and screenshots. I even received messages from a white man who had written "my own gender, men, are trash" and then had the post censored.
Some people argue that we need a female version of Instagram – the IG equivalent of what Bumble became to Tinder, if you will. The idea that female-led tech platforms could build algorithms without bias is appealing, but it doesn’t fix the problem we have. Women, Black women and women of colour have always made their own spaces, found safe havens and conducted conversations behind closed doors. While this provides safety and respite for those women, it doesn’t alter the systems we all must live in once we emerge from behind those closed doors.
The truth is that Instagram, like our society, is biased against anyone who doesn’t look like Emily Ratajkowski or Nick Bateman. If you step outside that mould, a subtle, coercive defence is launched by Instagram’s technology: shadow banning, flagged accounts, alleged breaches of community guidelines. And there is no appeals process, no button to press, no human to email. I tried to find someone at Instagram to contact about this while writing this article and could not. It was only when my editor at Refinery29 connected me with their PR team that I was able to get a comment. I say this to emphasise that there is little to no recourse for the ordinary people affected by these policies.
The evident bias has even been subtly acknowledged by Adam Mosseri, the head of Instagram. Earlier this year, in the wake of Black Lives Matter protests, he wrote: "The irony that we’re a platform that stands for elevating Black voices, but at the same time Black people are often harassed, afraid of being 'shadow banned', and disagree with many content takedowns, is not lost on me." He pledged to do more across key areas, including algorithmic bias, stating: "Some technologies risk repeating the patterns developed by our biased societies. While we do a lot of work to help prevent subconscious bias in our products, we need to take a harder look at the underlying systems we’ve built, and where we need to do more to keep bias out of these decisions."
It’s true what they say: the devil is always in the detail. Currently Satan is lurking in the algorithms of Instagram, conducting a one-man show that portrays a whitewashed world in which the only things anyone cares about are shopping and the latest Kylie Jenner selfie. However, that is far from the reality and beneath the glossy veneer of perfect pictures are women of colour fighting to speak their truth. It’s up to Instagram to decide whether to remove its hand from the mouths of women of colour and finally let them speak.
A Facebook company spokesperson told Refinery29: “We are committed to addressing inequity on our platform, and we constantly review and update our policies. Earlier this year, we created a dedicated team to better understand and address bias in our products. We also updated our nudity policy, to help ensure all body types are treated fairly. This work will take time, but our goal is to help Instagram remain a place where people feel supported and able to express themselves.” 