— sina rawayama (@sina_rawayama) September 20, 2020
Trying a horrible experiment...
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol)ICE” Arcieri 🦀 (@bascule) September 19, 2020
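Arcieri's setup is straightforward to reproduce: stack two portraits in one tall image with a large blank gap between them, so the preview crop is forced to surface only one face. Below is a minimal sketch of how such a composite could be assembled with Pillow; the file names, gap size, and output paths are placeholders, and posting the composites to observe Twitter's actual crop remains a manual step.

```python
from PIL import Image

# Hypothetical input files; any two portrait photos work for this kind of test.
TOP_FACE = "portrait_a.jpg"
BOTTOM_FACE = "portrait_b.jpg"
GAP = 1800  # tall blank gap so the preview crop cannot show both faces


def make_test_image(top_path, bottom_path, gap=GAP, out_path="composite.png"):
    top = Image.open(top_path).convert("RGB")
    bottom = Image.open(bottom_path).convert("RGB")

    # Normalize widths so the two portraits line up in a single column.
    width = min(top.width, bottom.width)
    top = top.resize((width, int(top.height * width / top.width)))
    bottom = bottom.resize((width, int(bottom.height * width / bottom.width)))

    # Stack: top face, large white gap, bottom face.
    canvas = Image.new("RGB", (width, top.height + gap + bottom.height), "white")
    canvas.paste(top, (0, 0))
    canvas.paste(bottom, (0, top.height + gap))
    canvas.save(out_path)
    return out_path


# A second composite with the order flipped controls for position bias.
make_test_image(TOP_FACE, BOTTOM_FACE, out_path="composite_a.png")
make_test_image(BOTTOM_FACE, TOP_FACE, out_path="composite_b.png")
```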
any guesses? pic.twitter.com/9aIZY4rSCX
— Colin Madland (@colinmadland) September 19, 2020
Geez...any guesses why @Twitter defaulted to show only the right side of the picture on mobile? pic.twitter.com/UYL7N3XG9k
— Colin Madland (@colinmadland) September 19, 2020
According to a statement from Twitter, the platform was previously unaware of the bias issue. "Our team did test for bias before shipping the model and did not find evidence of racial or gender bias in our testing," a Twitter spokesperson tells Refinery29. "But it’s clear from these examples that we’ve got more analysis to do. We'll continue to share what we learn, what actions we take, and will open source our analysis so others can review and replicate." Additionally, Twitter's chief design officer and chief technology officer have been engaging with users who are running their own tests on the platform, and an expert at Carnegie Mellon has shared his independent analysis. We'll have to wait and see what changes come of the experiments.
Based on some experiments I tried, I think @colinmadland's facial hair is affecting the model because of the contrast with his skin. I removed his facial hair and the Black man shows in the preview for me. Our team did test for racial bias before shipping the model. pic.twitter.com/Gk33NQlGgB
— Dantley 🔥✊🏾💙 (@dantley) September 19, 2020
Here's another example of what I've experimented with. It's not a scientific test as it's an isolated example, but it points to some variables that we need to look into. Both men now have the same suits and I covered their hands. We're still investigating the NN. pic.twitter.com/06BhFgDkyA
— Dantley 🔥✊🏾💙 (@dantley) September 20, 2020
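The contrast hypothesis Dantley raises can be probed crudely without access to the model: compare how much luminance varies inside each face region of a test image. The sketch below uses Pillow; the file names and crop boxes are placeholders, and standard deviation of luminance is only a rough stand-in for whatever saliency signal the cropping network actually responds to.

```python
from PIL import Image, ImageStat

# Crude check of the contrast hypothesis: high local contrast (e.g., dark
# facial hair against light skin) is one candidate signal a saliency model
# might latch onto. Paths and box coordinates are placeholders.
def face_contrast(image_path, face_box):
    face = Image.open(image_path).convert("L").crop(face_box)  # grayscale face region
    return ImageStat.Stat(face).stddev[0]  # std dev of luminance as a contrast proxy


print("face A contrast:", face_contrast("face_a.jpg", (100, 80, 400, 480)))
print("face B contrast:", face_contrast("face_b.jpg", (120, 90, 420, 500)))
```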
This is a very important question. To address it, we did analysis on our model when we shipped it, but needs continuous improvement.
Love this public, open, and rigorous test — and eager to learn from this. https://t.co/E8Y71qSLXa
— Parag Agrawal (@paraga) September 20, 2020
Results:
Face stimuli: Neutral
Categories: White-male / Black-male
White-to-Black ratio: 40:52 (92 images)
Code used: https://t.co/qkd9WpTxbK
Final annotation: https://t.co/OviLl80Eye @colinmadland @JeffDean @ChrSzegedy @AnimaAnandkumar
(Repost: last tweet had a typo) pic.twitter.com/k4x3zCZhxT
— Vinay Prabhu (@vinayprabhu) September 20, 2020
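Prabhu's tweet links to the code he actually used. Purely as an illustration of how the headline 40:52 ratio could be tallied, the sketch below counts manually recorded crop outcomes; the CSV file and its "previewed" column are assumptions for this example, not part of his pipeline.

```python
from collections import Counter
import csv

# Hypothetical results file: one row per posted composite, with a column
# "previewed" recording which face the crop surfaced ("white" or "black").
def tally(results_csv="crop_results.csv"):
    counts = Counter()
    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["previewed"].strip().lower()] += 1

    total = sum(counts.values())
    print(f"White-to-Black ratio: {counts['white']}:{counts['black']} ({total} images)")


if __name__ == "__main__":
    tally()
```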