Until recently, Tonya Meisenbach couldn't bring herself to smile. In 2019, she suffered third- and fourth-degree burns to the upper 35% of her body in an accident that affected her face. "For a long time, when I smiled, I couldn't stand the way my face looked," she says.
About two years into recovery, her caretaker confronted her. She told Meisenbach to look hard at herself in the mirror — to think about how far she'd come since the fire. "My face had healed so much better than I could’ve ever imagined and it did so without all of the surgeries that had been planned," Meisenbach remembers. "To me it was miraculous."
This prompted Meisenbach to finally post on social media about what she'd been through. Most of her followers had no idea what had happened; she decided to answer their questions by making a video.
The clip didn't just reach people she knew. It blew up. She started getting messages from burn survivors all over the world. "At that point, I'd never met another burn survivor other than in the waiting room of my doctor's office," she says. From her home in Alpharetta, GA, Meisenbach started posting videos of herself putting on makeup or giving encouraging pep talks on YouTube, Instagram and TikTok. She was becoming an influencer, posting under the handle @burnedbeauty2018.
She didn't just gain a community; she also landed deals with big brands like L'Oréal.
She still didn't smile in most of her posts. "I was doing the Vogue: straight-faced, kind of pouty. But eventually people started commenting on videos and were like, 'We love it when you smile!'" She took this to heart. She started with shy smirks and slowly progressed to full-on beaming. "It wasn't like smiling to appease people," she says, "but just smiling because I wasn't afraid anymore of being judged."
Still, as Meisenbach's platform grew, she also attracted trolls who were unkind about the way she looked. They would report her content, which she believed led to some of it being taken down. But when videos were removed from TikTok, it wasn't just because of real people reporting her, she tells Refinery29. TikTok's AI was at work too, she believes — some of her posts would be taken down by the app almost immediately after she put them up.
Meisenbach says this has happened on multiple occasions, dating back at least to February 2022, and Refinery29 has spoken to others with facial differences, as well as activists, who say it's a recurring issue.
Just a few weeks ago, another one of Meisenbach's videos was taken down. A TikTok notification reviewed by Refinery29 listed the post’s "violation reasons." It said she'd posted "violent and graphic content." It went on to say: "We do not allow content that is excessively gruesome or shocking, especially that promotes or glorifies abject violence or suffering." For Meisenbach, this language stung.
Her post — which can also be viewed here on Instagram — features a few selfies in which Meisenbach's burns are visible and a video of her dancing to Beyoncé's "Break My Soul." The most offensive thing about the video is that it was removed.
This isn't just a TikTok problem. Other people in the burn survivor community — and folks with facial differences — have also had their images flagged on other social media platforms and even dating apps, according to burn survivors interviewed by Refinery29 and the Phoenix Society for Burn Survivors, a nonprofit that advocates for and supports the community. Often, their content is tagged as "violent," "self-harm," "sensitive" or "graphic."
The problem is rooted in a combination of human bias and flawed artificial intelligence (AI), says Phyllida Swift, the president of Face Equality International, an organization dedicated to fighting for equality for those with facial differences. A lot of it has to do with what AI is learning (from human programmers) about what a person "should" look like.
"We've been aware of people having their photos and accounts removed or blurred, dating back about five years now," says Swift, referencing a case in which Facebook and Instagram censored a photo series called Behind The Scars. "As we've begun to understand more about AI, we've learned these aren't just random robots within the algorithm. They're built by humans."
Many tech platforms use AI facial and image recognition systems trained on thousands, if not millions, of images — and that Rolodex of visuals may include few or no depictions of people with facial differences. A model trained to moderate for "self-harm" or "violence" learns that damaged-looking skin signals a violation; without counterexamples showing healed scars on living, thriving people, it has no way to tell a survivor's face from a fresh injury, and so it learns to flag scars.
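To see how that failure mode works, here's a deliberately tiny sketch in Python, using invented one-dimensional "scores" in place of real image features. This is not any platform's actual pipeline; it only illustrates how a classifier trained without examples of healed scars has nothing to separate them from graphic injuries:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy one-dimensional "visible skin damage" scores stand in for the
# high-dimensional image embeddings real moderation models use.
X_train = np.array([[0.85], [0.90], [0.80],   # graphic injury photos
                    [0.10], [0.15], [0.20]])  # unmarked faces
y_train = np.array([1, 1, 1, 0, 0, 0])        # 1 = "violent", 0 = "benign"

clf = LogisticRegression().fit(X_train, y_train)

# A burn survivor's healed scars score high on the same visual signal,
# and no training example ever taught the model that "healed" is benign:
healed_scar = np.array([[0.80]])
print(clf.predict(healed_scar))        # [1]: flagged as "violent"
print(clf.predict_proba(healed_scar))  # well above a coin flip
```

The fix advocates are asking for is, in effect, more rows of training data: images of healed scars and facial differences labeled as exactly what they are, ordinary human faces.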
Swift says burn survivors and others with facial differences typically aren’t being accounted for when facial and image recognition systems are trained. The datasets are too narrow. This is one of many problematic kinks in our facial recognition systems; studies have already found that many algorithms have biases that disproportionately impact women and minorities. Now, activists like Swift are trying to call attention to the fact that the technology is failing people with facial differences, too. "They're essentially being told that their face isn't a human face," Swift says. "It's literally dehumanizing."
Meanwhile, most platforms we spoke with, including TikTok, still have human moderators involved in the process alongside the AI. At TikTok, for instance, if the technology concludes with a high level of certainty that a post violates the platform's guidelines, the app moderates the post immediately. If the technology is less certain, it routes the post to a human moderator, who views it and decides whether it can stay, a spokesperson told Refinery29. Even then, the content of those with facial differences is sometimes censored.
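On paper, that flow looks something like the sketch below: a minimal illustration of the two-tier routing the spokesperson described. The thresholds and the score are invented for illustration; TikTok hasn't published its actual values or its model.

```python
# Assumed cutoffs; TikTok describes "a high level of certainty"
# but hasn't published numbers, so these are placeholders.
AUTO_REMOVE_THRESHOLD = 0.95
HUMAN_REVIEW_THRESHOLD = 0.50

def route_post(violation_score: float) -> str:
    """Route a post based on a model's violation score between 0 and 1."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return "remove_immediately"      # no human sees it first
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return "queue_for_human_review"  # a moderator makes the final call
    return "leave_up"

# A healed scar the model mistakes for a fresh injury can clear the
# auto-remove bar, which would explain near-instant takedowns.
print(route_post(0.97))  # -> remove_immediately
```

The catch is the top branch: a post that clears the assumed auto-remove bar never reaches a person, which would square with Meisenbach's experience of videos vanishing almost immediately after she posted them.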
"We would hope that a human moderator would be able to distinguish between harmful content and someone simply proudly displaying their differences, but on several occasions, reporting systems have also failed to pick up on these types of harms," Swift says.
"Any bias AI has initially stems from human bias," says Kathleen Bogart, PhD, an associate professor of psychology and director of the Disability and Social Interaction Lab at Oregon State University.
These preconceptions are deeply rooted enough that many people — including most programmers training AI — may not even know they have them. "Look at characters such as Kylo Ren in Star Wars, the Joker in The Dark Knight and Freddy Krueger in A Nightmare on Elm Street," says Amber Wilcox, a burn survivor and volunteer at Phoenix Society for Burn Survivors. "As individuals, we’re shown by the media from a very young age that people with scars are 'evil' — look at Scar in The Lion King.”
Meanwhile, our artificial intelligence is often being set up and trained by privileged, non-disabled men in the tech sector, says Susan Scott-Parker, a global disability advocate and the founder of Disability Ethical AI (although stats vary slightly, most surveys suggest about 90% of software developers are male). "The people who purchase and commission this technology don't see disability as a part of their world," Scott-Parker adds. The effects are evident: even when TikTok has taken disability into consideration, it hasn't always achieved inclusivity. In 2019, the digital rights news site Netzpolitik discovered that TikTok was actively preventing videos from disabled users and those with facial differences from going viral. TikTok admitted the practice was not the right way to combat bullying and has since changed the policy.
Meisenbach says her recent experience on TikTok severely impacted her mental health and she considered leaving social media for good. "But then I realized, what I was doing before was sitting on the couch and hiding. I started creating to help myself and help other people, so I wasn’t going to be bullied out of it," she says. "Not by trolls and not by TikTok."
A TikTok spokesperson told Refinery29: "We are proud that TikTok is a home for people to share their stories, experiences, and recovery journeys. We are always investing in our features and processes, and welcome feedback from our community on how we can improve their experience."
But TikTok has also taken down content posted by others with facial differences, including Joshua Dixon. The 23-year-old Chicago native says the app has taken down his content and told him one of his videos was "violent and graphic."
Dixon's experiences go beyond TikTok. The dating apps Bumble and Tinder have also removed photos in which his facial differences were visible. "Bumble took down so many of my headshots," he says. "I took some really great photos of me living my best life on my birthday in Vegas and all of those photos got taken down except ones of me wearing sunglasses. And it's hard because you want to have a variety of photos on a dating app." Meanwhile, Tinder removed five of his photos last summer. He says that between the removal of his photos and the general bullying he endured from other users, he eventually deleted the apps.
"I know I look different," he says. "When I was 8 years old, I lost 80% of my face and lost my eyelids and left eye [because of] a run-in with our pitbull. It's been pretty terrible dating as someone who is different. On the apps, people have the guts to say things they'd never say in person." To add tech platforms censoring your photos to the mix? "It's cruel," he says.
"By taking someone's post or profile down, it's inherently saying: 'You're wrong and you don't deserve to be here,'" says Jennifer Harris, MSW, LICSW, a psychotherapist and social worker who works with burn survivors. "You're taking away their community, and their support. I conceptualize trauma as something that happens to you that changes your identity without your permission... And this is essentially creating more trauma and loss for people who are trying to heal and cope through connection and community."
Bumble has a track record of AI initiatives that aim to help women and marginalized groups, including technology that automatically blurs unwanted photos showing nudity or explicit content. It pointed to such initiatives in response to Refinery29's request for comment. A spokesperson added: "Bumble prioritizes a safe and empowering community through a combination of automated features and a dedicated human support network. We strive to continuously improve our systems. Despite our efforts, mistakes may occur."
A spokesperson from Tinder told Refinery29 they couldn't comment on why Dixon's specific photos were removed, adding that there were many possible reasons someone's photo could be removed based on Tinder's community guidelines. However, Dixon shared examples with Refinery29 that seemed not to violate any of Tinder's posted guidelines.
The spokesperson added that Tinder uses AI that may flag photos for review, but human moderators carry out any removals. Tinder's technology works by matching the pictures in a profile against the other images in that same profile, not against other, unrelated images; this is called "facial matching," and it's distinct from typical "facial recognition" systems.
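To make that distinction concrete, here's a minimal sketch of what within-profile "facial matching" means, as opposed to searching a global gallery of faces. The embedding comparison, similarity measure and threshold are placeholder assumptions, not details Tinder has disclosed:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def photos_match(profile_embeddings: list, threshold: float = 0.8) -> bool:
    """Compare every pair of photos within ONE profile against each other.
    Nothing here searches a database of other people's faces."""
    n = len(profile_embeddings)
    for i in range(n):
        for j in range(i + 1, n):
            if cosine_similarity(profile_embeddings[i],
                                 profile_embeddings[j]) < threshold:
                return False  # two photos don't read as the same face
    return True
```

Even this narrower approach can fail people with facial differences: if the underlying embedding model was trained mostly on unmarked faces, a photo showing burns or a prosthetic may land far from the same person's sunglasses shot, and the pair reads as a mismatch.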
For the first six or so months, even when TikTok did take down Meisenbach's posts, it would review them and put them back up within minutes if she complained and argued the post should be reinstated. But in August of last year, that changed.
Meisenbach made one transformation video showing what she looked like before her burns, toggling through time to the present. She says the clip garnered hundreds of views within a few hours before it was taken down. When she disputed the takedown, the same "violent and graphic content" message came back within minutes, and Meisenbach assumed the post was gone for good. She says the episode earned her an official "account warning," which seemed to suppress the number of views her videos received. She also worried it meant she could be deplatformed entirely. "That's not ideal for an influencer," she said at the end of February.
After Refinery29 reached out to TikTok for comment, the post was reinstated. A TikTok spokesperson confirmed it was put back up in response to Refinery29's request, after the company determined the removal was over-moderation. The spokesperson also said a violation was removed from her account. But days after the post was reinstated, Meisenbach's account again came under fire, and she received an account update saying: "Due to multiple violations of our Community Guidelines, you're suspended from editing your profile for 24 hours."
Meanwhile, last June, Dixon spoke on TikTok for the first time about the fact that he has prosthetic ears. "I made a video about how they work and how they're connected to my head so others in a similar situation might know they're an option," he says. "Within five hours or so, the videos were taken off TikTok and Instagram for 'inappropriate content.' I fought it. I was like, 'This isn't a bad video. It's about raising awareness.' But [TikTok] wouldn't put it back up." A TikTok spokesperson says Dixon's video was put back after an appeal, though he disputes this.
A spokesperson for Meta (which owns Instagram) told Refinery29: "By listening to people with lived experiences and subject matter experts we hope to proactively advance the responsible design and operation of AI systems. We are consistently seeking to improve where automated systems are used by creating more diverse datasets and engage [sic] with industry experts. To find, review, and take action on content that may go against our Community Standards, we use technology and human reviewers." As of 2021, Meta no longer uses its facial recognition system. This month, it also launched an initiative to help AI researchers make their datasets more diverse.
Last year, the UN published a report about the impact of AI on people with disabilities and differences, highlighting ways the technology can both help and hurt. To prevent discrimination, the report suggests countries set rules for AI tools that use facial recognition technology. But for now, Swift says, companies are under no legal obligation to improve their systems.
In the meantime, Swift says she has tried to reach out to social media platforms on behalf of folks like Meisenbach and Dixon. "Some platforms have been quite cagey about what the problem was, and about how many human moderators were aware this is a problem — and about what they were doing to address this," she says. "We've tried to offer ourselves up and work with all these platforms, but it’s been difficult to move forward and there’s been a lot of resistance… We're engaging with Meta, TikTok, and Twitter, but progress is slow."
Meisenbach is still frustrated with TikTok and sometimes runs into AI issues on other platforms, too. Most recently, Meta wouldn't authenticate her driver's license for Instagram's new paid verification process because the picture was taken before the fire. (A Meta spokesperson noted: "Meta Verified is still in early testing and we continue to gather feedback and learn about what works best… It is not our intention to prevent people who are eligible from signing up for Meta Verified for any reason.")
But Meisenbach still finds TikTok to be "the grossest place online," especially because she has never felt she could reach a human moderator, even if one is involved in the process.
"Through all this, I've honestly felt like the company was bullying me," she says of TikTok. But she adds: "I'm always going to be putting these positive messages on social media." And she's going to smile as she does so.
For his part, Dixon eventually got off the dating apps — there was too much hate. Not long after, he met someone and has been in a relationship for about five months. "I'm the happiest I've probably ever been now," he says. "I feel love is possible. It's hard for us who look different to even have hope about that. But you have to create your own hope even when you can't see it. And you have to find the people who treat you like a human."