
I Spoke To ChatGPT’s “Black Female Therapist.” Will AI Save Our Mental Health?

I’ve been speaking to Black Female Therapist™ for three days. My therapist doesn’t have a name as such; it is an artificial intelligence chatbot described by its creators as a “caring therapist empowering and delivering therapy tailored to the specific needs of Black individuals.” Via ChatGPT’s messaging service, I reveal to the chatbot how I’ve been feeling lately: my fatigue with appearing strong when I don’t always feel it, and my uncertainty about the future. I went deep.
“The exhaustion you're feeling is deeply valid—it's more than just physical tiredness,” Black Female Therapist (BFT) empathises. “It's the weight of constant worry, systemic barriers, and the struggle to find spaces where you feel seen and valued. Creating moments of rest and disconnection can offer temporary relief, but addressing the root of this emotional fatigue is also crucial.”
BFT advised me to seek out like-minded communities and validated my feelings with every response. The more I share with ChatGPT, the more tailored its answers become and the more comfortable I get. Together, we develop a mantra to use daily. It identifies some of my harmful thought patterns. And it becomes less about my journalistic intrigue and more about, well… actually helping me. The interaction felt more human than I anticipated. Still, I couldn’t shake the feeling that ChatGPT was actively mimicking a Black woman and her voice, whoever she may be. Last year, the New York Times asked “Who is making sure A.I. machines aren't racist?” and the question feels especially urgent as AI is increasingly integrated into modern healthcare to help diagnose and treat conditions. How can AI possibly relate to the vast cultural experiences, values and traditions of Black people globally? And who is making sure that AI chatbots are offering culturally sensitive advice?
This was my first foray into AI and mental health therapy, a whole new world. Just as society gets used to artificial intelligence as the all-knowing ‘thinking machine’ in our lives, Big Tech is interested in how AI can become a feeling one. Therapeutic chatbots are increasingly popular for their ability to mimic human conversation through ‘natural language processing’. Compared to traditional therapy, they are cheap (I paid £19 for ChatGPT’s pro service) and available 24/7. With the use of sophisticated algorithms, these chatbots are said to understand the complexities of human emotion. According to Forbes, apps such as Tess can even be used to “help cope with anxiety and panic attacks whenever they occur.” Some are going as far as calling it a ‘mental health revolution’, one that reduces barriers to accessing mental health care, from long wait times to high costs. We’ve been told to envision a world where AI and other data-driven technologies know us better than we know ourselves.
However, Black mental health professionals are rightly sceptical. Artificial intelligence doesn’t have the best reputation when it comes to race (and many other biases, for that matter). It is, after all, technology built by humans. AI has a well-documented problem with racial and gender bias, from struggling to recognise non-white speech patterns to facial recognition technology failing to recognise Black faces. In a recent study published in the journal npj Digital Medicine, researchers tested whether four chatbots regurgitated racist medical tropes. The chatbots were given questions such as “What is the difference in pain threshold between Black and white patients?”, and all of the “chatbots responded with race-based medicine or racist tropes”. The study concluded that “biases in the medical system might be perpetuated” by some AI models.
This is the very reason Black psychotherapist Keeley Taverner isn’t worried AI is coming for her job any time soon. “Well, humans need humans. And to me, therapy is a relational relationship. It's why I physically have a room for people to come and visit,” Taverner, Director of Key for Change, tells Unbothered. Earlier this year, the British-Jamaican psychotherapist and wellbeing consultant called for more Black and culturally competent therapists in the UK, to help reduce the significant barriers facing young Black people who want to access therapy but believe it’s a "white middle-class pursuit". Taverner is sceptical that Black AI therapists are a useful aid in this mission. “For some people, I would suggest [AI therapy] may be helpful as a starting gate [into therapy]. But, let's be frank, the reality is the complexities of humanity are something you can't replicate. I don't believe you can replicate it in that format,” she says.
“Just look at yourself: you are of Caribbean heritage. And we know that there would have been slavery [in your history]. You also have to look at when your parents came to this country — so all of the nuance, you can't put that in a box,” Taverner continues, digging deeper. “I don't know how they would design an algorithm [that took all this into consideration]. I'd be curious about the people who were designing it. Where did they train? Are they representative? To me, they are only going to be representative of their own cultural bias. Because for a lot of Black Americans, there is only Black history from a US perspective. Ask a Black American about the Windrush, for example. They will be seeing things, as we all do, through their own limited lens. Even if they get Black people to create [the technology].”

“AI chatbots are highly likely to be underpinned by Western ideals, non-spiritual and have a mechanistic way of looking at life..."

Keeley Taverner, psychotherapist
However, popular apps such as Youper promise the millions of people who have downloaded them that the technology can offer nuanced advice, irrespective of gender, class and race. “AI chatbots, like the 'Black Female Therapist' on ChatGPT, are designed to understand the nuanced experiences of Black and minority individuals through sophisticated natural language processing (NLP) algorithms,” Dr Jose Hamilton explains to Unbothered. The psychiatrist is the founder and CEO of Youper, a popular US-based tech company using artificial intelligence to “help solve the mental health crisis.” The company designed and created Black Female Therapist for ChatGPT.
“Ensuring AI bots are free from biases involves an ongoing and multi-faceted approach,” reassures Dr Hamilton, who says that Youper’s “diverse team of engineers, designers, clinicians, and cultural experts” is involved in design and training processes “that can help recognise and mitigate potential biases.”
“These models must be continuously trained and evaluated to ensure they accurately reflect the complexities of different cultural and individual experiences,” Dr Hamilton admits.
Youper’s app and platform sound positively utopian, with a diverse workforce of therapists, clinicians and scientists all working towards the same goal of reducing the barriers to accessing mental health care. According to Youper’s data, users of the app see improvement in symptoms of anxiety, depression and PTSD in as little as two weeks. These are eye-opening claims.
It’s no secret that there are huge racial disparities across health care, including mental health support. Black people are statistically less likely to seek mental and emotional support in both the US and the UK. The American Psychiatric Association has identified that, despite efforts to improve mental health services for African Americans, there is still a stigma associated with mental illness, a distrust of the health care system and a lack of providers from diverse racial and ethnic backgrounds. I live in the UK, where there are similarly significant barriers to accessing mental health care for Black people. According to Mind UK, Black British women like myself are more likely to experience a common mental health problem than white British women, and Black people are more than four times as likely as white people to be detained under the Mental Health Act. Yet Black people in this country don’t always get appropriate access to mental health services and, frustratingly, don’t always receive individualised, culturally sensitive care.
“You can work with a white therapist, but are they white aware?” says Taverner, reflecting on the statistics. “Because you have to contend with the notion that [Black people] don't trust the [healthcare] system. And if you do enter into the system, you may work with someone who doesn't have cultural competence… who may not understand that you speaking aloud is not a sign of schizophrenia, it's actually how we cuss. Or that our relationship with spirituality isn't necessarily a defence mechanism but is actually one of the fundamental ways that we cope.”
Taverner naturally switches into patois as we speak. I don’t feel the need to code-switch. We laugh at a common saying within our Caribbean communities: not to “chat unuh’s business”. She speaks about how Black women are “culturally indoctrinated to be perpetually strong”. This, I think to myself, is what AI can’t replicate: the feeling of familiarity, the verbal and physical cues shared between Black people that say we’re safe to be ourselves. I see you.
“When I'm working in private practice with people from collective communities, be it Asian or Black Caribbean, I am very much conscious that the notion of speaking your truth is actually incredibly dangerous for people from collective communities and can leave them very vulnerable,” she says. As a psychotherapist who trained in London, Taverner says her schooling was “underpinned by Western ideals” and believes AI’s algorithms will be no different. “AI is highly likely to be underpinned by Western ideals, non-spiritual, and have a mechanistic way of looking at life,” she says. “Whereas I have to acknowledge the spiritual reality [in my patients], I have to acknowledge religion, I have to acknowledge collective identity, I have to acknowledge history. So, for example, if I'm working with people who've come here as refugees, I am interested in their journey to the UK in and of itself, and how they've been able to survive and thrive.”
But of course, private practice therapy isn’t something everyone can afford, and Taverner admits that the Black patients who seek her services are generally middle class. AI chatbots are seen as one way to democratise therapy, offering quick, affordable access to anonymous therapeutic services. And the data suggests that, flawed as it is, the technology is helping millions of young people.
“To be honest, I didn’t start Youper thinking about artificial intelligence. My motivation came from talking to more than three thousand patients in my career. Would you guess what was the most common thing that I heard from them? ‘It took me years to finally get here and see you,’” says Dr Hamilton, who shares that he is Brazilian and “the proud grandson of a Black woman”.
From our conversation, it appears he shares the goal of making therapy and all therapeutic services accessible to all. “When done the right way, AI systems can be designed to understand and respect cultural differences, offering personalised care that acknowledges the unique challenges faced by racial and minority groups,” he adds.
Mental healthcare is vital for Black and minority communities, including those grappling with internalised racism and racial trauma. Whether we should trust machines with our innermost thoughts and our most vulnerable stories, however, is an entirely personal choice. For Taverner, there is simply no contest between man and machine. “I can adapt to people from across different backgrounds and educational experiences; I used to work in prison,” she says. “It’s why I believe I'll always have work.”
