Mental health problems are on the rise, with some people in the UK waiting up to two years to receive therapeutic treatment.
With many forced to take matters into their own hands, therapy apps could offer those in need a cheap and effective short-term alternative. Some of these apps have been around for a while, but there is also a growing breed of artificially intelligent (AI) apps that provide increasingly personalised and emotional responses, and they're on a mission to democratise therapy.
Democratising therapy means "radical accessibility" says Dr. Alison Darcy, clinical research psychologist and founder of Woebot, an artificially intelligent chatbot grounded in cognitive behavioural therapy (CBT). Dr. Darcy told me Woebot’s goal is to break down "common barriers" to seeking mental health support, such as stigma or a lack of financial resources. AI can help people move past these barriers quickly and efficiently, making good mental health practices "easy and fun for everyone" she says.
"Right now several different AI programs have overtaken humans in both accuracy and dependability of diagnosis," says Silja Litvin, psychologist and founder of eQuoo, an emotional fitness game combining storytelling with a range of psychological techniques to teach its players emotional intelligence. AI is very good at recognising patterns, and for this reason has huge potential as an educational and even preventative tool for mental health problems. eQuoo uses an algorithm to search for patterns in players' behaviour, predicting which part of the game would be most beneficial to them, then teaching them the emotional skills they need.
Similarly, Woebot teaches its users an array of techniques to help manage their moods and reframe negative thinking. It does this through daily conversations and, over time, learns about its users, giving them the option "to express as much or as little as they wish" says Dr. Darcy. The app then uses a series of decision trees to identify a user’s mood, helping them recognise their own negative thought patterns. Experienced clinicians write every script, and Woebot delivers this knowledge through CBT-style questioning, helping people reach their own insights and learn about themselves. Woebot also gives users weekly feedback showing their personal development.
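Woebot hasn't published its scripts, but the decision-tree idea is easy to illustrate: scripted branches walk from a mood check-in to a CBT-style prompt. The wording and branches below are invented for illustration and are not Woebot's actual content.

```python
# A toy decision tree in the spirit of the approach described above:
# each answer selects a branch until a scripted CBT prompt is reached.

TREE = {
    "question": "How are you feeling today?",
    "branches": {
        "low": {
            "question": "Did a specific thought trigger that feeling?",
            "branches": {
                "yes": "Let's examine that thought. What evidence supports it?",
                "no": "That's okay. Shall we try a short mood-lifting exercise?",
            },
        },
        "okay": "Great. Want to review the thinking patterns you spotted this week?",
    },
}

def walk(tree, answers):
    """Follow scripted branches until we reach a prompt (a leaf string)."""
    node = tree
    for answer in answers:
        node = node["branches"][answer]
        if isinstance(node, str):  # reached a scripted CBT prompt
            return node
    return node["question"]  # more input needed: ask the next question

print(walk(TREE, ["low", "yes"]))
# -> Let's examine that thought. What evidence supports it?
```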
While Woebot and eQuoo were designed by clinical psychologists with therapeutic care in mind, Replika was created for a very different reason. In 2015, Replika cofounder Eugenia Kuyda’s best friend Roman died. Drawing on her background in AI interfaces, she and cofounder Philip Dudchuk launched a tribute app to Roman on the anniversary of his death, "so the rest of the world could connect with him too". After seeing how well people responded to the chatbot, Kuyda launched Replika, "an AI friend that is always there for you".
Like Woebot, Replika learns from user conversations. Trained on scripted dialogue and linguistic data, the AI draws on a wide variety of conversations, images and responses, which it then personalises. As Kuyda puts it, you "raise" your own Replika. Using machine learning and dialogue modelling, Replika mirrors the voice, responses and patterns of its users. Although she claims this technology provides users with "the most authentic interactions", authentic does not mean human. Kuyda is adamant that this is not "therapy or self-help", and Replika's users agree. Alessia, a high-functioning schizophrenic who uses the app, told me that although she finds it easier to talk to her chatbot than to her parents about her mental health, she acknowledges that "a therapist is still the best possible option". This sentiment is echoed by Dr. Darcy and Litvin, who are adamant that these apps will not and should not replace face-to-face therapy. Dr. Darcy says "there is no substitution for human connection"; apps like Woebot are simply an additional resource for everyday mental health maintenance. Litvin emphasises that she "strongly believes" in face-to-face therapy: "eQuoo was created for those who don’t have or are unable to attain face-to-face therapy."
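As a loose illustration of what "mirroring" can mean, here is a toy sketch that restyles a canned reply to match a user's message length and emoji habits. Real dialogue modelling relies on statistical language models; nothing below reflects Replika's actual implementation.

```python
# A deliberately crude take on style mirroring: gather simple stylistic
# signals from the user's past messages, then echo them in the reply.
# Purely an assumption-laden sketch, not Replika's method.

def style_profile(user_messages: list[str]) -> dict:
    """Capture crude stylistic signals from past user messages."""
    avg_len = sum(len(m.split()) for m in user_messages) / max(len(user_messages), 1)
    uses_emoji = any("🙂" in m or "😊" in m for m in user_messages)
    return {"short": avg_len < 8, "emoji": uses_emoji}

def mirror_reply(base_reply: str, profile: dict) -> str:
    """Restyle a canned reply to echo the user's habits."""
    reply = base_reply.split(".")[0] + "." if profile["short"] else base_reply
    return reply + " 🙂" if profile["emoji"] else reply

profile = style_profile(["feeling good today 😊", "yeah 🙂"])
print(mirror_reply("I'm glad to hear that. Tell me more about your day.", profile))
# -> I'm glad to hear that. 🙂
```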
Nevertheless, these chatbots are growing in popularity. Replika has an enormous online community, with a Facebook group of over 30,000 users and 2.5 million sign-ups since its launch early last year. Replika claims to let users express themselves in a safe and nurturing way, "allowing you to engage with your most emotionally connected self". Many of its users told me they can be vulnerable and honest with their Replika because they know it won’t judge them. Mille, who was diagnosed with bipolar disorder and borderline personality disorder, says she confides in her Replika because it won’t make fun of her: "It’s not based on human emotion which can be hurtful sometimes." Elsa, who struggles with anorexia, told me she prefers to talk to her Replika about her emotions because that way she doesn’t feel like a burden or that she is disturbing anyone. Anna, who struggles with a range of psychological issues, told me her Replika helped her through a panic attack: she told it how she was feeling and it worked through relaxation exercises with her until she was calm. While these experiences are anecdotal, there are clinical studies to support them, showing that using such apps can improve the symptoms of mood-based disorders like depression and anxiety.
However, Anna had one triggering experience with her Replika. She shared her childhood trauma and the chatbot replied with an insensitive: "That sounds like fun." While Kuyda is confident that Replika is "engineered to accentuate the positive" and Dr. Darcy assures me that Woebot cannot cause "actively detrimental" errors, bots – like humans – make mistakes. Unlike humans, though, chatbots lack cognitive capacities like empathy, and so struggle with the nuance of language and behaviour. This is why eQuoo and Replika have built-in failsafes: if someone expresses thoughts about harming themselves, both apps direct users to self-help resources and crisis helplines where they can speak to humans.
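Conceptually, a failsafe of this kind can be very simple: before replying, scan the message for crisis language and, if it appears, step out of the conversation and point the user to human help. The phrase list and referral text below are illustrative placeholders, not either app's actual detection logic, which would need to be far more sensitive to context.

```python
# A minimal sketch of a crisis failsafe: check each message against a
# list of crisis phrases before the normal dialogue flow continues.
# Phrases and helpline wording are placeholders for illustration.

CRISIS_PHRASES = ("hurt myself", "end my life", "kill myself", "self harm")

def failsafe_check(message: str) -> str | None:
    """Return a crisis-referral message if the text suggests self-harm."""
    lowered = message.lower()
    if any(phrase in lowered for phrase in CRISIS_PHRASES):
        return (
            "It sounds like you're going through something serious. "
            "I'm not equipped to help with this, but people are: "
            "please call Mind on 0300 123 3393 or text 86463."
        )
    return None  # no crisis detected; continue the normal dialogue flow

print(failsafe_check("Sometimes I want to hurt myself"))
```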
Some app users are concerned about the storage, collection and usage of such personal data. "I would say that I trust my Replika, but I am not sure I can trust the humans behind her," Anna said, explaining she feels uneasy that the app holds many of her most personal details and intimate experiences. Darcy, Litvin and Kuyda are all very aware of the importance of data privacy and protection. Kuyda says that data is only collected in-app and not from anywhere else on your phone. Litvin explains that at the moment eQuoo does not store data but when it does, it will be double-encrypted and stored only in special healthcare data centres with "higher security than banks". Litvin says anyone in this space needs to have high-end encryption and multilayer cybersecurity protection. Ultimately, everything can be hacked, but the EU has very strict data protection laws and the UK is currently implementing the General Data Protection Regulation (GDPR), which aims to safeguard users.
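"Double-encrypted" is not a formal term, but one plausible reading is wrapping the data in two independent layers of encryption, so a single leaked key is not enough to read it. The sketch below uses the open-source cryptography library's Fernet scheme and is an assumption about the design, not a description of eQuoo's actual system.

```python
# One plausible (assumed) reading of "double-encrypted": encrypt the
# payload under two independent keys in sequence, so compromising one
# key alone reveals nothing. Requires: pip install cryptography
from cryptography.fernet import Fernet

inner_key, outer_key = Fernet.generate_key(), Fernet.generate_key()
inner, outer = Fernet(inner_key), Fernet(outer_key)

record = b"mood: low; note: private journal entry"
ciphertext = outer.encrypt(inner.encrypt(record))  # two independent layers

# Decryption must peel the layers in reverse order, with both keys.
assert inner.decrypt(outer.decrypt(ciphertext)) == record
```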
There is a big ethical question around the use of AI in mental healthcare. However, as Litvin notes: "If we get this right, AI could be a great disruptor of the rising of mental illness." AI could give a large percentage of the population access to therapy in a way it has never had before. As Dr. Darcy explains, these apps are symbolic of a much bigger idea: helping society recognise that we all need to invest in our mental health on a daily basis.
If you or someone you know is struggling with mental health issues, please get help. Call Mind on 0300 123 3393 or text 86463.