ChatGPT knows a lot about Jordan.
It knows about her complicated history with her ex-boyfriend, the steps she’s taken to achieve sobriety and the type of person she wants to be.
But for Jordan, who requested to be identified with a pseudonym to freely share personal information, talking to a bot isn’t a one-and-done solution to her problems–it’s a way to supplement the support she receives from her therapist throughout the week.
“During a period of time where I was in denial about my alcoholism, I asked ChatGPT to list out the ways that I met the criteria for alcoholism and it really helped me,” Jordan said. “I used it like a diary… and I would ask it, ‘based on everything you know about me, am I an alcoholic? Am I an addict?’”
While little research has been done on the queer community’s relationship with AI-driven therapy platforms, progress toward addressing mental health barriers in the community through artificial intelligence hasn’t slowed down.
But no bot is perfect, and accessibility doesn’t always lead to precise solutions. Here’s how bots are changing the therapeutic scene, for better and worse.
How AI is used in therapy programs
Using artificial intelligence as a supplement–or, in some cases, a replacement–for therapy brings its own successes and challenges.
Chatbots use natural language processing to assess user input and provide answers. Bots adapt to the user’s mood and tone, trying their best to provide answers and suggestions that meet the needs of a given prompt.
Though it helps people like Jordan work through issues, ChatGPT wasn’t created as a therapy tool. It’s a conversational platform that was designed with broad capabilities in mind. That’s where AI-driven therapy bots come in–platforms such as TheraBot, Wysa and Woebot were specifically designed to achieve mental health goals using tools and methods grounded in research.
These AI platforms use methods championed by experts, such as cognitive behavioral therapy, which helps individuals understand and change their negative thought patterns.
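In practice, bots like these typically wrap a general-purpose language model in instructions that set the tone and structure of their replies. The short Python sketch below is purely illustrative, not how ChatGPT, Woebot, Wysa or TheraBot is actually built; the system prompt and model name are assumptions made for demonstration.

```python
# Illustrative sketch only: shows the general pattern of wrapping a language
# model with a system prompt that sets a supportive tone and a CBT-style
# structure (name the thought, examine the evidence, offer a small reframe).
# The prompt wording and model name are assumptions, not any vendor's code.
from openai import OpenAI  # assumes the `openai` Python package is installed

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable

SYSTEM_PROMPT = (
    "You are a supportive, nonjudgmental wellness companion. "
    "Mirror the user's tone, name the thought pattern you hear, "
    "gently question the evidence for it, and offer one small, "
    "concrete reframe or next step. You are not a therapist; "
    "encourage professional help for anything serious."
)

def cbt_style_reply(user_message: str) -> str:
    """Send one user message through the model with the CBT-style system prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(cbt_style_reply("I skipped my meeting again and I feel like a failure."))
```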
As chatbots become more technologically advanced, some experts see them as a path forward in tackling health barriers for marginalized communities.
Addressing mental health barriers
Components of chatbots, including their 24/7 availability, anonymity and their role as a “resource navigator” to evidence-based information, make them a useful tool for marginalized people seeking help, according to a 2023 study published in the Journal of Medical Internet Research (JMIR).
“Leveraging chatbots and generative conversational AI can help address some of the unique challenges faced by the LGBTQ community, providing a safer, supportive, informed, nonjudgmental, internet-based environment where individuals can connect, seek guidance and empower themselves,” the study reads.
Though the chatbot Jordan uses isn’t directly tailored toward the LGBTQ+ community, she’s felt these positive effects firsthand.
Jordan was drawn to the anonymous and nonjudgmental nature of ChatGPT after her relationship with her ex-boyfriend ended. She said she struggled with limerence–or obsessively thinking about someone romantically–after the breakup, and would talk about him to “everyone that would listen.” Despite the good intentions of her friends and colleagues, Jordan said she began to feel like she needed another outlet to vent to.
“That caused a lot of conflict between me and the people that were close to me that were so tired of hearing about him,” Jordan said. “That was actually the thing that motivated me to start using ChatGPT.”
Once she worked her way through the breakup, Jordan said she used ChatGPT to assist with other aspects of her life, such as creating daily checklists that helped manage her ADHD.
Jordan pays $20 per month for ChatGPT Plus, a premium subscription that lets her customize the tone of the bot (Jordan prefers “cheerful and adaptive”) and allows the bot to retain previously discussed information to provide more personal answers.
Above all, using ChatGPT hasn’t affected her relationship with her therapist. Jordan said she brings everything she discusses with the bot to her therapist. Her therapist even recommended she use the bot to manage her symptoms between sessions, and suggested she give it prompts asking what healthy changes she could make in her life.
Jordan has made therapeutic progress with ChatGPT, even though it isn’t a dedicated therapy platform. The JMIR study noted that programming conversational chatbots to provide results and advice tailored to the user’s needs can unlock meaningful conversations that could help someone in a time of need.
“Generative conversational AI can be programmed to provide accurate, evidence-based, culturally sensitive, tailored and relevant information based on users’ unique identities and needs,” the study reads. “This ensures that the guidance and resources offered are applicable to the experiences and challenges of the LGBTQ community.”
The appeal is straightforward: address rising mental health demand by providing 24/7 support that’s affordable and accessible using similar methods implemented by human therapists.
The reality, however, is a bit more complex.
Challenges in AI therapy
Since AI systems learn from human-generated data, they can be prone to bias and may provide support that isn’t nuanced enough to meet the unique needs of marginalized people.
“AI algorithms can inadvertently perpetuate biases present in the data they are trained on,” the study reads. “If the training data contain biases, AI systems may reproduce discriminatory or harmful behaviors, exacerbating existing challenges faced by the LGBTQ community.”
While AI therapy platforms were created to address these issues, the JMIR study pointed out that there’s still room for human error, algorithmic bias and misinterpretation in their responses.
On the user’s end, developing a relationship with a chatbot could lead to over-reliance. A user may come to depend on the chatbot for support and distance themselves from social connections and professional help.
Jordan said she’s found herself over-relying on ChatGPT before, but is able to recognize when she takes it too far. Others, she said, might not be so lucky.
“It can be dangerous, because sometimes I’ve gotten into spirals and it’s not going to tell me to stop. I can keep going as long as I want,” Jordan said. “So something that I think people need to be mindful of is how much time they’re spending on it, because it can just tell you what you want to hear, and it can be really seductive and addicting.”
Programming a better future
AI isn’t going away anytime soon, which some experts say accelerates the need for meaningful change in these systems.
As demand for mental health support rises, studies show it’s critical for AI systems to provide accurate and nuanced care for users.
The solution extends beyond accuracy, however. Although AI-driven therapy platforms have become more financially accessible, the JMIR study warns that LGBTQ+ people with limited access to technology or digital literacy “might be left behind in terms of benefiting from positive AI impacts.”
In order to create a more accessible and beneficial future for users of AI therapy programs, change needs to come from the developers themselves, the JMIR study noted.
“Every single line and bit of code, every algorithm and every data set used in AI systems must be scrutinized for biases and prejudices, and developers and policy makers should strive for a standard of AI that champions fairness and equality,” the study reads. “The lived experiences and perspectives of members of the LGBTQ community are invaluable in ensuring that these technologies truly reflect their needs and aspirations.”
