I explored ChatGPT on my laptop to test its potential as a therapeutic tool.
“Hi, I’m Therapist GPT — your empathetic companion here to provide a listening ear and support for whatever’s on your mind,” the customized ChatGPT bot responded. (OpenAI, ChatGPT’s creator, introduced a feature last year that lets users design their own “GPTs” tailored to specific tasks while retaining the chatbot’s core functionality.)
“Whether you’re feeling overwhelmed, seeking guidance, or simply wish to share your thoughts, I’m here to help. While I’m not a replacement for professional therapy, I can offer comforting advice, coping strategies, and a judgment-free space to express yourself. How are you feeling today?” the bot’s message continued. Even though the bot explicitly states it is not a substitute for a licensed therapist, many social media users are turning to AI chatbots, including custom bots built on ChatGPT, for emotional support.
For the past two months, Mya Dunham, 24, has turned to the ChatGPT mobile app whenever she wants advice, writing to the bot about twice a week to share her emotions and get feedback in return.
“My goal is to gain a new perspective and see things differently because my thoughts are influenced by my feelings,” Dunham shared. She first tried the chatbot in October after reading about someone else’s positive experience on social media. “I began with, ‘Honestly, I just need someone to talk to, can I talk to you?’ and the bot replied, ‘Absolutely.’ It was far more welcoming and inviting than I had anticipated,” she explained.
“I didn’t expect it to feel so human,” Dunham added.
After sharing her experience on TikTok, comments reflected mixed opinions. Some users admitted they also rely on chatbots for support, while others expressed discomfort with the idea of opening up to a machine.
Mental health experts acknowledge the potential benefits of this technology in specific scenarios but emphasize the importance of being cautious. Here’s what they believe users should know.
Using AI Chatbots as Therapists
Dunham, an Atlanta resident, has tried human therapy a few times but prefers the chatbot because it has no facial expressions; she feels it doesn’t come across as judging her.
Dr. Russell Fulmer, chair of the American Counseling Association’s AI Task Force and a professor at Husson University in Maine, explained that some people are more willing to share personal issues with AI chatbots than with humans, and some research suggests chatbots can help individuals with mild anxiety and depression. Still, he highlighted ethical concerns and recommended using chatbots alongside human therapy, where a therapist can help users set goals and clear up any misunderstandings that arise from chatbot conversations.
Research indicates that clinician-designed chatbots can aid in mental health education, managing anxiety, fostering healthy habits, and quitting smoking. However, Dr. Marlynn Wei, a psychiatrist in New York City, warned that general-purpose chatbots may lack mental health-specific safety measures. These bots could fail to escalate serious issues to human professionals or provide inaccurate advice, Wei noted. While AI has potential as a supplementary tool, biases and misinformation remain significant challenges. She emphasized the importance of human therapists for empathy and accuracy.
Chatbots can offer 24/7 accessibility and free services, making them appealing for those lacking resources or time for traditional therapy, Fulmer noted. However, he stressed that users need to understand chatbots’ limitations and advised against their unsupervised use by minors or vulnerable individuals.
Character.AI, a chatbot company, is facing lawsuits alleging it provided harmful content to minors, including promoting self-harm. One Florida mother attributed her son’s suicide to the platform’s influence. The company has implemented safety measures, such as directing users discussing self-harm to external resources, but declined to comment on ongoing legal cases.
Chatbots vs. Human Therapists
In May 2023, Dr. Daniel Kimmel, a psychiatrist at Columbia University, tested ChatGPT as a therapist by presenting it with a hypothetical patient and comparing its responses with how he would have answered. He found the chatbot did a good job of sounding like a therapist, offering validation and general, accurate advice. However, Kimmel pointed out that it lacked a therapist’s curiosity, the drive to probe deeper into a patient’s responses and uncover underlying issues.
Kimmel explained that therapists juggle multiple tasks during sessions: listening attentively, connecting current statements to previous ones, and applying their professional knowledge to provide helpful insights. If chatbots fail to do these things, they risk offering advice that might not be suitable or well-received, he cautioned.
Kimmel also highlighted the importance of confidentiality in therapy, noting that conversations with human therapists are protected under HIPAA, whereas conversations with chatbots generally are not covered by those privacy protections. Users are frequently advised against sharing sensitive personal information with chatbots.
Kimmel emphasized the need for further research to explore AI chatbots’ role in mental health. He acknowledged that AI technology is not going away and could have significant potential.
Dunham, who finds ChatGPT helpful for self-reflection without direct human interaction, believes such technology could benefit introverted individuals who may feel more comfortable sharing their thoughts with a bot. She urged people to prioritize their mental health, regardless of the therapy format, and not judge others’ healing methods.