Experts' Opinion on Using ChatGPT as a Substitute for Therapy

14 May 2023, 18:48

Artificial intelligence (AI) is currently capturing a lot of attention in various fields. With the popularity of AI-powered chatbots like ChatGPT, people are using the technology to perform tasks such as answering math questions, translating phrases, and even generating grocery lists and recipe ideas.

Some people are also employing these AI chatbots for mental health support. By asking ChatGPT questions about their mental health, individuals can receive advice, sometimes for free, without having to pay for therapy sessions or spend time searching for a therapist.

Some social media users have even claimed to have replaced their therapist with an AI chatbot like ChatGPT. However, healthcare providers specializing in mental health find this trend concerning.

Bruce Arnow, PhD, a professor and associate chair in the department of psychiatry and chief psychologist at Stanford University, warns users to be skeptical. "[AI chatbots] are not meant to be used as a substitute for therapy, psychotherapy, or any kind of psychiatric intervention," he told Health. "They're just not far enough along for that, and we don't know if they'll ever be."

Psychologists are also wary of using AI as a therapist. While AI chatbots can offer sound advice, accurately direct users to resources, and provide therapist-like responses to queries about mental health, these bots have not undergone the training and licensing required of a human therapist. As such, there are concerns about a lack of accountability and safety in the practice.

Moreover, using AI as a therapist raises privacy concerns, since sensitive information is shared online. ChatGPT, for instance, records conversations to improve the AI system, so users should consider this before deciding to share personal information with the chatbot.

Uwamahoro Williams is also concerned that advice from a chatbot could be misinterpreted by the person seeking help, which could make things worse in the long run.

All of these qualms, however, can really be traced back to one main issue, namely that AI is just that—artificial.

“I think in the future it's going to probably surpass us—even therapists—in many measurable ways. But one thing it cannot do is be a human being,” Russel Fulmer, PhD, senior associate professor at Xi’an Jiaotong-Liverpool University and incoming professor and director of counseling at Husson University, told Health. “The therapeutic relationship is a really big factor. That accounts for a lot of the positive change that we see.”

Traditional therapy allows the provider and patient to build an emotional bond, as well as clearly outline the goals of therapy, Arnow explained.

“AI does a really good job in gathering a lot of knowledge across a continuum,” Uwamahoro Williams said. “At this time, it doesn't have the capacity to know you specifically as a unique individual and what your specific, unique needs are.”

Though psychologists largely agree that using AI as a stand-in for a therapist isn’t safe, they diverge a bit on when and if the technology could ever be useful.

Arnow is skeptical that AI chatbots could ever become advanced enough to provide help on the same level as a human therapist. Fulmer and Uwamahoro Williams, however, are more comfortable with the idea of chatbots potentially being used in addition to traditional therapy.

“These platforms can be used as a supplement to the work that you’re actively doing with a professional mental health provider,” Uwamahoro Williams said.

Chatting with an AI could even be thought of as another tool to further work outside of therapy, similar to journaling or meditation apps, she added.

There are even some chatbot AIs that are being piloted specifically for mental health purposes, such as Woebot Health or Elomia. It’s possible that these could be a better option since they’re created specifically for handling mental health-related queries.

For example, Elomia says they have a safety feature where humans will step in if people need to speak to a real therapist or a hotline, and Woebot says their AI has a foundation in “clinically tested therapeutic approaches.”

Most of these programs—in addition to AI in general—are still being developed and piloted though, so it’s probably too early to compare them definitively, Fulmer said.

Online AI therapy certainly can't hold a candle to the real thing, at least for now, Fulmer and Arnow agreed. But the fact remains that mental health care is inaccessible for many people: high costs, therapists without space for new clients, and persistent stigma all dissuade people from getting the help they need.

“I guess there’s a difference between my ideals and the recognition of reality,” Fulmer said. “ChatGPT and some of these chatbots, they offer a scalable solution that’s, in many cases, relatively low-cost. And they can be a piece of the puzzle. And many people are getting some benefit from them.”

If even one person has received some benefit from treating AI as a therapist, then the possibility that it could work is at least worth considering, Fulmer added.

For now, ChatGPT may have useful applications in helping people "screen" themselves for mental health disorders, experts said. The bot could guide someone through common symptoms to help them decide whether they need professional help or a diagnosis.

AI could also help train new counselors and help psychologists learn more about which strategies are most effective, Arnow and Uwamahoro Williams said.

Years down the line as AI advances, it may have more applications in therapy, Fulmer said, but it still may not be right for everyone.

“Right now, there is no substitute for a human counselor,” Fulmer said. “[AI] synthesizes large data sets, it’s good with offering some useful information. But only a real-life therapist can get to know you and tailor their responses and their presence to you.”
