Many Are Using Chatbots for Therapy—But Is It Safe?

This is a free excerpt from one of Shortform’s Articles. We give you all the important information you need to know about current events and more.

Why are people turning to chatbots for therapy? Can chatbots be better than human therapists?

As the mental health crisis grows, many people are wondering whether they can turn to a chatbot for support. Some have decided the answer is “yes”—but it’s unclear just how safe or effective it is to rely on a chatbot instead of a therapist.

Continue reading to learn more about AI chatbot therapy.

If you’ve ever experimented with a chatbot like ChatGPT, then you’ve seen firsthand how convincingly they can approximate human conversation. Using a form of artificial intelligence (AI) called a large language model, chatbots can field questions and produce answers that sound a lot like human speech. 

So users have naturally turned to tools like ChatGPT for help coping with common mental health struggles, like anxiety and depression. Here’s a look at the discussions around chatbot therapy.

Can a Chatbot Stand in for a Therapist? 

While ChatGPT is new, the idea that chatbots might eventually stand in for therapists is actually decades old. As far back as the 1960s, scientists experimented with computer programs intended first to parody and later to simulate conversations between a therapist and a patient. New programs like ChatGPT are more effective than older types of artificial intelligence at parsing people’s questions and responding in a way that feels human.

Yet users report that over longer conversations, the exchange can feel “empty” or “superficial.” That may be because chatbots merely simulate empathy, understanding, and other human traits. In therapy, the factor that correlates most strongly with good outcomes is the strength of the relationship between therapist and patient, and critics say that chatbots can’t replicate this relationship.

Traditional Therapy Isn’t Perfect, But Are Chatbots Better?

Human therapists are fallible and biased, too. Many people worry about sharing their thoughts with another person, and some feel intimidated by the idea of seeking out mental health care. Chatbots may sidestep those fears: Research has shown that some users report developing a rapport with a chatbot and coming to trust it.

People are already turning to a variety of chatbots for mental health support. One popular option is Woebot, which interprets what users say and answers with prewritten responses based on techniques of cognitive behavioral therapy (CBT). In CBT, a therapist works to help a patient replace counterproductive thought patterns with patterns that help them respond more productively. Woebot aims to do the same but with the convenience and immediacy of an app.

Yet even those who are enthusiastic about using AI to support mental health advise against replacing therapy with a chatbot. Chaitali Sinha, head of clinical development and research at Wysa, explains that chatbots “should not be used as a replacement for traditional forms of therapy.” Similarly, Athena Robinson, chief clinical officer at Woebot, characterizes the app as a “guided self-help ally,” not a therapist.

Is Something Better Than Nothing?  

Advocates say that AI therapy offers one of the few practical answers to a vast unmet need. Many people talk to chatbots like Woebot and Happify in part because symptoms of anxiety and depression are common and access to mental health care is fragmented. The National Institute of Mental Health (NIMH) estimates that 1 in 5 Americans lives with mental illness. Yet in 2021, fewer than half (47.2%) of such people received treatment.

Some barriers to treatment are geographic: 75% of rural US counties either have no psychologists, psychiatrists, or counselors at all, or have fewer than 50 per 100,000 people. That leaves many Americans unsure of where or how to access care. Other barriers are economic: High costs and insufficient insurance coverage—or insurers’ improper denial of coverage—leave people struggling to afford care.

On top of insufficient access to care, common treatments have been criticized as inadequate, too. Research has questioned the efficacy of selective serotonin reuptake inhibitors (SSRIs), which treat depression by raising the brain’s levels of serotonin. Researchers have learned that depression is more complex than a “chemical imbalance,” and a recent review found that low serotonin levels aren’t associated with depression at all. This doesn’t mean that SSRIs don’t work: The drugs still have an edge over placebos. But this advantage is small.

It seems rational for people to seek support in handling the heavy burden of these challenges wherever they can get that support. But it remains to be seen whether chatbots, without any human emotions or experiences of their own, can really help us to carry that weight. 


Hannah Aster

