Monday, November 25, 2024

The Chatbot Will See You Now?

AI-powered mental health apps, featuring friendly bots ready to chat about self-help mental health strategies, are gaining traction. Often referred to as “coaches,” these chatbots communicate via instant messaging, texting, and even virtual face-to-face conversation. Some apps are free, some are subscription-based, some are a combination of the two, and all are still evolving.

At this point, particularly if you’re a PCP, you may be asking, “Why am I reading this? Mental health isn’t in my wheelhouse!”

A Harvard Business Review article says otherwise. Those seeking mental healthcare are turning more than ever to emergency rooms and PCPs for treatment. “Patients with depression, for instance, see their primary care physicians more than five times on average annually, versus fewer than three times for those without depression.”1

So, you probably do – or will soon – have patients turning to you with mental health issues and perhaps asking questions about the increasing chatbot population. To keep everyone on the same page, let’s review a few of the AI-powered mental health apps that patients may ask about. 

Four With a Following  

Woebot, described as a “charming robot friend,” is a chatbot that communicates through instant messaging. The technology, developed by several psychologists from Stanford University, is programmed to replicate conversations you might have with a human therapist. For instance, Woebot might ask about your mood and thoughts, “listen” to how you feel, learn about you, and offer evidence-based cognitive behavior therapy (CBT) tools.2

Tess, developed by X2AI, is a texting buddy who declares, “I am not a person, but am a chatbot trained by experts and have a couple of cool tools up my sleeve to help you feel better.” One tool prompts Tess to text you at the moments you’re most likely to experience a panic attack and ask how you’re doing. She also provides coping mechanisms, encouraging words, and clinically proven mental health education resources.3

Ellie is a virtual human, a 3D avatar that you interact with on a monitor as if in a face-to-face conversation. Her facial, body, and voice recognition software detects nonverbal emotional and behavioral cues, which are used to screen patients but never to treat them. In fact, Ellie studies 66 points on your face. The Institute for Creative Technologies, part of the University of Southern California, originally developed Ellie to help screen veterans for PTSD. Today, she’s branching out. How far Ellie can go is yet to be determined.4
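
Wysa rounds out the group. Fronted by a friendly penguin avatar, this chatbot invites you to talk through whatever is on your mind anonymously and responds with evidence-based self-help exercises drawn from approaches such as CBT, meditation, and guided breathing.5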

A Cautionary Tale

Surveys and studies suggest that the results from AI mental health technology aren’t half bad. That said, the long-term efficacy of mental health AI has yet to be satisfactorily tested, and, more importantly, we don’t yet know whether there could be any negative effects, warns Vaile Wright, PhD, licensed psychologist and director of research and special projects at the American Psychological Association. “This technology has a lot of promise, but it’s outpacing our research. For example, one of our concerns is how a chatbot would handle someone in a crisis. If someone is suicidal and expresses this to a chatbot, what would happen? I know these apps have disclaimers, stating that the chatbot is not a therapist, but these warnings are often in the fine print. Let’s be clear – most of us don’t read the fine print. So could someone’s dependence on a chatbot ultimately cause harm? We just don’t know yet.”

Proof Positive

When AI-powered mental health apps are used properly and as intended, Dr. Wright does feel there’s positive news to report. “The best research we have right now tells us that these apps can be beneficial when used as part of an established relationship between a patient and mental health expert,” Dr. Wright explains. “Many of these apps can augment therapy by helping individuals think about goals and how to achieve them when they’re not in that one-hour session with their therapist.”

In addition, a few studies point to these apps’ benefits. Let’s circle back and consider the research behind the four examples above.

• A study of 129 volunteer Wysa users collected data from those who self-reported symptoms of depression. Based on app usage between two consecutive screening time points, two groups of users emerged: high users and low users. Sixty-eight percent of participants in the high-user group found the app experience helpful and encouraging when compared to the low-user group.5

• A 2017 clinical trial of Woebot found that participants who interacted with the app for two weeks and up to 20 sessions reported almost a 20 percent reduction in anxiety symptoms as compared to a control group of participants directed to read mental health self-help material. The study concluded: “Conversational agents appear to be a feasible, engaging, and effective way to deliver CBT.”6

• A study published in December 2018 assessed the feasibility and efficacy of using the integrative psychological AI app Tess to reduce self-identified symptoms of depression and anxiety in college students. The study included 75 participants from 15 universities across the United States. Its authors concluded that AI can serve as a cost-effective and accessible therapeutic agent, although no AI-powered mental health technology has yet been invented that can replace the role of a trained therapist.7

• In a research project with soldiers who had recently returned from Afghanistan, Ellie uncovered more evidence of PTSD than the Post-Deployment Health Assessment administered by the military. In addition, the study found that Ellie was able to identify certain nonverbal “tells” common to individuals suffering from PTSD.1

There’s no doubt that chatbots and virtual therapists are proving their worth in the AI revolution in mental health, and every healthcare provider should at least have a clear picture of what these apps are and how far they can go. At the same time, caution is advised: these are tools to assist mental health specialists, not replace them.

Resources

1) Glick, P. G. S. (2018, October 24). AI’s Potential to Diagnose and Treat Mental Illness. Harvard Business Review.

2) Woebot. (n.d.). Your charming robot friend who is here for you, 24/7.

3) X2AI. (n.d.). For individuals.

4) Carter, K. (2019, January 9). How Computer-Assisted Therapy Helps Patients and Practitioners (Pt. 1).

5) Wysa. (n.d.).

6) Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Mental Health, 4(2), e19. doi:10.2196/mental.7785

7) Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using Psychological Artificial Intelligence (Tess) to Relieve Symptoms of Depression and Anxiety: Randomized Controlled Trial. JMIR Mental Health, 5(4), e64. doi:10.2196/mental.9782
