Growing interest in mental health chatbots is driven by AI advancements that enable more nuanced conversations. However, how much therapy they can actually provide remains an open question.
Despite these advancements, chatbots cannot fully replace human therapists. Some of these tools struggle to assist patients in crisis, and they may not offer the personalized, reliably accurate advice that a human can.
Researchers are exploring various supportive roles that chatbots and artificial intelligence could fulfill in mental health care. For example, chatbots show potential in assisting individuals in assessing their need for care, directing them to appropriate resources, boosting mood, and helping them practice skills taught in cognitive-behavioral therapy.
Olga Troyanskaya, a computer science professor and AI expert at Princeton University who heads Princeton Precision Health, an interdisciplinary initiative focused on using technology to improve healthcare, emphasizes that there is significant demand for innovation in mental health care. But she also stresses the importance of balancing enthusiasm with caution, particularly when artificial intelligence is applied to mental health.
Displaying Empathy
Increasing access to mental health care is essential. In the United States, an estimated one in five adults, roughly 60 million people, were living with a mental health condition in 2021, according to the National Institute of Mental Health. Yet in many areas there are too few psychiatrists to meet the population's needs, and the providers who are available often do not accept new patients.
Efforts to enhance access to mental health care through digital tools such as online resources and mobile apps have been ongoing for years. Research indicates that digital interventions can assist individuals experiencing common mood and anxiety symptoms. However, they are typically more effective when supplemented by human involvement, with someone checking in on users.
Yet some older mental health apps experienced high abandonment rates, according to David Mohr, a professor at Northwestern University and director of the Center for Behavioral Intervention Technologies, partly because their responses were limited and repetitive.
Newer AI chatbots built on large language models can engage in more human-like conversations, opening new possibilities for mental health care applications. Some of these tools can mimic human empathy and comfort so convincingly that it becomes hard to tell human and AI-generated responses apart.
However, some experts are concerned about the lack of regulations, safeguards, and established best practices for using such technology in mental health. They are especially worried about chatbots offering advice, noting that a suggestion helpful to one person could harm another. According to Princeton's Troyanskaya, these tools must undergo rigorous testing in high-quality research, and guardrails need to be put in place wherever potential safety risks exist.
One major concern is the use of untested technology by individuals during periods of crisis or acute distress. According to some researchers, experiments with generative AI chatbots have shown that some of these chatbots rapidly start providing advice, not all of which is appropriate or safe.
Adding a disclaimer about a chatbot's limitations may not always help. If a user asks for advice on dealing with thoughts of self-harm and the chatbot responds with, "I cannot help you with your question; please seek professional therapy," that response does little for someone in crisis. Experts warn that people in crisis often find it difficult to ask for help, so turning them away in that moment could have dangerous consequences or deter them from seeking help in the future.
Northwestern's Mohr believes we're not close to the stage where AI can effectively mimic a therapist's role. He points out that there are still too many potential pitfalls and uncertainties in this regard.
Technique Training
However, chatbots are starting to fill valuable roles in other areas of mental health. One is guiding people who are reluctant to seek help toward accessible resources. Others include supporting people who are waiting to see a therapist or struggling to cope between therapy sessions.
Theresa Nguyen, a clinical social worker and Chief Research Officer at Mental Health America, a nonprofit organization providing free mental health tools, states, "I aim to alleviate distress during the period when individuals are seeking care and to equip them with tools to better prepare for therapy, thereby maximizing the benefits they receive."
Nguyen has collaborated with Tim Althoff, an assistant professor of computer science and engineering at the University of Washington in Seattle, and his team on a digital tool designed to assist people in learning and practicing "cognitive reframing," a skill commonly used in therapy.
The goal of cognitive reframing is for individuals to understand that their perception of a situation, not just the objective facts of the situation, affects their emotions. It has been demonstrated that changing or "reframing" this perspective can be beneficial in improving mood, including reducing symptoms of anxiety and depression. However, mastering this technique typically requires time, practice, and guidance.
The tool integrated into Mental Health America's website walks individuals through the process, providing suggestions on how they can alter their negative thoughts. Research conducted by the team has indicated that people find the tool beneficial for practicing cognitive reframing and report reduced anxiety levels after using it.
Becky Winegard, a 39-year-old paralegal from Grand Junction, Colorado, began using the tool last year, when an issue with her child left her experiencing daily panic attacks and taking medication for anxiety. Winegard said she had numerous spiraling thoughts and couldn't manage them.
Even though she was attending counseling sessions, she felt that the weekly meetings were insufficient. Upon discovering the tool on the Mental Health America website, she started using it to interrupt her spiraling thoughts. She would input her current worry, and the program would provide prompts and recommend techniques to help her reframe her thoughts, ultimately reducing her anxiety.
Winegard used the tool as many as six times a day for three months, until she felt confident in her ability to reframe her thoughts without assistance. She found it incredibly helpful; now she no longer needs it and can guide herself through the process.
Triage Tool
Meanwhile, in the UK, an AI chatbot created by mental health company Limbic is serving as a triage tool for the National Health Service's Talking Therapies program, designed to assist individuals dealing with anxiety and depression. The chatbot guides users by asking them questions to direct them to the suitable mental health care service.
Mona Stylianou, the principal clinical lead at Everyturn Mental Health, a nonprofit organization offering care through the NHS, states that the tool has enhanced patient access to care by providing a 24/7 self-referral process and a personalized method for assessing patients' needs.
Stylianou noted that patients can now access the help they require quickly and easily; as a result, waiting times have improved and people receive the appropriate assistance promptly.
Some experts say artificial intelligence is also showing potential for training mental health care providers. One example is Lyssn, a tool that uses AI algorithms to assess recordings or transcripts of clinical sessions and gives clinicians feedback on metrics such as how effectively they applied specific therapy strategies.
"It's crucial to be able to monitor and analyze the processes involved in psychotherapy," says Stephen Schueller, an associate professor of psychological science and informatics at the University of California, Irvine, and a licensed clinical psychologist. He suggests that AI-based tools could eventually help nonprofessional workers, such as crisis-line volunteers, use evidence-based techniques.