Will AI Help Address Our Behavioral Health Crisis?

Although still early in its evolution, artificial intelligence (AI) is beginning to assist some clinicians in how they diagnose and provide therapy for behavioral health patients.

The hope is that AI may help providers improve access for the growing number of patients who need care. The applications, both those developed in-house by health systems and those from innovative startups, are drawing interest from researchers, payers and investors alike.

Applying AI in Therapy

At Cedars-Sinai Medical Center in Los Angeles, for example, physician-scientists are studying what they say is a first-of-its-kind program that uses virtual reality and generative AI to provide mental health support.

Known as eXtended-Reality Artificially Intelligent Ally, or XAIA, the program offers users an immersive therapy session led by a trained digital avatar.

XAIA was created to offer patients self-administered, AI-enabled, conversational therapy in relaxing virtual reality environments, such as a creekside meadow or a sunny beach retreat, where patients also can engage in deep-breathing exercises and meditation. It employs a large language model (LLM) programmed to resemble a human therapist.

To develop the program, expert mock therapy sessions were performed and transcribed so the team could hear firsthand how a trained psychologist can, and should, interact with patients. From these sessions, recurring exchanges and patterns were identified and encoded into the LLM as rules and preferred responses for XAIA to emulate. The findings include more than 70 best practices for mental health therapy.
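
Cedars-Sinai has not published XAIA's code, but the approach described, encoding recurring expert exchanges as rules and preferred responses for an LLM to follow, broadly resembles system-prompt engineering. Below is a minimal sketch of that general pattern in Python, assuming the OpenAI chat API as a stand-in backend; the model name, rule text and `respond` helper are hypothetical illustrations, not XAIA's actual design.

```python
# Illustrative sketch only: XAIA's implementation is not public. This shows
# one generic way expert-derived best-practice rules could be encoded as a
# system prompt for an LLM-based conversational agent. The rules below are
# invented examples, not the 70+ practices identified by the researchers.
from openai import OpenAI

THERAPY_RULES = [
    "Open each session by inviting the user to describe how they feel today.",
    "Reflect the user's statements back in their own words before advising.",
    "Never diagnose; encourage consulting a licensed clinician for diagnosis.",
    "If self-harm is mentioned, immediately surface crisis-line resources.",
]

SYSTEM_PROMPT = (
    "You are a supportive conversational agent modeled on transcribed "
    "expert therapy sessions. Follow these rules:\n"
    + "\n".join(f"- {rule}" for rule in THERAPY_RULES)
)

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def respond(history: list[dict], user_message: str) -> str:
    """Append the user's message and return the agent's next reply."""
    history.append({"role": "user", "content": user_message})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    text = reply.choices[0].message.content
    history.append({"role": "assistant", "content": text})
    return text


history: list[dict] = []
print(respond(history, "I've been feeling anxious about work lately."))
```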

A study published in January in the research journal Nature found that study-group patients accepted the technology and that it was a safe form of AI psychotherapy warranting further research.

Early Identification for Better Outcomes

At the Cincinnati Children’s Hospital Decoding Mental Health Center, researchers are working with scientists from the University of Cincinnati, University of Colorado and Oak Ridge (Tennessee) National Laboratory to use AI to identify the risk of anxiety, depression and suicide in children and teens.

The scientists are developing tools that combine information physicians routinely collect, then use AI and natural language processing (NLP) to sift through the data and identify the children at greatest risk of developing a mental illness so they can receive earlier care and intervention. The team has developed software that learns from unstructured data, such as written texts or spoken words, to identify key features described in the health record and other data.
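
The center's software is not publicly available, but a minimal sketch can illustrate the general idea of learning risk signals from unstructured clinical text. The sketch below uses scikit-learn's TF-IDF features and logistic regression as generic stand-ins for whatever models the team actually uses; the note snippets and labels are invented for illustration.

```python
# Illustrative sketch only: a generic text-based risk model, not the
# Decoding Mental Health Center's actual software. The toy notes and
# labels below are fabricated solely to make the example runnable.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical de-identified note snippets with chart-review labels
# (1 = later developed a mental health condition, 0 = did not).
notes = [
    "patient reports persistent sadness and withdrawal from friends",
    "routine visit, no behavioral concerns noted",
    "teen describes trouble sleeping and frequent worry about school",
    "annual physical, active in sports, mood normal",
]
labels = [1, 0, 1, 0]

# TF-IDF turns free text into numeric features; logistic regression
# then scores how strongly those features predict the risk label.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

new_note = "reports low mood and avoiding social activities"
risk = model.predict_proba([new_note])[0][1]
print(f"Estimated risk score: {risk:.2f}")  # could flag earlier follow-up
```

In practice, a model like this would be trained on large volumes of de-identified records and validated against clinician-confirmed outcomes before it informed any care decisions.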

Meanwhile, the British startup Limbic has delivered AI-powered psychological assessment and triage tools in large-scale clinical settings within the U.K.’s National Health Service. The company reports 45% fewer treatment changes as a result of improved triage accuracy.

The company claims its e-triage tool achieves 93% accuracy across the eight most common mental health disorders, including depression, anxiety and post-traumatic stress disorder. The tool, Limbic Access, augments clinician assessments, saving 12.7 minutes per referral and reducing wait times for screening and treatment. The company plans to expand in the U.S.

Supporting Overstretched Clinicians

Elsewhere, the Boston-based behavioral health startup Eleos Health uses AI, voice analysis and NLP to improve patient outcomes and workflow efficiency. The company is partnering with the National Council for Mental Wellbeing to increase clinician access to AI education, research and provider-centric tools.

Used by behavioral health organizations in 29 states, Eleos’ clinically validated technology reduces provider documentation time by more than 50%, doubles client engagement and drives three to four times better care outcomes, the company states.

4 Challenges for the Future of AI Deployment in Behavioral Health

1 | In any application of AI, especially in direct clinical interactions, does the clinician feel comfortable with the level of human oversight for the treatment or interaction?

Is the human “in the loop,” and does the clinician believe that the potential benefits to care clearly outweigh the risks of using AI in these cases?

2 | Will patients share their deepest and most personal feelings with an avatar or chatbot?

3 | Can AI overcome bias?

A World Health Organization report from last year found that there are still significant gaps in our understanding of how AI is applied in mental health care. The report cited flaws in how existing AI health care applications process data and insufficient evaluation of the risks around bias.

4 | How will AI address subjective judgment?

Diagnosing behavioral health issues often requires more subjective judgment than diagnosing physical conditions does. Decisions must be based on patients’ self-reported feelings rather than on medical test data, noted futurist Bernard Marr in a Forbes report in July. This can create more uncertainty around diagnosis and calls for careful monitoring and follow-up to ensure the best patient outcomes.

Learn More

The AHA Behavioral Health website provides a wealth of resources about child and adolescent health and the AHA’s behavioral health strategic priorities.