3 Ways AI Could Aid Behavioral Health Screenings

Using artificial intelligence (AI) to supplement traditional behavioral health screenings is gaining momentum in primary care.

Some areas being explored include: predicting the risk that adolescents will experience mental illness; reducing readmissions by screening patients for opioid-use disorder and treating those who test positive; and implementing AI therapy chatbots to supplement cognitive therapy.

The tools typically are designed to address common challenges providers face, such as workforce shortages and the need to improve efficiency. And while many of these applications can extend access to care, they are not a replacement for providers. Here are several recent developments that caught our eye.

1 | Predicting Risks, Potential Causes of Adolescent Mental Illness

An AI model developed by Duke Health researchers accurately predicted which adolescents were at high risk for serious future mental health issues before their symptoms became severe.

Unlike prior models that rely primarily on existing symptoms, the AI model identified underlying causes, such as sleep disturbances and family conflict, that can be targeted with preventive interventions. The capability to identify early warning signs and proactively intervene with prophylactic treatments could greatly expand access to mental health services, with assessments and care available through primary care providers, researchers said.

The AI model could be used in primary care settings, enabling pediatricians and other providers to know immediately whether the child in front of them is at high risk and empowering them to intervene before symptoms escalate, notes Jonathan Posner, M.D., professor of psychiatry and behavioral sciences at Duke and senior author of a study published recently in Nature Medicine.

Posner and colleagues analyzed psychosocial and neurobiological factors associated with mental illness using data from an ongoing study that conducted psychosocial and brain development assessments of more than 11,000 children over five years.

Using AI, the researchers built a neural network (a model that mimics brain connections) to predict which children would transition from lower to higher psychiatric risk within a year. The model is then used to score a questionnaire of patient or parent responses about current behaviors, feelings and symptoms, predicting the likelihood of an escalation.
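
To make this concrete, the sketch below shows in Python the general shape of such an approach: a small feed-forward neural network trained on questionnaire-style features to output an escalation-risk score. The features, synthetic data and model size are illustrative assumptions, not the Duke model.

```python
# Illustrative only: a small feed-forward neural network classifier that
# predicts whether a child transitions from lower to higher psychiatric risk
# within a year, using synthetic questionnaire-style features. The feature
# set, data and architecture are hypothetical and do not reflect the Duke model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical psychosocial features (e.g., sleep disturbance, family conflict,
# current symptom scores) for 1,000 simulated children.
n = 1000
X = rng.normal(size=(n, 6))

# Simulated label: 1 = escalates to higher psychiatric risk within a year.
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2]
y = (logits + rng.normal(scale=1.0, size=n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
model.fit(X_train, y_train)

# Score new questionnaires: the predicted probability ranks how likely an
# escalation is within the next year.
risk = model.predict_proba(X_test[:5])[:, 1]
print("Predicted escalation risk:", np.round(risk, 2))
```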

Takeaway

The model was 84% accurate in identifying patients in the study whose illness escalated within the next year, the study found. Duke researchers also analyzed an alternative model that identified the potential mechanisms that might lead to or trigger worsening mental illness. Although its accuracy rate was lower, at 75%, that model's ability to identify underlying causes can alert doctors and families to potential interventions, researchers conclude.

2 | Reducing Readmissions by Screening for Opioid-Use Disorder

The National Institutes of Health April 3 released a study that found that an AI screening tool was as effective as health care providers in identifying hospitalized adults at risk for opioid-use disorder and referring them to inpatient addiction specialists.

When compared with patients who received consultations with providers, patients screened by AI had 47% lower odds of hospital readmission within 30 days after their initial discharge, saving nearly $109,000 in care costs.

The study, published in Nature Medicine, reports the results of a completed clinical trial, demonstrating AI’s potential to affect patient outcomes in real-world health care settings. The study suggests that investment in AI may be a promising strategy specifically for health systems seeking to increase access to addiction treatment while improving efficiencies and saving costs.

The AI screener was built to recognize patterns in data, much as our brains process visual information. It analyzed all documentation available in the electronic health record in real time, including clinical notes and medical history, to identify features and patterns associated with opioid-use disorder. When it flagged a patient, the system issued an alert to providers upon opening the patient's medical chart, with recommendations to order an addiction medicine consultation and to monitor and treat withdrawal symptoms.
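
As a rough illustration of that workflow, the hypothetical Python sketch below screens a patient's documentation and, if patterns associated with opioid-use disorder are found, produces a chart alert recommending an addiction medicine consultation. The keyword-based check is a simplified stand-in for the study's learned model; the function names and indicator list are assumptions.

```python
# Illustrative only: a simplified stand-in for an EHR screening step. The real
# screener in the study used a learned model over full EHR documentation; here,
# a hypothetical keyword scan over clinical notes triggers a chart alert
# recommending an addiction medicine consultation and withdrawal monitoring.
from dataclasses import dataclass

OUD_INDICATORS = {"opioid misuse", "heroin", "fentanyl", "oxycodone overuse", "withdrawal"}

@dataclass
class ChartAlert:
    patient_id: str
    message: str

def screen_patient(patient_id: str, clinical_notes: list[str]) -> ChartAlert | None:
    """Return an alert if documentation contains patterns associated with
    opioid-use disorder; otherwise return None."""
    text = " ".join(clinical_notes).lower()
    if any(indicator in text for indicator in OUD_INDICATORS):
        return ChartAlert(
            patient_id=patient_id,
            message=("Possible opioid-use disorder: consider ordering an addiction "
                     "medicine consultation and monitoring/treating withdrawal symptoms."),
        )
    return None

# Example: the alert would surface when a provider opens the patient's chart.
alert = screen_patient("pt-001", ["Patient reports daily oxycodone overuse and prior withdrawal."])
if alert:
    print(alert.message)
```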

Takeaway

The trial found that AI-prompted consultation was as effective as provider-initiated consultation, ensuring no decrease in quality while offering a more scalable and automated approach. Specifically, the study showed that 1.51% of hospitalized adults received an addiction medicine consultation when health care professionals used the AI screening tool, compared with 1.35% without the assistance of the AI tool. Additionally, the AI screener was associated with fewer 30-day readmissions, with approximately 8% of hospitalized adults in the AI screening group being readmitted to the hospital, compared with 14% in the traditional provider-led group.

3 | Deploying AI Therapy Chatbots vs. Standard Cognitive Therapy

Generative AI (GenAI) chatbots hold promise for building highly personalized, effective mental health treatments at scale, while also addressing user engagement and retention issues common among digital therapeutics, notes a recent NEJM AI study.
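
For readers curious how such a chatbot might be wired together, the hypothetical Python loop below maintains a conversation history and hands each user message to a placeholder generate_reply() function standing in for a fine-tuned generative model. It is a minimal sketch under those assumptions and does not describe Therabot's actual implementation.

```python
# Illustrative only: a bare-bones session loop for a generative AI therapy
# chatbot. The system prompt, history handling and generate_reply() stub are
# hypothetical and do not describe Therabot or any specific product.
SYSTEM_PROMPT = (
    "You are a supportive mental health assistant. Draw on cognitive-behavioral "
    "techniques and encourage users to seek professional care when appropriate."
)

def generate_reply(system_prompt: str, history: list[dict]) -> str:
    """Stand-in for a call to a fine-tuned generative model."""
    # In practice this would send the prompt and history to a model endpoint
    # and return its response; a canned reply keeps the sketch runnable.
    return "Thanks for sharing that. Can you tell me more about how that felt?"

def chat_session() -> None:
    history: list[dict] = []
    while True:
        user_msg = input("You: ").strip()
        if user_msg.lower() in {"quit", "exit"}:
            break
        history.append({"role": "user", "content": user_msg})
        reply = generate_reply(SYSTEM_PROMPT, history)
        history.append({"role": "assistant", "content": reply})
        print("Bot:", reply)

if __name__ == "__main__":
    chat_session()
```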

The randomized controlled trial of Therabot, conducted by Dartmouth College researchers, found "significantly greater reductions of symptoms" among participants with major depressive disorder or generalized anxiety disorder and those at high risk for eating disorders.

Trial participants felt they could trust the therapy chatbot to a degree that was comparable to working with a real therapist, notes a press release from Dartmouth.

Takeaway

Fine-tuned GenAI chatbots offer a feasible approach to delivering personalized mental health interventions at scale, but further research with larger clinical samples is needed to confirm their effectiveness and generalizability, the study notes.

Michael Heinz, M.D., the study’s first author and an assistant professor of psychiatry at the Dartmouth College Center for Technology and Behavioral Health and the Geisel School of Medicine, said that “no generative AI agent is ready to operate fully autonomously in mental health.” He highlighted, “We still need to better understand and quantify the risks associated with generative AI used in mental health contexts.”


Learn More

Visit the AHA Behavioral Health website to access a wealth of resources, including reports on child and adolescent mental health, rural behavioral health issues and more. Also, read the AHA Insights report “Building and Implementing an Artificial Intelligence Action Plan for Health Care” for information on how AI can transform your operations.
