The landscape of behavioral health is always changing, but two problems stubbornly persist:
- workforce shortages, and
- the ever-growing administrative burden on providers.
In our recent webinar with the National Council for Mental Wellbeing, industry leaders came together to explore how artificial intelligence (AI) is turning the tide on both.
As this panel of experts explained, AI can streamline documentation, enhance clinical decision-making, and even transform how behavioral health professionals do their jobs—all without disrupting or replacing the crucial human connection inherent to mental healthcare.
These experts included:
Amanda Rankin, a licensed clinical social worker and Customer Insights Lead at Eleos, who shared her on-the-ground experience working with clinicians to design AI tools that actually make a difference.
Matt Spencer, Chief Clinical Officer at GRAND Mental Health, who offered real-world examples of how AI has streamlined workflows and saved valuable time in his organization.
Dennis Morrison, PhD, a clinical consultant and psychologist, who provided insights on the ethical implications of AI and its potential to augment—rather than replace—human expertise in the field.
You can watch the full webinar recording here—or keep reading for a quick summary of key takeaways!
AI as a Partner in Care (Not a Replacement)
While the webinar focused on leveraging AI to support the behavioral health workforce, another theme kept popping up: the abilities and limitations of AI technology.
Artificial intelligence is becoming more common in healthcare—including behavioral health—but it’s important to understand its role in the care process. AI isn’t equipped to take over for therapists or other care providers. It’s a tool meant to assist those providers in their work—not replace their expertise.
As Morrison explained during the webinar, “The American Medical Association has actually said, every instance of artificial intelligence in healthcare serves to advise clinicians…it’s always the clinician making the final call.” In other words, AI can suggest next steps or flag important details, but care decisions still belong with the clinician.
Rankin also spoke to this point, saying, “AI is not here to replace anybody—it’s more of a support to unburden them from some of those manual workflows and processes.” So, while AI certainly helps reduce administrative tasks—giving clinicians more time to focus on care delivery—it can’t do the care delivery itself.
Ultimately, AI complements the work clinicians do. It helps them do their jobs more efficiently, but they, as human clinical experts, remain responsible for all care decisions—just as they've always been.
AI as a Detail-Noticer
Even in one-on-one therapy sessions, the amount of information being exchanged between the provider and the client can be overwhelming. Emotional cues, recurring themes, and essential details for documentation are all happening at the same time.
Bringing in another person to help track everything would be useful, but therapists can't do that without compromising client confidentiality and disrupting the session. AI, however, can analyze the conversation invisibly, often identifying key moments that might otherwise go unnoticed.
During the webinar, Morrison highlighted a real-life case that illustrates this perfectly, recalling a provider who realized—after reviewing an AI summary of a client session—that the client had referenced grief multiple times. The therapist had missed this pattern in-session, but thanks to the AI analysis, she referred the client to the necessary grief counseling services.
By drawing attention to these kinds of details, AI helps therapists make sure nothing slips through the cracks, even in the most complex or fast-paced sessions.
AI as a Burnout Reducer
AI isn't just good at picking up on clinical details; it can also take over some of the more tedious responsibilities that come with clinical work. The experts on the panel emphasized the extensive time and energy providers typically spend on paperwork, which adds to their stress and fatigue.
“The documentation requirements in our industry are just onerous,” Morrison explained. “I mean, they just are—more so than almost any other aspect of healthcare that I’m aware of.”
In fact, according to a recent National Council survey, 33% of providers spend the majority of their time on administrative tasks. Furthermore, 43% say they must work extra hours to meet those demands—and 47% say most of the admin work they do feels unnecessary.
All of that leads to burnout, causing many providers to quit their jobs—or leave the industry altogether. By decreasing the time providers spend on such energy-draining administrative tasks, AI is helping to improve retention rates across the behavioral health workforce.
Spencer shared that since implementing the Eleos AI platform at GRAND, time spent on admin work has plummeted. “Most of our sessions are being documented in under 3 minutes now, down from the typical industry standard of about 15 minutes, which is just phenomenal,” he said. This allows clinicians to focus more on client care and less on paperwork, which in turn contributes to a more sustainable work environment.
Spencer pointed out that this reduction in administrative time doesn’t just help clinicians; it also contributes to better client care. “It either shortens the amount of session time needed and gives more time back to the clinician, or it gives you more time to focus on the current session that you’re in,” he said.
Learn more about GRAND Mental Health’s time savings with Eleos.
AI as a Compliance Enabler
AI is also making waves in compliance, changing how organizations manage and audit the quality of their clinical documentation. Spencer shared how AI has shifted GRAND’s compliance management process.
“Historically, we have auditors that work internally with us, and their goal is to audit between 5% and 10% of our charts,” he said. “Now, with the utilization of AI, we’re able to audit 100% of the progress notes… That CQI team that we have, they are now… able to work with that staff and their leaders … to actually heal and improve the documentation.”
By switching to this model, organizations can catch more compliance issues in a more timely manner—and free up more time to proactively help staff improve their documentation. AI can review every single note for technical details, so human auditors can focus on actually improving note quality across the organization—not just catching issues after they happen.
On top of that, AI note-writing tools like Eleos help ensure provider documentation is compliant from the start by generating complete, thorough, clinically relevant notes that are unique to each individual client session.
AI as a Fully Embraced Organizational Tool
Implementing a new AI solution isn’t as simple as flipping a switch. The panelists agreed that for AI to truly support staff and clients, it needs to be introduced thoughtfully. This means early preparation, solid training, and involvement of key team members from the start.
Rankin stressed the importance of ensuring staff are well-trained and well-informed. “Make sure your folks know about this well in advance,” she said. “The day they show up for training should not be the first day they hear about the rollout of AI.”
Rankin and the other speakers said that getting early adopters and internal champions on board quickly is one of the most effective ways to help a team take full advantage of AI. Staff advocates smooth out the adoption process by showing their peers how AI can actually lighten workloads, making it easier for everyone to embrace the technology as a helpful tool (instead of just another task to manage).
Get your complete guide to behavioral health AI implementation.
AI and Ethics
In addition to talking about what AI technology can do, the panelists devoted a good portion of the conversation to explaining what it should do. After all, AI doesn’t come without ethical concerns. Bias can emerge within algorithms, subtly shaping clinical feedback in ways that even the most experienced providers might miss.
“You may have individuals who are underserved in the data,” Spencer said. “The approach that the AI is heading towards may be different than what’s actually accurate for that client or clinician.”
AI’s blind spots aren’t always obvious, but they are real. And these blind spots can be particularly damaging for marginalized groups whose experiences might not be adequately represented in the datasets the AI draws from. This means providers need to stay vigilant and be aware that the data streamlining their workflows may not always capture the nuances of a client’s story.
AI and the Future of Behavioral Health
The potential for AI in behavioral health extends way beyond documentation. As Rankin noted, AI’s ability to process large amounts of data presents opportunities for analyzing trends more deeply, flagging risks, and improving care on a broader scale.
“AI is going to be better at reviewing large amounts of information and looking for trends,” she explained. “I think this is where AI is really going to excel as we move forward.”
By identifying patterns at a population level, AI could significantly change the way therapists plan and deliver care. It’s just one more way AI can augment the work of human providers. Ultimately, the true value of AI comes from its ability to handle the complexities of data and free clinicians to do what they do best—help people.
While workforce shortages and provider burnout continue to plague the behavioral health industry, AI offers a new ray of hope.
Want to get even more expert-tested ideas on how to solve workforce issues in your organization? Download your free copy of our Guide to Building a “Best Place to Work” in Behavioral Health.