A decade ago, the idea of AI helping therapists might have sounded like fantasy—or perhaps science fiction. Even as recently as 2020, technology in the therapy room was limited to basic scheduling tools and lackluster electronic health records (EHRs). But today, cutting-edge AI technology is completely transforming the way therapists work, effortlessly handling the kinds of tasks that once ate into evenings and weekends—almost like an always-available, never-tired virtual assistant.
“AI isn’t just a robot following instructions,” Samuel Jefroykin, Director of Data & AI Research at Eleos, explained in a recent episode of the No Notes podcast. “It’s more like a partner you can teach, one that learns and helps you do your work better.”
While chatting with host Dr. Denny Morrison, Jefroykin unpacked how behavioral health AI tools are already reducing late nights spent on documentation and creating more time and space for therapists to focus on clients. Their discussion broke down how AI works, what it can and can’t do, and the ways it’s helping clinicians work smarter, not harder.
Didn’t have time to tune in? No worries—keep reading for the highlights!
Want to listen to the AI 101 podcast episode in full? Check it out here.
What is AI? How is it different from automation?
Familiar therapy technology—like scheduling tools and EHRs—runs on simple automation, following rigid if/then rules. That functionality is essentially fixed, no matter how you use it. AI, on the other hand, is more nuanced and flexible: it can actually get better over time, because it learns as you use it.
“Back in the day, we tried a lot to make everything automatic: ‘If this happens, then that happens,’” Jefroykin explained. “The key component in artificial intelligence is what we call machine learning. We created an algorithm that can replicate humans by learning from them…we show an example, explain how to do something, and then the machine can reproduce it.”
Unlike older systems that give the same result every time for any given action or input, AI learns and changes with experience. As Morrison explained, “[With traditional software], if I had the same session with the same client and we talked about the same things, the output would be identical the second time…[but] it never is in the case of artificial intelligence.”
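To make that if/then distinction concrete, here's a toy Python sketch (purely illustrative; this is not how Eleos or any clinical system actually works). The first function is fixed automation: a hard-coded rule that behaves identically forever. The second derives its behavior from example data, which is the basic idea behind machine learning.

```python
# Fixed automation: a hard-coded if/then rule.
# It produces the same answer for the same input, every time, forever.
def rule_based_reminder(days_until_appointment):
    return days_until_appointment <= 1  # threshold is baked in

# A toy "learning" step: instead of hard-coding the threshold,
# derive it from labeled examples of (days_out, reminder_was_helpful).
def learn_reminder_threshold(examples):
    helpful_days = [days for days, helpful in examples if helpful]
    return max(helpful_days)  # behavior now comes from the data

examples = [(5, False), (3, False), (2, True), (1, True)]
learned_threshold = learn_reminder_threshold(examples)  # 2 in this case

def learned_reminder(days_until_appointment):
    return days_until_appointment <= learned_threshold
```

Feed the learned version different examples and its behavior changes; the rule-based version never will. Real machine learning systems work the same way in principle, just with far richer data and models.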
Artificial Intelligence vs. Augmented Intelligence
At Eleos, we firmly believe behavioral health AI cannot replace behavioral health providers. Their work is simply too complex—and too deeply rooted in the intricate nuances of the human body and mind. Instead, our tools are built with the goal of enhancing clinicians’ work—a distinction captured by the term augmented intelligence.
“Artificial intelligence is to replicate a task by a human,” Jefroykin explained. “So you can use it to replace a human doing a task, or you can augment the human while they are doing their task.”
Augmented intelligence, he continued, “gives the right tools for you, with this assistant, with this augmented AI system, to create the best notes together. So you take the best from both.”
Morrison expanded on this human-machine partnership, stressing that these tools exist to help providers do better clinical work, not to do that work for them: “Everything that Eleos does, and I think pretty much the whole industry does, is to help clinicians do their jobs better. It gives them information so that they can make better decisions. It’s not making decisions for them.”
Jefroykin compared AI’s role to the introduction of calculators for mathematicians. “You know that when mathematicians got calculators or got a computer, they were augmented by the machine,” he said. “Because we gave them an assistant—something that can do a lot of tasks for them.”
In the case of behavioral health AI, those tasks are the mundane operational and administrative to-dos that often prevent therapists from giving their full attention and focus to client care. The goal, as Morrison put it, is simple: “People didn’t get into this field because they like to write notes. They got into this business because they wanted to help people. So we’re helping them help people more.”
Strengths and Limitations of AI in Behavioral Health
AI offers some impressive benefits, but it is not without its flaws. So, Jefroykin and Morrison parsed out the strengths and weaknesses of AI—because understanding both is crucial to making good use of these tools.
Strengths: AI Doesn’t Get Tired
One of AI’s biggest strengths is its consistency. Unlike humans, AI doesn’t lose focus or overlook subtle details—it’s always ready to notice patterns and flag insights.
“The AI system is in good shape 100% of the time,” Jefroykin explained. “It’s not sick. It’s not tired. It doesn’t have all those human factors that can lead you to miss something.”
AI also handles data at a scale no human could match. “AI systems can handle more therapy sessions than any therapist in their life,” Jefroykin said. This makes it easier for AI to spot patterns or details that might otherwise go unnoticed.
Morrison added that the extra perspective provided by AI is almost like having a colleague on hand. “It’s a lot like having another clinician in the room to give you some feedback from their perspective,” he said. And that second set of eyes can go a long way toward improving care quality.
Limits: AI Isn’t Perfect—or Human
Even with these strengths, AI has limits that prove exactly why it is merely a tool—not a replacement for human providers. “With AI, you live in a world of errors,” Jefroykin said. “There are always errors. Sometimes you don’t notice the error, or sometimes it’s really small.” And without a trained provider overseeing the software, those errors—including highly concerning issues like bias and hallucinations—can go unnoticed and unchallenged.
Morrison compared AI to psychological tests: “No test is perfect,” he said. “That’s why you have reliability and validity coefficients that tell you what the probability [of error] is.”
Then there’s the human side of therapy. AI isn’t capable of feeling empathy or understanding the small nuances that make each client unique—especially over the course of several therapy sessions. “How can AI scan the room and understand the mood of someone?” Jefroykin asked. “Sometimes it’s really small, because you need to meet the person 10 times to understand exactly the mood of this person, that something changed.”
Purpose-Built AI vs. General Models
While AI’s potential is huge, not all AI systems are created equal. ChatGPT and other general AI tools might be fun to play with, but they’re not practical for clinical settings. “ChatGPT can help you rephrase. It can help you do a lot—[it does] really good content writing,” Jefroykin said. “But it’s not the best to analyze a session.”
Why? First and foremost, general models like ChatGPT aren’t HIPAA-compliant and can’t integrate seamlessly into a clinician’s workflow. “You have some PHI issues whenever you send [information] to a free account on ChatGPT,” Jefroykin explained. “It’s not HIPAA-compliant. That means it’s not great security-wise to share your client’s information within the system.”
In contrast, Eleos Health’s AI was explicitly designed for behavioral health, which means it meets the same privacy and security standards as any other therapy software—including EHRs and EMRs. “Eleos is built for clinical use,” Morrison explained. “It’s based on clinical transactions, the language of mental health. You don’t have to re-explain everything every time.”
Morrison also drew a parallel to another specialized field: radiology. “Would a radiologist use a general AI tool like ChatGPT and say, ‘Is this cancer or not?’ and trust it? No,” he said. “They’d use a purpose-built tool. It’s exactly the same for therapy.”
The Future of AI in Behavioral Health
Jefroykin and Morrison believe provider use of AI will shift dramatically as the technology continues to evolve and improve. Right now, most AI tools are reactive—helping with tasks after the fact, in response to a prompt (e.g., generating notes or analyzing sessions). But the future will bring AI tools that are much more proactive.
“With a passive AI system, you have to trigger it, and it answers,” Jefroykin said. “The next step is moving to an active system, where it analyzes what’s happening and then [automatically] takes action.”
Jefroykin painted a picture of AI becoming an active participant in the care process. “You’re having a session, you finish, and because the AI listened, it can write the notes, send them to your EHR, update your schedule, and send referrals without you asking,” he said.
Dr. Morrison described this potential future as “an EHR with no keyboard,” adding that providers “won’t even need to type or interact with the software; the AI will just know what to do.” In this type of environment, rather than having to navigate computer systems or manually enter data, therapists will simply rely on AI to take care of all those little administrative actions. “The AI becomes like an assistant that doesn’t need you to say, ‘Go do this.’ It understands what you need done and takes care of it, so you can focus on helping people,” Morrison said.
But even as we move closer to this future, one principle will remain constant: AI won’t replace therapists—rather, it will help them work better and smarter.
For more great insights from Jefroykin and Morrison, tune into their full podcast episode here.
Curious how purpose-built AI can simplify your work and create more time for client connection? Request a demo of the Eleos platform here.