When you’re excited about something new—whether it’s a tech solution or a relationship—it’s easy to ignore the red flags.
But as we all know, turning a blind eye to warning signs today sets you up for a world of heartache tomorrow.
To help you avoid an AI bad romance, we rounded up some of the most knowledgeable and experienced IT experts in behavioral health and asked for their advice on the must-haves—as well as the absolutely-nots—to look out for when evaluating potential AI partners.
Read on for a quick summary of what they had to say, and be sure to check out the full recording of their discussion here.
Red Flag #1: Barriers to Operationalizing the Technology
We’ve all been duped into buying some gadget or gizmo that promises to make our lives better, only to discover that using it is too complicated or inconvenient to be worthwhile—or that it doesn’t work the way we thought it would. (Personally, most of mine have ended up in the Cupboard of Misfit Kitchen Appliances.)
Because AI is a relatively new component of the tech stack at most behavioral health organizations, it’s really important to make sure you’re ready to operationalize it before you bring it into the fold. Your AI vendor should do a lot of that planning and mapping for you, especially when it comes to things like integration, setup, and user training. And ideally, they’ve built their software with operationalization in mind—by making it embeddable and EHR-agnostic, for example.
“From a pure IT perspective, alignment to existing tech is critical,” said Rony Gadiwalla, CIO of GRAND Mental Health. “It can make or break a project.”
Gadiwalla encourages behavioral health leaders to consider the long-term viability of the technology they choose to invest in—because in reality, many systems are never fully operationalized or sustained beyond the one-year mark.
EHR Alignment
In the case of AI, Gadiwalla specifically pointed to EHR alignment as a major factor in sustainability.
“The browser-based, embedded approach to the way Eleos integrates with the EHR system is fantastic, because you need little to no coordination with the EHR partner—and we all know how complicated things can get when you’re waiting on this giant EHR to start working with your workflows,” Gadiwalla said.
Less integration complexity also means faster time to value and fewer cooks in the kitchen, so to speak. “It reduces the number of people you have to bring to the table to make something successful,” he explained.
Workflow Alignment
Another key to operationalizing—and thus, realizing the greatest possible value of—a behavioral health technology is selecting one that works for as many use cases as possible. The more providers, specialties, and settings the tool can serve, the greater your adoption and usage will be.
And speaking of working for providers, one of the most important—if not the most important—considerations when evaluating any technology solution is how it fits into the provider workflow. The more you ask staff to deviate from their established processes to use the tool, the less likely they are to stick with it.
In GRAND’s case, that meant customizing their use of Eleos to the collaborative documentation workflow their providers were already using. “Eleos had to come in and help us build that workflow, because not everybody has that,” Gadiwalla said. “They were able to take what we were doing before but augment it with the AI.”
Care Improvement
Brandon Ward, PsyD, Chief Innovation Officer & VP of Information Systems at Jefferson Center for Mental Health, said AI tools like Eleos actually help restore the human-to-human nature of behavioral health—something that’s gotten lost with the proliferation of EHR systems and documentation requirements.
“Of the AI tools out there related to documentation, the thing I like the most about Eleos is that it actually allows for folks to be able to give up some of that concurrent documentation and be really present with the human in the room, which is of course fundamental to what we’re doing as behavioral health providers,” Ward said. “For me, that’s a huge workflow win, because it doesn’t just make things faster; it has the potential to actually improve the quality of the intervention.”
Red Flag #2: Hands-Off Onboarding and Training
By now, most healthcare providers have used an EHR or EMR. Even telehealth platforms are becoming more commonplace. But AI is still relatively new to most providers, which means implementing an AI system requires a deeper level of user training and education.
At the same time, providers are busy people. They’re typically not champing at the bit to learn a brand-new system, even if they can see the long-term benefit of it. That’s why Cally Cripps, VP of IT and Analytics at Aurora Mental Health & Recovery, says a balanced approach to user training is critical. Relying solely on self-serve options like videos and user guides is risky for time-strapped providers, which is part of why she found the Eleos onboarding experience so refreshing.
Intuitive User Experience
“Training needs to be almost seamless,” Cripps said. “We want the software to be fairly intuitive. And we found with Eleos that it was a very low bar to entry.”
Cripps says Aurora’s initial user group was up and running after one in-person training session—and it wasn’t long before more staff members caught wind of Eleos through word-of-mouth and expressed interest in giving it a try.
Comprehensive Onboarding and Education
Still, it’s important to prepare for a healthy amount of skepticism from providers whose preconceived notions about AI technology have made them resistant to it—or even afraid of it. That means your vendor must not only train people on how to use the technology, but also address questions and concerns about the technology and how it impacts client care.
“Given all the buzz around AI and privacy, it is expected that people are going to be concerned about the introduction of AI in their therapy sessions,” Gadiwalla said. “We wanted to make sure that our team members were comfortable using the product. It was really important for them to understand what the product is—and what it’s not—and how it’s going to impact the quality of care we provide.”
Personal Approach
While he admits it was a challenge to earn the trust of provider staff, Gadiwalla is quick to point out that Eleos was instrumental in helping his team overcome that barrier.
“The Eleos team was with us the whole way,” Gadiwalla said. “They brought their passion to the table and engaged our clinicians to help them understand how Eleos would work in our environment.”
And ultimately, he says that even though GRAND “had some naysayers at the very start, it didn’t take us too long to get them on board.”
Red Flag #3: Lack of Post-Training Communication
If a vendor peaces out post-training, leaving you nothing more than some tutorials and a laundry list of links to help articles, then you’re in trouble.
“I think we can all relate to how frustrating technology in general can be in our daily lives,” Ward said. “And by the time any of us reach the point of asking for help, it’s really hit a certain level.”
Quality Support
That’s why, in Ward’s eyes, the attitude, specialized knowledge, availability, and response time of your AI vendor’s support team are absolutely critical.
“Eleos does this well,” Ward said. “When we ask for help, there’s a genuine spirit of, ‘Let’s get this fixed.’”
Cripps says the quality of support Aurora received from the Eleos team was the main factor that enabled the organization to move forward with their implementation despite having only one project manager with minimal tech expertise to lead the rollout. “We don’t have a ton of internal resources,” she said. “Eleos took so much pressure off of us internally so we could focus on the engagement piece.”
Proactive Check-ins
In addition to providing a dedicated support resource for the questions that are bound to pop up post-rollout, your AI vendor should also proactively check in with you at set intervals to collect feedback, go over adoption and usage, report on impact and results, and help you address any issues that may arise.
The team at GRAND, for example, meets with their Eleos success team on a regular basis to go over metrics related to provider adoption, usage, and engagement. “That helps us improve the operationalization of the product over time,” Gadiwalla said.
Performance Data and Analysis
And on that note, make sure the vendor actually tracks this kind of data. Otherwise, you’ll be left in the dark when it comes time to assess the ROI of your investment.
“It’s not about simply deploying something,” Gadiwalla continued. “The goal is to make it sustainable and make sure that it remains relevant to your organization. One of the things that I feel really good about with the Eleos tech is that we’re able to get the right stats.”
Additionally, Gadiwalla said the Eleos team helps GRAND interpret that data and adjust strategies accordingly.
“The important thing is that you have a partner that’s willing to stick with you,” Gadiwalla said. “It’s easy to replicate tech, but it’s the work culture, the team-building, the people that really make or break it.”
Red Flag #4: Missing Security Certifications
To health IT pros, this probably sounds more basic than a pumpkin spice latte—but we’d be remiss not to mention it, especially considering that many new companies (particularly tech startups) deprioritize security certifications in the name of rapid development. Too often, it’s seen as something that “we’ll get to later”—once the platform is built out.
But in health care, that’s not a risk you can afford to take. You want a tech partner who understands and respects the importance of privacy and security as much as you do—and who has the safeguards in place to prove it. That means working only with AI partners who have things like HIPAA, HITRUST, and SOC 2 on lock—along with clear processes and protocols around data retention, anonymization, and tokenization.
“It’s safe to say that if a product does not meet our security and compliance requirements, it’s not even part of the conversation,” Gadiwalla said. “We just can’t have the attitude of, ‘Let’s get it off the ground, and we’ll figure it out as we go.’”
At Jefferson Center, Ward says privacy was a chief concern among providers being introduced to Eleos. “It’s a healthy question that we all should be asking these days of companies that use our data,” he said. “All of that goes back to building trust. Our providers need to trust that the information is being used responsibly.”
For a more comprehensive breakdown of all the questions providers should be asking about any AI tool they are introduced to, check out this blog post. (And if you’re in a leadership role, make sure you’ve checked those boxes before rolling out a new tool to your provider staff.)
Red Flag #5: “Canned” AI Output
Not all AI is created equal. It’s easy to slap an AI label on something like dot phrasing and call it intelligent, but template-based AI tools and true generative AI technology are about as similar as an airliner and a rocket ship.
Dot phrases (also known as “smartphrases” or “autotext”) can be helpful, and they do help cut down on provider documentation time. But this type of shortcut might end up costing you in the long run, as it relies on “canned,” pre-programmed templates—greatly increasing your chances of duplicate text across notes.
Solutions like Eleos, on the other hand, generate unique notes based on the content of each individual session, curbing the incidence of cloned or “copy-paste” notes. This also helps ensure the note captures the full clinical richness of each session, including interventions used and themes discussed. Behavioral health providers know that no two client sessions are exactly alike—which means no two progress notes should be alike, either.
“For behavioral health, there is a large amount of narrative information being captured in the intervention style—it’s a characteristic of the intervention,” Ward said. “And Eleos built their solution around bringing that to life.”
Red Flag #6: Broadly Trained AI Models
To ensure not only the uniqueness—but also the clinical accuracy and relevance—of each note generated, the AI models must be trained on, and tailored to, behavioral health sessions.
General-use AI tools—ChatGPT and the like—are just that: general. The models that generate their output have not been trained on behavioral health data. So, unless you’d trust any ol’ Joe Schmoe with clinical documentation in your org, you probably don’t want to rely on a broad-use AI note-writer.
Specialization in Behavioral Health
“Being behavioral health-specific really does make a difference,” said Gadiwalla, adding that GRAND looked at several AI systems before ultimately choosing Eleos—and not all of them specialized in behavioral health. “There are plenty of solutions out there that can transcribe an in-person session or a telehealth session, and they might even summarize the session for you. But do they really fully understand the context? Can they identify the different clinical techniques that the clinician used during the session? The effectiveness of the AI model really depends on the quality of the training data.”
Clinically Relevant Output
Even broad-use healthcare AI solutions probably haven’t been trained on the nuances and complexities of behavioral health conversations, leaving them vulnerable to missing or misinterpreting key discussion points and therapy interventions. That means more manual work on the provider side, which adds friction to adoption and consistent use—and kind of defeats the purpose of implementing such a tool in the first place.
At Jefferson Center, Ward says highlighting the depth and quality of Eleos-generated notes played a major role in not only the original decision to implement the software, but also the adoption of the tool at the provider level. “When we asked folks to really reflect—especially to look at each other’s notes—it was hard-pressed for anyone to say anything negative about the quality of the documentation,” he said. “It represented what happened, it did what it needed to do, and it did it well. And so when we anchored in that…we started to see more movement and greater acceptance.”
Only a tool whose AI models were developed specifically for behavioral health professionals by behavioral health professionals is going to capture client sessions with a high degree of detail and contextual accuracy. Your AI vendor should be able to confidently explain how their models are trained, fine-tuned, and continually updated—as well as how this process incorporates real-world behavioral health data and human clinical experts.
Clinician-Centered Architecture
To that end, it’s also important for the technology to respect the expertise of the human end user: your providers. That’s precisely why the Eleos technology is rooted in Augmented Intelligence—a flavor of artificial intelligence that focuses on enhancing or augmenting, rather than replacing, human clinicians. An AI tool should never try to do a clinician’s job—it should simply make it easier for them to focus on the important work that only they can do.
Some pairings are doomed from the start, but your behavioral health organization and your AI vendor shouldn’t be one of them. To make sure it’s a match made in heaven, be on the lookout for the six red flags above—and don’t be afraid to swipe left until you’ve found an AI partner who checks all the right boxes.
Ready to see why leading behavioral health organizations across the country have fallen in love with Eleos? Request a personalized demo here.