AI for Behavioral Health: 5 Essential Insights for Providers

Artificial intelligence is transforming behavioral health care at a pace that’s both exciting and daunting. In just one year, the number of National Council webinar attendees who reported using AI in their organization’s daily activities nearly doubled — jumping from 15% in 2024 to 29% in 2025. To help the field keep pace, the National Council’s five-part webinar series AI for Behavioral Health explored the most pressing AI considerations facing the field — from policy frameworks to real-world implementation. Here’s what you need to know to navigate the changes that come with AI responsibly and effectively.

1. From Policy to Practice: Set Guardrails Without Stifling Innovation

Zach Boyd from Utah’s Office of AI Policy put it this way: “AI is like a boulder rolling down a hill. Right now, we can still nudge it in the right direction.”

Utah has emerged as a pioneer in this space, becoming the first state with an Office of AI Policy focused specifically on mental health. The office's approach balances two critical needs:

  1. Protecting consumers through HIPAA-like safeguards for chatbots and limits on targeted advertising.
  2. Enabling innovation through safe harbor provisions for responsible developers.

Take action: The regulatory landscape for AI in mental health care is evolving rapidly — don’t wait for perfect clarity to get started. Start familiarizing yourself with emerging frameworks now, and remember that guidance letters increasingly encourage professional judgment over rigid rules.

Watch: From Policy to Practice: Behavioral Health Therapists Using AI and Emerging Technologies.

2. Securing the Future: Move Beyond HIPAA Basics

When it comes to AI and data security, HIPAA compliance is just the starting point. As Christine Sublett from MettaHealth Partners emphasized, “HIPAA is the floor, not the ceiling. The current threat environment demands more.”

There are also risks with implicitly trusting the data underpinning AI tools. Skewed results from AI systems are inevitable, but also measurable and improvable. The critical question isn’t whether inaccuracies exist — it’s how vendors are addressing them. Look for transparency about training data sources, auditing practices and additional certifications like SOC 2, ISO 27001, ISO 42001 or HITRUST.

Questions to Ask Vendors:

  • What data do you use to train your AI, and how do you mitigate skewed results?
  • What external certifications do you hold beyond HIPAA compliance?
  • What are your breach notification and reporting procedures?
  • Can you explain your data practices in plain language?

Take action: Review the American Psychological Association’s Digital Badge Program, which offers a comprehensive framework for evaluating AI tools across five dimensions: intent, evidence, safety, data protection and usability. Also, check your state’s emerging AI legislation. States like Illinois are already implementing specific rules around AI in mental health care.

Ethical obligation: Always inform clients when AI is being used in their care. Transparency builds trust and aligns with your professional values.

Watch: Securing the Future: What Behavioral Health Providers Need to Know About AI and Data Security

3. AI in Action: Learn from Early Adopters

Community providers who’ve already adopted AI tools share a common entry point: documentation relief. Whether you’re addressing administrative burdens or tackling a different pain point in your organization, the key is to try something. As Nikki Stenitis, chief clinical officer at New Vista, put it: “Making no decision is a decision. Waiting means falling behind.”

Jim Martin, chief of clinical innovation at Community Services Group, added, “Old ways aren’t perfect either. Change feels scary because we’re used to them.”

The return on AI investment extends far beyond productivity metrics. Early adopters report improvements in staff retention, clinician wellbeing and client engagement — outcomes that matter just as much as efficiency gains.

Implementation best practices:

  • Start with a pilot program and revise based on feedback.
  • Frame AI adoption around quality of care and staff experience, not just revenue.
  • Overcommunicate throughout the process and identify internal champions.
  • Establish governance structures with an approved tools list and clear policies.
  • Provide robust training to prevent risky “shadow AI” use.

Take action: Begin internal conversations about the burdens AI could alleviate in your organization. Documentation is often an obvious choice, but consider future applications like compliance automation, intake optimization and workforce training simulations.

Watch: AI in Action: Community Behavioral Health Providers Share Lessons Learned

4. Co-creating Technology Solutions: Build With Communities, Not For Them

Authentic co-creation isn’t about retrofitting a pre-built solution with token feedback from stakeholders. It starts with defining the problem together.

Framework for successful co-creation:

  • Problem: What are you trying to solve for?
  • Intervention: What type of tool solves the problem?
  • Mechanism: How should it be delivered (telehealth, embedded in social media, school-based)?
  • Experience: Does it resonate with the people who will actually use it?

Sarah Lampe, president and CEO of Prime Health, captured the challenge perfectly: “Providers and innovators are solving different problems. We need spaces where those worlds collide.”

Impediments to overcome:

  • Speed-to-market pressures that sideline authentic engagement
  • Complex funding models and state-by-state regulations
  • Lack of incentives for providers to participate in co-design
  • Economic constraints on relationship-building timelines

Take action: When evaluating new tools, ask developers about their co-creation process: Who was involved? At what stage? What specific insights shaped the final product? If a vendor can’t answer these questions clearly, it’s a red flag.

For organizations: Create internal frameworks for collaboration with technology developers, and incentivize your staff to participate in co-design opportunities through protected time or other structural supports.

Watch: Co-creating Technology Solutions: The Power of Provider-Client-Developer Collaboration

5. The Future Is Integration, Not Isolation

Tanzeem Choudhury, PhD, of Cornell Tech emphasized, “The future isn’t one technology — it’s stitching them together into a better care continuum.” The behavioral health field is moving toward precision care, using digital biomarkers, wearables and AI to track behavioral indicators like sleep patterns, social isolation and energy levels.

Emerging Applications:

  • Personalized care pathways: AI-powered stepped-care models that determine the right treatment, at the right time, at the right intensity
  • Provider skill-building: AI tools that monitor therapy sessions and provide feedback, accelerating training and improving fidelity to evidence-based practices
  • Conversational AI: Chatbots poised to become ubiquitous, making usage guidelines and escalation protocols essential

The regulatory challenge: As Stephen Schueller, PhD, of the University of California, Irvine, noted, “We know what bad looks like. We don’t yet know what good looks like.” The field needs evolving frameworks, certifications and community-driven evaluation.

Take action: Prepare your organization for an identity shift in behavioral health care delivery. Digital and human care will coexist, and seamless integration will be the competitive advantage. Advocate for government and industry investments that incentivize interoperability and safety.

Watch: Where Is All This AI Going for Behavioral Health Providers? A Look Into the (Not-so-distant) Future

Your Next Steps

AI presents a unique opportunity for the behavioral health system to expand access to care, improve patient outcomes and reduce the administrative burden on providers. As a field, we must commit to implementing AI safely, ethically and transparently.

Take These Steps Today:

  1. Review your current data security practices and vendor agreements.
  2. Explore opportunities to participate in co-designing technology solutions.
  3. Join advocacy efforts for policy reform in your state.
  4. Establish internal governance structures for AI tool approval and training.

As Schueller concluded, “Don’t let ‘perfect’ be the enemy of ‘pretty good’ — AI can help us move the field forward.”

The boulder is rolling. Let’s nudge it in the right direction.


Interested in diving deeper? Watch the on-demand recordings of the full five-part webinar series to hear directly from policy leaders, security experts, community providers and researchers shaping the future of AI in behavioral health.