Across healthcare systems, the excitement around generative AI is palpable—from diagnostics that interpret imaging with near-human precision to chatbots that streamline patient communication. Yet the same momentum fueling this technological revolution also exposes the cracks beneath it: siloed systems, governance challenges and talent shortfalls that hinder adoption at scale.
According to a recent survey, more than 80% of healthcare executives see AI as vital to their strategy, but a mere 13% feel they have a clear strategy for how to use it. Members of the Senior Executive Healthcare Think Tank—a curated community of leaders across patient experience, policy and health innovation—agree that the promise of AI will remain unrealized until foundational gaps in governance, data infrastructure and workforce readiness are addressed.
Below, they dive into these gaps and what healthcare leaders must do to close them before AI can have any meaningful impact on current healthcare systems—because, as their insights show, AI can’t scale without trust, integration and people who understand both medicine and machines.
“Integrated AI isn’t just a tool for efficiency—it’s the foundation for a smarter, more affordable and more patient-centered healthcare system.”
Building Integration Into AI Governance
For Eugene Zabolotsky, CEO of Health Helper, the most pressing issue isn’t AI’s capability—it’s connectivity. “For AI to deliver true impact in healthcare, the critical gap is integration,” Zabolotsky says. “Today, models are being built for diagnostics, operations and patient engagement in isolation. Without a connected framework, these tools risk adding complexity instead of value.”
Health Helper, known for its nature-inspired medical technologies, exemplifies how innovation can only thrive when systems work together. Zabolotsky argues that the same principle applies to AI: “The opportunity lies in unifying AI models across systems—EHRs, payers, providers and patient-facing platforms—so insights flow seamlessly and support better decisions at every level of care.”
Research supports his call for interoperability. A 2024 McKinsey report found that healthcare leaders cite legacy systems and poor data quality as top barriers to AI implementation. Without shared governance and standardized data exchange, models cannot scale beyond narrow use cases.
Zabolotsky envisions a future where governance enforces interoperability and transparency: “Integrated AI isn’t just a tool for efficiency—it’s the foundation for a smarter, more affordable and more patient-centered healthcare system.”
“Safeguarding patient information is crucial while implementing AI solutions.”
Empowering Private Practices Through Secure Transformation
While enterprise-level health systems dominate AI headlines, Feri Naseh, Founder and CEO of MeTime Healing LLC, believes smaller practices will determine how deeply AI embeds into care delivery. “AI is here to stay! It might take some time before private practices implement AI solutions, but it is inevitable for the future care of patients,” Naseh says.
Naseh emphasizes that security and trust are essential for smaller organizations venturing into AI. “Safeguarding patient information is crucial while implementing AI solutions,” she explains. “Introducing AI into healthcare can feel like a big shift for physicians and staff who are trained to be HIPAA compliant and safeguard patient data.”
Data breaches are increasing every year, making cybersecurity governance a top organizational priority. For private practices—where resources are limited—Naseh says the key is transparency and collaboration: “By approaching the process with openness, clear communication and early engagement, private practices can ease concerns and build alignment.”
She adds that digital transformation succeeds when staff feel involved: “This collaborative approach helps teams feel supported while moving together toward a future of digital transformation. AI is the future!”
“We are using AI not to replace staff but rather to handle the undifferentiated heavy lifting so clinicians can focus on being clinicians.”
Redefining Talent and Culture for Generative AI
At Electronic Caregiver, Inc. (ECG), Chief Product Officer Mark Francis has already seen the impact of generative AI when paired with clear organizational purpose. “AI is delivering meaningful, scalable impact today,” Francis says.
Francis explains how ECG achieved this transformation internally: “We adopted generative AI within our care management team for ambient scribes, intelligent summarization, sentiment analysis and virtual QA agents. The results were transformational: 2x billable work by our clinical teams, and 30x speed to complete QA audits, with 100% of encounters audited versus 5% when completed manually.”
The secret wasn’t technology alone—it was culture. “The key to such outcomes was clear corporate direction. We are using AI not to replace staff but rather to handle the undifferentiated heavy lifting so clinicians can focus on being clinicians,” Francis says. “Addressing these issues head-on, proactively and with transparency aligns stakeholders across the organization.”
When employees are trained as “AI collaborators” rather than displaced workers, AI transformation succeeds. “Rolling out AI can be disruptive and threatening to staff,” Francis continues. “With this mandate, staff embraced AI and were active in defining, developing, training and deploying various generative AI agents across the enterprise.”
Next Steps Toward Meaningful Impact
- Prioritize interoperability across systems. Meaningful AI depends on unified data flow between providers, payers and patient-facing platforms.
- Build trust through transparent governance. AI adoption requires patient data security, clear communication and early engagement to build internal and external trust.
- Position AI as a workforce enabler, not a replacement. Clinicians embrace AI when they see it amplifies their value, not diminishes it.
The Path Forward for Responsible AI in Healthcare
Healthcare’s generative AI revolution is not just about smarter algorithms—it’s about smarter leadership. The members of the Senior Executive Healthcare Think Tank agree that the next stage of progress will come from governance frameworks that promote interoperability, secure infrastructure that earns patient trust, and workforce strategies that empower humans to work alongside machines.
If those foundational pieces fall into place, AI won’t just make healthcare faster—it will make it more human, more accessible and more sustainable for the future.
