What it takes to engage clinical workforces on AI
BOSTON – When asked about the evolution of artificial intelligence and when sentient AI would become integral to the fabric of healthcare, Dr. Patrick Thomas, director of digital innovation in pediatric surgery at the University of Nebraska Medical Center College of Medicine, said clinical training needs revamping.
"We need to prepare clinical students to go into the real world that they are moving into," he said at the HIMSS AI in Healthcare Forum on Thursday.
As the host of a panel on how to support the healthcare workforce to trust and adopt AI – which highlighted the importance of governance, keeping humans in the loop and shared responsibility of maintaining clinical AI quality with developers – Thomas asked his fellow panelists how they address clinicians' skepticism and doubt.
Joining Thomas for the discussion were Dr. Sonya Makhni, medical director of the Mayo Clinic Platform, Dr. Peter Bonis, Wolters Kluwer Health's chief medical officer, and Dr. Antoine Keller of Ochsner Lafayette General Hospital.
Broadening clinical resources
Using large language models to dissolve the intense cognitive burdens clinicians experience is still fraught with complications, from bias and data hallucinations to costs.
Bonis noted that it's likely application developers will end up paying the costs to develop systems on top of the cost of innovating the underlying foundational models.
Keller added that healthcare generates more information than the clinical workforce can assimilate.
"We don't have enough manpower to accommodate" accurate clinical decisions in a timely fashion, he said. While clinicians are focused on risk mitigation – building safeguards to address concerns about AI usage – giving them the comfort level to adopt the technology is essential.
Keller, a cardiac surgeon, described how Louisiana-based Ochsner Health is providing community health partners with an AI-enhanced screening tool called Heart Sense to drive interventions.
By making diagnoses in underserved communities with inexpensive technology, the AI tool "geometrically expands the workforce," he said.
Improving access in underserved communities
Leveraging the heart screening tool not only improves Ochsner's utilization, it also focuses attention on the patients who need it most.
Thomas asked what it means to have AI influence care when a healthcare organization does not have data scientists on staff, and how Ochsner's community health partners come to understand the AI tool and how it is used.
There is a lot of hesitancy in the communities they serve, but it can be eased with hand-holding, Keller said.
"You have to be present and be aware of the speed bumps or issues people have using the technology," he explained.
However, those who receive the technology in areas where there is a dearth of medical providers are grateful for it, he said.
A heart murmur is one of the main diagnostic criteria required before a patient can have an aortic valve replacement. Using the AI-driven screening tool, the health system found that 25% of those over age 60 in its communities have a pathogenic murmur – a condition that can be cured with surgical intervention.
"The prevalence data shows that there is a vast proportion of the population that is undiagnosed," said Keller.
Using AI to identify at-risk patients whose conditions can be cured is a significant advantage, allowing treatment before they develop irreversible dysfunction.
But acceptance depends on having a workforce that is educated, he said.
That education works best "visually – with something that is concrete" and easily understood, even by those with limited formal education, he said.
Stakes and shared responsibility
"The stakes are so high for clinical care," Bonis acknowledged, noting that always keeping a human being in clinical decisions is a guiding principle for developing trustworthy AI, whatever the nuances of the use case.
While frontline clinicians are not always interested in the "sausage making" that is AI, "from a vendor lens, caution is critical," he said.
For Makhni, the question is how to bridge expertise across the whole AI life cycle, she said.
The Mayo Clinic Platform works directly with AI developers and the Mayo Clinic to look at how to go about deploying clinical AI in a way that is also user-centered – "and then communicating that information transparently to empower the end user."
Such a multidisciplinary review can determine if an AI developer tried to assess for bias, for example, and that information can be passed on to clinicians in a way they understand.
"We meet [developers] where they are on their journey," but the goal is to represent the principles of safety, fairness and accuracy, she said.
Considering the digital divide and asking clinical workforces about their worries is pivotal to delivering safe AI systems – a burden that cannot fall on users alone.
"Sometimes it should also fall on the solution developer," she said. "We have to have a shared responsibility."
While healthcare is not going to solve all the challenges of AI quickly, "we can go in a metered approach," she added.
Andrea Fox is senior editor of Healthcare IT News.
Email: afox@himss.org
Healthcare IT News is a HIMSS Media publication.