‘Facts Over Fear:’ How Providers Can Reassure Patients With AI

AI Basics


Hospitals and health systems can reduce patient anxiety about AI by discussing the due diligence and care their organizations have taken in selecting AI tools and in evaluating the role those tools play in patient care, hospital and health system CIOs told Becker's.

A Pew Research Center survey released Feb. 22 found that 60% of Americans would be uncomfortable with their doctors using AI to diagnose diseases and recommend treatments.

Brad Reimer, CIO of Sioux Falls, S.D.-based Sanford Health, suggested that one way to combat this discomfort is to educate hospital and health system caregivers about artificial intelligence.

“At Sanford Health, we have been working closely with one of our academic partners to develop the Demystifying AI in Healthcare educational series for caregivers,” Reimer told Becker's. “Ultimately, our goal is to provide continuing education credits or certifications to caregivers who complete this program, which we consider fundamental from both a provider and a patient perspective.”

Reimer said caregivers need to understand the basics of AI. This includes how the model is trained, how to determine if the model is relevant to the patient, and whether the AI’s recommendations are contextual or binary.

According to Reimer, this knowledge not only empowers healthcare providers to make clinical decisions, but also allows them to have trusted conversations with patients about the benefits and safety of AI.

“Broader public awareness campaigns focused on ‘facts over fears’ will also help foster and strengthen patient trust in AI,” Reimer said.

Stick to FDA guidelines

Simon Linwood, MD, CIO of Riverside, Calif.-based UCR Health, said another way to make patients more comfortable with AI in clinical settings is to follow the FDA's guidelines for the development and use of AI.

On Sept. 28, the FDA released new guidance on AI-driven clinical decision support tools. The guidance offers recommendations for keeping these tools transparent, explainable, validated and overseen in ways that healthcare providers and patients can easily understand.

“By following FDA guidelines, healthcare providers can build trust with patients and improve patient satisfaction with the use of AI in medicine,” said Dr. Linwood.

Avoid jargon

“When explaining the implications of AI in healthcare to patients, it is important to use simple language, avoid jargon, and explain AI in a clear, concise, patient-centered manner,” said Zafar Chaudry, MD, chief digital officer and CIO of Seattle Children's.

If healthcare organizations start with the basics, such as explaining what AI is, how it works, and how it can improve diagnosis, treatment, and care delivery, patients can become familiar with the concept, Dr. Chaudry said.

But acknowledging patients' concerns must also be a priority.

“Acknowledge those concerns and explain that AI is intended to complement, not replace, the patient and provider experience,” Dr. Chaudry said. “Also discuss the measures in place to protect patient data.”

Don’t make the patient wonder

Patients who are reasonably educated about AI are right to be skeptical, so health systems need to do the right thing and educate patients on exactly how this rapidly evolving technology is being used at their facilities, Randy Davis, vice president and CIO of Sterling, Ill.-based CGH Medical Center, told Becker's.

“We shouldn’t leave patients wondering,” Davis said. “Health systems need to create the narrative, or one will be created for them.”


