It’s all anyone in healthcare wants to talk about — AI. At least, you’d think so from an AI evaluation of all the presentation titles at the recent HIMSS 2024 conference.
In one of the many AI presentations, a few executives from several prominent health systems, including HonorHealth, Banner|Aetna, and Holston Medical Group, gave their take on AI use cases for healthcare, both big and small. I’ve summarized a couple of the more interesting ideas and considerations for you. I also added a few AI hacks you could start using today, although admittedly those are more administrative in nature and not specific to healthcare.
There was a lot of talk about using AI to help relieve clinicians’ documentation burden. Here are some small ideas that could have a big impact:
Here’s an example of what an AI-generated patient summary might look like:

Summary of our conversation:
- Your blood sugar level on this visit: XX. Your target is: XX.
- Your blood pressure on this visit: XX. Your target is: XX.
- Your next steps to take before our next appointment:
  - Check your blood pressure at home every morning and night and write down your numbers to bring to me at our next visit.
  - Bring your home blood pressure machine with you to our next visit.
  - Make an appointment with this CARDIOLOGY OFFICE (OFFICE PHONE NUMBER).
  - Do your labs fasting one week before our next visit.
- Your next visit with me is: DATE
- ATTACH DIET PLAN
- ATTACH RESOURCES FOR EDUCATION
Some practices are already offering something similar, but the key here is that the AI software derives the notes from what was said during the visit. It summarizes the conversation, and the write-up can be handed to the patient as they leave. Virtually no added administrative burden.
Now, how about the documentation burden on the clinician’s side? What if, at the end of a visit, the AI software prompted the clinician with a few questions derived from the conversation? Would you like a lab ordered for this patient? The patient mentioned they had a flu shot in December; would you like this added to the record?
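To make the idea concrete, here is a minimal sketch of that workflow, assuming the visit transcript already exists as plain text. The OpenAI Python SDK and the "gpt-4o-mini" model name are only stand-ins for whatever ambient-documentation or LLM vendor your organization has actually approved; a public endpoint is not where real visit transcripts containing PHI should go.

```python
# Minimal sketch: turn a visit transcript into a patient-facing summary and a
# short list of suggested follow-up prompts for the clinician.
# Assumption: the transcript is already available as text, and the endpoint you
# call is one your organization has approved for this data (the public OpenAI
# API here is a placeholder only -- never send real PHI to a public service).
from openai import OpenAI

client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable


def summarize_visit(transcript: str) -> dict:
    """Return a patient summary and clinician prompts derived from the visit."""
    patient_summary = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": (
                "Summarize this clinic visit for the patient in plain language. "
                "Include today's readings, targets, next steps, and the next visit date."
            )},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    clinician_prompts = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": (
                "List follow-up questions for the clinician based on this visit, "
                "such as orders to place or history items to add to the record."
            )},
            {"role": "user", "content": transcript},
        ],
    ).choices[0].message.content

    return {"patient_summary": patient_summary, "clinician_prompts": clinician_prompts}
```

In practice, the transcription itself would come from the ambient-documentation tool; this sketch only covers the summarize-and-prompt piece that sits on top of it.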
From a Quality angle: what if we could ask a bot where a code lives in our EHR? Where do we document X in our EHR? What codes are available for this specific value set? Or even ask it to review a measure specification and tell us what the patient population is, or what measures we are required to report in the IQR program this year?
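One way to picture it: the bot is mostly a chat front end sitting on top of a maintained mapping of where things live. Here’s a toy illustration; the fields and locations below are invented for the example.

```python
# Toy sketch of a "where do we document X?" helper: a keyword lookup over a
# locally maintained mapping that a chat interface could sit on top of.
# The documentation locations below are made up for illustration.
DOCUMENTATION_MAP = {
    "smoking status": "Social History > Tobacco Use section",
    "advance directive": "Documents > Advance Care Planning",
    "flu shot": "Immunizations module",
}


def where_do_we_document(term: str) -> str:
    """Return any documented locations whose key contains the search term."""
    matches = {k: v for k, v in DOCUMENTATION_MAP.items() if term.lower() in k}
    if not matches:
        return "No match found -- ask your EHR analyst."
    return "\n".join(f"{k}: {v}" for k, v in matches.items())


print(where_do_we_document("smoking"))
```

An LLM front end would simply translate a free-text question into this kind of lookup, or into a search over your measure specifications.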
“If we just use AI to say, well, you used to be able to see 20 patients a day, now you can see 35, then we've missed the point completely,” Dr. Scott Fowler, CEO of Holston Medical Group, said.
AI can provide significant value in healthcare, but the focus should be on enhancing patient care rather than increasing throughput. Dr. Robert Groves, CMO at Banner|Aetna, emphasized the importance of human relationships in healthcare, asking, “How can we use AI as a tool to serve human relationships?” Dr. Jim Whitfill, Digital Healthcare Strategy and Operations at HonorHealth, echoed this sentiment, stating that while AI can help streamline certain processes, it should not replace the human element in patient care.
The panel agreed that Clinical Decision Support (CDS) was at the forefront of healthcare’s most ambitious AI dreams.
The panel also talked about some remarkable results coming out of imaging right now. They referenced recent research showing that AI software outperformed clinicians at correctly interpreting diagnostic images, and that, unfortunately (scarily), the AI performed better without clinician review than with it.
Boy, this all sounds amazing! But we don’t quite trust it yet. In fact, the panel chair, Paul Battle, an Executive Director at Lenovo, referenced a recent survey showing that trust in AI is way down: roughly half of the U.S. population trusts AI in healthcare and half does not.
Nathan Bay, head of mergers and acquisitions at Citi Financial, pointed out that trust in AI also hinges on how organizations handle and use the data. “Governance is important because what you put in is going to impact what comes out. And how you think about the kind of the governance regime that sits over the inputs is important.”
Here’s my favorite quote from the session. Nathan said, “I think that in many new technologies there's an overestimation of the impact in the short term and an underestimation of the impact in the long term.”
We are currently in an AI hype cycle. AI will not solve all your problems today; it’s not really ready for you…yet. And its far-reaching consequences are not fully understood. In the same way, none of us could have anticipated how the internet would revolutionize everything when it went mainstream. Looking at the incredible (and frightening) things the internet has given us in a couple of decades, we should let that experience guide how carefully we govern AI adoption in our healthcare system.
Well, if you’re one of the lucky few whose organization has invested in AI resources — either through internal means or with an external AI vendor — you probably have a few ideas. If you don’t have designated AI resources available to you, you can still play with a few things.
Do not insert anything that is PHI. This should go without saying, but be careful: if your meeting recording has patients mentioned during the call, that is PHI. Remember that ChatGPT is a publicly available AI tool, and there are a ton of issues with putting your data into a public model like this. Copyright issues are surfacing as well, so make sure you aren’t using it to generate content that wasn’t derived from your own material to begin with. And always ask it to share its sources when it gives you a definitive answer, so you can validate what it said.
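If you want a speed bump before anyone on your team pastes notes into a public tool, even a crude script can flag text that looks like it contains identifiers. To be clear, this is a toy sketch with made-up patterns, not a HIPAA de-identification method; a human still has to review everything before it leaves your environment.

```python
# Toy pre-check: flag strings that look like identifiers (phone numbers, dates,
# MRN-style numbers) before text is pasted into a public AI tool.
# Pattern matching like this is NOT HIPAA-compliant de-identification --
# it is only a reminder to stop and review.
import re

PATTERNS = {
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "mrn_like": re.compile(r"\b\d{6,10}\b"),
}


def flag_possible_identifiers(text: str) -> dict:
    """Return anything that matches an identifier-looking pattern for human review."""
    return {label: pat.findall(text) for label, pat in PATTERNS.items() if pat.findall(text)}


notes = "Call the cardiology office at 555-123-4567 to confirm the 04/15/2024 slot."
hits = flag_possible_identifiers(notes)
if hits:
    print("Possible identifiers found; review before sharing:", hits)
```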
In conclusion, while there is a lot of talk about AI in healthcare, implementation and impact are still in the early stages. The potential for AI to relieve administrative burden, enhance decision support, optimize resource allocation, and improve retro analytics is promising. However, trust in AI and the proper governance of data remain significant concerns. It is important to approach AI adoption in healthcare with caution, learning from the lessons of the internet revolution. While we may not have all the answers today, the long-term impact of AI in healthcare cannot be underestimated. As we continue to explore and develop AI capabilities, it is crucial to prioritize patient care and maintain the human element in healthcare interactions.
(^^I asked an AI bot to write me a concluding paragraph for this article I wrote. What do you think?)
Medisolv Can Help

This is a big year for Quality, and Medisolv can help you along the way. Along with award-winning software, you get a Clinical Quality Advisor who helps you with all of your technical and clinical needs. We consistently hear from our clients that the biggest differentiator between Medisolv and other vendors is the level of one-on-one support. If you currently rely on your EHR vendor for quality reporting, you’ll notice a huge difference.