Artificial intelligence (AI) is getting a lot of attention these days, and it probably should, considering how transformative the technology will likely be for large swaths (specifically: all) of society. For many people, the phrase probably brings to mind something like a sentient computer that won’t let you do that, Dave, but that’s not really what the technology is about.
Artificial intelligence (used here as a catchall term) is more about equations and computer functions that improve the more they run, and is already used in everything from providing autocomplete for text messages, to making movie recommendations, to powering self-driving cars. (Side note: there is a ton of excellent content on YouTube that is accessible enough for anyone interested in taking a deeper dive into the technology.) AI has been making inroads in healthcare too, and we’ll see it continue to creep into the wound center. Unless you’re maybe pitching to a VC firm, it’s difficult to overstate how much of an impact AI could have, but what follows are some potential near-future applications.
Image Recognition and Analysis
Image recognition and analysis is one of the cornerstone use cases for AI, and it’s already really good. Current AI analysis of radiology scans has roughly a 90% accuracy rate and reviews images in a fraction of the time it takes a human. Those results will continue to get better and faster as models improve and computational power increases.
The potential for wound care is tremendous. Digital planimetry (i.e., taking wound measurements from images) already works and is relatively well established, but that’s just the beginning. As image analysis gets better and regulatory issues get worked out, we could get wound depth, type, location, and status just by taking a photo. Smart documentation systems could identify the patient through a smartphone camera viewfinder and attach the image to their record automatically. The gains for workflow, to say nothing of the doctor’s time and sanity, would be enormous.
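To make the planimetry idea concrete, here is a minimal sketch of how a measurement might fall out of an image pipeline once a wound has been segmented. The function name, the binary-mask input, and the calibration value are all illustrative assumptions, not any particular product’s method: real systems derive the segmentation from a trained model and the scale from a calibration marker in the frame.

```python
import numpy as np

def wound_area_cm2(mask: np.ndarray, px_per_cm: float) -> float:
    """Estimate wound surface area from a binary segmentation mask.

    mask: 2-D array where nonzero pixels mark the wound region
    px_per_cm: image scale (e.g., derived from a calibration sticker)
    """
    wound_pixels = np.count_nonzero(mask)
    # Each pixel covers (1 / px_per_cm)^2 square centimeters.
    return wound_pixels / (px_per_cm ** 2)

# Toy example: a 20 x 30 pixel wound region at 10 px/cm
mask = np.zeros((100, 100), dtype=np.uint8)
mask[40:60, 10:40] = 1
print(wound_area_cm2(mask, px_per_cm=10.0))  # 6.0
```

The hard part, of course, is producing a reliable mask and scale in the first place; once those exist, the measurement itself is simple arithmetic.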
Natural Language Processing
In plain English, natural language processing (NLP) is the ability for computers to process and understand speech. For many of us, the most common use thus far has been something like “okay Google, when is the Super Bowl this year?” or “hey Siri, remind me to pay the water bill tomorrow.”
What about in the wound center? NLP will be a game changer for analytics and interaction with the EHR. Clinicians and physicians will be able to pull up a chart by asking for a patient by name, then add a note by dictating into the system, just as they would to an assistant. Despite EHRs being mainstream technology for well over a decade, satisfaction with them remains low, with usability consistently a top complaint. NLP has the promise of winning over reluctant (maybe even hostile) users.
The potential for analytics is equally exciting. Imagine no longer relying on a set of packaged reports, and instead, just asking your EHR for the information you want. “How many new patients did we have last month?” “What’s the average time to heal for lower extremity pressure ulcers?” “How much did we spend on alginate dressings this year?” That’s the promise of NLP, and looking at some of the market leaders, we’re not too far off.
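Under the hood, this kind of conversational analytics amounts to translating a question into a structured query against the EHR database. The sketch below is a deliberately crude stand-in, matching keywords against canned SQL, where a real system would use a trained language model; the pattern strings, function name, and table names are all hypothetical.

```python
import re

# Toy intent table: question pattern -> hypothetical SQL.
# A production NLP system would parse intent and entities with a model,
# not keyword matching.
INTENTS = {
    r"how many new patients.*last month":
        "SELECT COUNT(*) FROM patients WHERE first_visit >= :last_month",
    r"average time to heal":
        "SELECT AVG(days_to_heal) FROM wounds WHERE type = :wound_type",
}

def question_to_query(question: str):
    """Map a free-text question to a structured query, or None if unrecognized."""
    q = question.lower()
    for pattern, sql in INTENTS.items():
        if re.search(pattern, q):
            return sql
    return None

print(question_to_query("How many new patients did we have last month?"))
```

The point of the sketch is the shape of the problem: the value of NLP here is not the database query, which is routine, but reliably getting from a clinician’s phrasing to the right query.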
Non-Structured Data Analysis
EHRs are loaded with non-structured, non-discrete data that is, for all intents and purposes, currently impossible to make sense of: the free text of physician notes and orders, or PDFs of scanned consents and lab results stored as media assets. As AI advances, it will be able to determine the semantic meaning in those files and fields and make it queryable, analyzable, reportable, and actionable.
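As a small illustration of turning free text into discrete data, the sketch below pulls wound dimensions out of a note with a regular expression. This is a crude stand-in for the model-based semantic extraction described above; the note text, function name, and output fields are all made up for the example.

```python
import re

def extract_dimensions(note: str):
    """Pull 'length x width cm' out of free-text documentation.

    A regex is a toy substitute for real semantic extraction, which
    would need to handle varied phrasing, abbreviations, and context.
    """
    m = re.search(r"(\d+(?:\.\d+)?)\s*x\s*(\d+(?:\.\d+)?)\s*cm", note, re.I)
    if not m:
        return None
    return {"length_cm": float(m.group(1)), "width_cm": float(m.group(2))}

note = "Sacral pressure ulcer, stage 3, measuring 4.2 x 3.1 cm, moderate exudate."
print(extract_dimensions(note))  # {'length_cm': 4.2, 'width_cm': 3.1}
```

Once values like these live in discrete fields rather than prose, they become available to the same queries, reports, and analytics as any other structured data.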
This ability not only fulfills one of the big promises of electronic health records, it also addresses a major shortfall in the interoperability space. Right now, when systems share data, they’re often doing the electronic equivalent of trading stacks of paper.
Now, for some of the applications that may generate a little less enthusiasm. The ability to find meaning and insights from EHRs, patient charts, and other data sources won’t be limited to hospital executives, wound center staff, and providers. Auditors, payers, and regulators will use the same technologies to find missing, incomplete, or potentially fraudulent documentation and use that information to deny claims, recover payments, or even identify likely targets for legal action.
Evidence suggests that patients benefit from standardized treatment guidelines and protocols (ironically, the clinical guideline industry is now so large that navigating it is a challenge unto itself). Advanced AI, like IBM’s Watson (most famous for its Jeopardy! wins), has already shown utility in healthcare, making diagnostic suggestions based on analysis of complicated data from a wide range of sources, including images. Connect those dots directly and you get a Watson-like program that takes a wound photo, combines it with the patient record from the EHR, and spits out a treatment protocol and healing projections.
There would be challenges to overcome, of course. The reception from clinicians and physicians remains to be seen, and the legal implications of an “incorrect” recommendation would be fascinating to watch play out. But the real promise may be outside the hospital setting, in skilled nursing facilities, physician offices, and home health, where providing quality wound care to patients has historically been difficult.
Regardless of the application, the biggest challenge to overcome will be the human component. How do we integrate and learn to trust AI with decisions affecting the treatment and wellbeing of patients? As machine learning advances and improves, how heavily do we weigh the results from an algorithm over those of an expert, especially when they disagree?
Those may seem like far-away problems to worry about, but only 10 years ago, self-driving cars were widely considered impossible for the foreseeable future. Technology moves quickly, and the applications are promising and exciting.