6 components of health IT success (or failure)
Over the past 10 years, we have seen a steady stream of health technology and technology-dependent enterprises fail. Recent examples include Walmart and VillageMD. Haven disappeared despite heavyweight backing from Jeff Bezos, Warren Buffett, and Jamie Dimon. CVS continues to struggle with its retail health offering. Telemedicine companies such as Teladoc and Amwell have become interesting, sometimes very useful additional tools, but they have been far from transformational. Now we are looking at a surge of healthcare "AI" products. Everywhere, experts are talking about how AI will transform healthcare. I hear a lot of high-level predictions, but where are the details? Will the magic of AI and modern data science techniques be the path to universal success? How will these AI platforms accomplish this?
In my experience, four areas of focus need to be addressed for an individual health IT project to succeed. Many projects focus on one or two, but few address the full spectrum.
Usability. First, the system needs to be designed in a usable manner. The findings of KLAS' Arch Collaborative demonstrate that even very advanced systems can be deployed poorly, while comparatively basic systems may outperform expectations. Although the vendor can make a difference, the bigger factor is the organization's will to maximize a system's capabilities. Beyond the obvious point that a tool needs to be usable, the link between usability and the data lifecycle is frequently overlooked, and this contributes to the low quality of data in our systems.
For instance, I recall a sepsis project where the hospital required physicians and nurses to fill out a separate form, manually abstracting the key data points. In theory, this should have produced good data, albeit through labor-intensive workflows. In practice, it forced end users to follow a process that made little sense to them, and they entered data simply to get past the form and on with their job of caring for patients. The data-gathering process was inefficient and prone to errors. For those of you on the front lines of care, I suspect this sort of clunkiness sounds uncomfortably familiar. In contrast, processes should produce the necessary data as a natural byproduct of elegant and efficient workflows.
Data governance and curation. User workflows should produce clean, accurate, and discrete data as their exhaust. However, raw data generally lacks meaning. For example, an abnormal hemoglobin value means something on its own, but it means much more when combined with a previous hemoglobin, or with the fact that the patient is receiving chemotherapy or has renal failure.
Data governance comes into play in this curation process. How do we agree on what is abnormal and significant, and in what context? Is a hemoglobin of 10 significant in a renal failure patient? What if the value was 13 two months ago? Another example is a common controversy with which all hospitals struggle: what is the definition of "length of stay?" Do you start counting when the patient arrives at the ER? Do you begin counting once the admission order is entered? What about when the patient physically moves from the ER to an inpatient bed? Do you stop the clock when the discharge order is entered, or do you wait until the patient has actually left the hospital? As we take raw data and add meaning, we need consensus and governance so that clinically significant anemia, length of stay, or any other important concept means the same thing to different stakeholders.
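One practical payoff of governance is that an agreed-upon definition can be encoded exactly once, so every downstream report computes the same number. As a minimal, hypothetical sketch (the function name and the choice of "inpatient bed arrival to physical departure" are illustrative assumptions, not a recommended standard):

```python
from datetime import datetime

# Hypothetical illustration: once governance settles on a definition,
# encode it in one place so every report computes length of stay identically.
# Assumed consensus here: inpatient bed arrival -> physical departure.
def length_of_stay_hours(bed_arrival: datetime, physical_departure: datetime) -> float:
    """Length of stay in hours under the agreed-upon definition."""
    return (physical_departure - bed_arrival).total_seconds() / 3600

los = length_of_stay_hours(
    datetime(2024, 3, 1, 14, 30),  # patient moved from ER to inpatient bed
    datetime(2024, 3, 4, 10, 0),   # patient actually left the hospital
)
# 67.5 hours under this definition; anchoring on ER arrival or the
# discharge-order time instead would yield a different number.
```

The point is not the arithmetic but the single shared definition: two departments running this same function cannot disagree about what "length of stay" means.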
Data processing. Once we have data (ideally well-curated and clean), we must process it. This processing takes multiple data points and combines them into a logical framework to produce a new insight. The current hype around AI fits in this bucket. However, "processing" also includes more ordinary descriptive analytics and even simple pivot tables. It still astounds me how many organizations want to jump straight to neural networks and LLMs while they struggle with basic descriptive analytics. For example, think of the length-of-stay issue as essentially two raw data points: the start and stop timestamps. To make operational decisions based on this definition, you need to combine it with other data points (which also have to be well defined and governed): Did the patient arrive on a weekend or holiday? What comorbidities does the patient have? Combining several data sets multiplies the power of the data to produce new meaning and understanding. Whether you use descriptive analytics or advanced data science tools, the end game is finding opportunities to improve your operational processes and clinical outcomes.
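Even the humble end of "processing" can be illustrated concretely. The sketch below (hypothetical data and names, standard library only) combines the two raw length-of-stay timestamps with one governed contextual data point, weekend arrival, to produce a basic descriptive statistic:

```python
from datetime import datetime
from statistics import mean

# Hypothetical stays: (inpatient bed arrival, physical departure) pairs,
# all computed under one governed length-of-stay definition.
stays = [
    (datetime(2024, 3, 1, 8, 0), datetime(2024, 3, 4, 9, 0)),  # Friday arrival
    (datetime(2024, 3, 2, 8, 0), datetime(2024, 3, 6, 9, 0)),  # Saturday arrival
    (datetime(2024, 3, 4, 8, 0), datetime(2024, 3, 6, 9, 0)),  # Monday arrival
]

def los_hours(arrival: datetime, departure: datetime) -> float:
    return (departure - arrival).total_seconds() / 3600

# Simple descriptive analytics: average length of stay split by whether
# the patient arrived on a weekend (weekday() returns Sat=5, Sun=6).
weekend = [los_hours(a, d) for a, d in stays if a.weekday() >= 5]
weekday = [los_hours(a, d) for a, d in stays if a.weekday() < 5]

print(f"weekend mean LOS: {mean(weekend):.1f} h")  # weekend mean LOS: 97.0 h
print(f"weekday mean LOS: {mean(weekday):.1f} h")  # weekday mean LOS: 61.0 h
```

No neural network required: a pivot-table-level comparison like this is often enough to surface an operational question (why do weekend arrivals stay longer?) worth investigating.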
New insights. The processing step yields new insights and opportunities for new tools. These can range from a simple realization that a new order set is not being used correctly to large-scale changes, like realizing you would benefit from spinning up a new surgical program. Examples include many of the now familiar predictive models we use (like sepsis) and the emerging language models that promise to do things like summarize entire records and respond to the tidal wave of messaging overwhelming our clinicians. Once such insights become available, you need to consider the best way to incorporate them into operational and transactional workflows. If you do it right, you will head back to step 1 and create elegant workflows that make it painless for end users to act on these new insights.
These four steps ideally create a positive feedback loop: workflows facilitate users' work and generate data, the data yields new insights, and those insights feed back into better workflows, and so on. The four steps are necessary, but they are not sufficient to move the needle on our broken healthcare system. Beyond this idealized cycle, two other larger, overarching factors come into play:
Interoperability. For this idealized virtuous cycle to work, data and transactions must flow seamlessly from each step to the next. Well-governed data should be available to a machine learning platform without a massive ETL effort. Data generated by workflows should land in the right data structures with minimal effort. The insights gained should be easily transformed into practical workflows.
If you have a third-party population health tool that creates fantastic insights but taking action requires manually transcribing medical record IDs into the EMR, you have added friction that will likely kill your project. In our complex healthcare system, a sophisticated tool that works really well on its own but is siloed from other parts of this cycle is almost worthless. A tool that is "good enough" but highly integrated into the rest of the healthcare ecosystem is orders of magnitude more valuable. (BTW, I think this is the key to Epic's success.)
Incentives. Finally, even if you create a highly tuned process of workflows generating insights that feed back into workflows, the improved outcomes are only truly helpful if the process is nudged in the right direction by the overarching incentives. For instance, I think telemedicine's biggest potential is creating low-friction, high-frequency touch points for those patients who would benefit most. Getting easy access to Bactrim for a urinary infection or a Z-Pak for a cold (yes, I know that isn't medically appropriate, but don't get me started...) isn't going to transform healthcare. However, having a diabetic access a wound care nurse daily until she is able to expertly apply her own specialty dressing may prevent an amputation. Reviewing a heart failure patient's complex medication changes shortly after discharge will likely prevent a five- or six-figure rehospitalization. However, those benefits are not as obvious to decision-makers. The unfortunate truth is that the first question people usually ask is, "How does the digital solution help us make more money via fee-for-service revenue?" Cost reduction or quality improvement often takes a back seat to volume.
Fixing the problem of bad information and knowledge delivery is the most important thing we can do to address our systemic healthcare issues; I am convinced it is the best way to save our healthcare system and serve our patients. This will require focus on all four steps of the virtuous cycle: usability, governance, processing, and insights, along with seamless interoperability and the right incentives to drive our efforts in the right direction. If you look at any of these technology failures or examples of underperformance, you will find that the efforts failed to address at least one of these six elements, and usually more than one.
Consider these elements if you are a technology vendor or leading a health IT project. If you are trying to change and improve healthcare, you need to make sure all six are aligned. You will initially move slower, but there is a Navy SEAL saying that I think applies here: "Slow is smooth. Smooth is fast." It may seem frustratingly slow at first, but you will eventually move faster, and you will be more successful in the process.