Let's face it: we don't yet have meaningful, widespread data system interoperability in healthcare. But rather than place blame on providers, vendors or the government, let's take a closer look at the problem.
The mission of interoperability is to make data from electronic health records (EHRs), doctor notes and other relevant patient data sources available in systems and contexts that are different from the original source. The expectation is that interoperability will enable efficient care via information sharing and a reduced need for duplicate tests. This will ultimately improve health outcomes and drive down costs.
However, efforts to achieve interoperability have not yet found widespread success. Interoperability is not in and of itself a hard technical problem to solve. Other industries, notably financial and logistics, have interconnected systems that work. So why have we struggled so much with achieving interoperability in healthcare?
Avoiding the “Field of Dreams” Scenario
First, let's dispel the notion that interoperability is currently nonexistent. There is some interoperability, but it's narrowly confined to managing adjudicated claims streams. Other initiatives, like Health Information Exchange (HIE), portals and the migration of patient data across systems, have been far less fruitful when measured by actual benefit and adoption. In these cases, interoperability has been pursued more as a theoretical good than as a solution to a specific use case with far-reaching demand.
This is a classic "Field of Dreams" scenario, where we're trying to build a product that lacks a defined user or widespread demand and doesn't clearly fit into existing workflows. Defining a set of standards and hoping for widespread implementation is another form of this thinking. Material demand for intersystem interoperability needs to drive standards and technologies. We have to expand our thinking beyond "one-size-fits-all" to solve these challenges.
There are noteworthy interoperability successes in healthcare. For example, OCHIN, an Oregon-based nonprofit, recently launched a real-time data aggregation system that serves more than 170 organizations and boasts interoperability between different implementations of the same EHR. Another good example is how users of Epic EHR systems can request patient charts from other institutions that use Epic. Note the use cases: the first is a locally brewed solution created by like-minded providers, and the second is available only to users within a single vendor's ecosystem.
How Do We Improve Interoperability?
The best way to achieve interoperability, and to realize its benefits, is to approach the problem one use case at a time and pursue more targeted solutions.
Do we want practitioners of a certain specialty like oncology to share in a common pool of knowledge countrywide? Do we need clinical data shared with researchers? Do we need to pull data for analytics or reporting? Do we want to meaningfully migrate a chart between two EHR systems? These are the use cases that will drive the technology and solutions that are ultimately built and adopted.
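To see why a targeted use case is more tractable than a universal standard, consider the last example: migrating a chart between two EHR systems. At its core, this is a concrete data-mapping problem. The sketch below uses invented, hypothetical schemas for a "System A" export and a "System B" import (real EHR exports, such as HL7 FHIR resources, are far richer), but it illustrates the kind of narrow, well-defined translation work a specific use case demands.

```python
def migrate_chart(source_record: dict) -> dict:
    """Map a record from a hypothetical 'System A' export format
    to a hypothetical 'System B' import format."""
    # System A stores the name as "Family, Given"; System B wants parts.
    family, _, given = source_record["patient_name"].partition(", ")
    return {
        "name": {"family": family, "given": given},
        # System A uses MM/DD/YYYY; System B expects ISO 8601 (YYYY-MM-DD).
        "birth_date": "-".join(
            source_record["dob"].split("/")[i] for i in (2, 0, 1)
        ),
        # Code systems differ too; a lookup table handles the overlap,
        # passing unrecognized codes through unchanged.
        "dx_codes": [ICD9_TO_ICD10.get(c, c)
                     for c in source_record["diagnoses"]],
    }

# Tiny illustrative code map; real ICD-9 to ICD-10 crosswalks have
# thousands of entries, including one-to-many mappings.
ICD9_TO_ICD10 = {"250.00": "E11.9"}  # type 2 diabetes, uncomplicated
```

For example, `migrate_chart({"patient_name": "Doe, Jane", "dob": "04/09/1975", "diagnoses": ["250.00"]})` yields a System B record with the name split into parts, the birth date as `1975-04-09`, and the diagnosis recoded to ICD-10. Each such mapping is small and verifiable, which is exactly what makes use-case-driven interoperability achievable where grand unified schemes stall.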
We need to let go of the build-it-and-they-will-come approach, in which interoperability is treated as an inherent good that will inevitably confer benefits if everyone would just play the game fairly. The benefits of this top-down approach are too diffuse and too long-term to motivate market cooperation in the near future.
Instead, we must start at the grassroots market level and look for the specific use cases to solve. Doing so will do more to define interoperability standards going forward than any single effort.
About the author: John Schneider has over 25 years of experience in software technology and product development, research management, and engineering leadership. Prior to joining Apixio, John was Chief Product Architect at Apollo Group and co-founder/CTO at CloudTalk.com.