Last year, I wrote a blog post, Community Vital Signs Provides Additional Insights for Population Health Management. This year, the HIMSS17 conference left a strong impression on me regarding the evolution of population health management processes and the creative synthesis of traditional and non-traditional data. A walk around the exhibition floor, along with the educational sessions and exhibits on big data and cloud-based analytics, demonstrated the importance of population health analytics to the health IT community.
We have access to the infrastructure to build databases for conducting deep and sophisticated analytics. We have well-articulated processes for building an analytics capability (e.g. the HIMSS Adoption Model for Analytics Maturity) and teams (e.g. the upcoming HIMSS C&BI staffing and skills toolkit). More and more health systems are integrating rich datasets comprising clinical, operational, patient-generated and publicly available health and community data. All of these are ingredients for generating meaningful insights from the data.
The logical next step would be applying these insights to provide higher-quality care at lower cost, as defined by the Institute of Medicine. In a continuously learning health system, “science, informatics, incentives, and culture are aligned for continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience.”
Is it time, then, to assess how well we can apply these insights in practice? How would we measure that?
A recent publication from the National Academy of Medicine (NAM) offers recommendations, and identifies barriers, focused on how to integrate or embed research teams into health systems so that they can seamlessly apply research findings to operations.
Sure, there are initiatives led by the Patient-Centered Outcomes Research Institute (PCORI) and others that attempt to do just that. However, these are directed more toward large studies or clinical trials, which typically take months or years to complete.
My experience is that there are no well-defined platforms or processes in healthcare that can take the findings (or even hypotheses) from data analytics and test and validate them in the actual sub-population of interest before those findings are formalized or rolled out completely. One example that comes to mind is testing alternative approaches to reducing potentially ‘non-emergent’ ED visits – in other words, emergency room visits that are not truly for an emergency (e.g. by the uninsured, those without primary care providers, or for after-hours care).
I believe there is an opportunity to learn or adapt practices from other industries that have implemented these ‘rapid experimentation’ and learning cycles.
- Retail companies run these kinds of experiments on consumer response to new products, store layouts, etc.
- In their book Hard Facts, Dangerous Half-Truths & Total Nonsense, two Stanford professors describe this experiment-based approach at the entertainment company Harrah’s.
- Personal finance software company Intuit has a lab supporting rapid experimentation and even shares its methodology (Intuit Labs Next Tool).
Many health systems have adopted lean approaches to improve their operational processes. I think rapid experimentation would be another valuable tool for validating the insights, or even hypotheses, that emerge from analytics.
- How have you deployed (or considered using) lean or rapid experimentation processes in your system? What was your experience? Share your thoughts with us.
Join the HIMSS C&BI Community today to learn more about applying data and analytics to improve health, and other critical topics that will help you on your journey to Turn Data to Action.