Five Model Practices for Leveraging Analytics to Inspire Clinician-Driven Quality Care Improvements


Jonathan French, CPHIMS, SHIMSS

Access to clean, interoperable information enables fast and reliable data-driven decision-making to improve patient care and outcomes. However, this is only possible if those responsible for delivering care leverage that data to improve care processes.

Many large health systems have made significant investments in analytics functionality as a method for driving value-based care, reducing costs and improving quality care outcomes. Unfortunately, many organizations struggle to identify the best methodology for getting clinicians to use analytics data to drive quality improvement.

Using our Davies Award winners as models, we’ve identified five common practices used to get quality data into the hands of clinicians and make the data actionable to promote rapid cycle, clinician-driven quality care improvement projects.

1. Establish Consensus for Initial Clinical Care Model Practices and Measurement

The first step for an organization to develop a data-driven learning health system is to establish the analytics platform as the single source of truth around quality performance.

Selecting the appropriate measures is a critical first step for creating buy-in to an analytics-driven learning health system. Demonstrating the value proposition of quality measurement and analytics tools to providers can initially focus on areas where compliance is necessary, including quality care measures reported to federal and state government agencies, accreditation bodies and value-based contracts. To enhance initial buy-in, structure the analytics model around this value proposition.

RELATED: Building a Learning Health System for Today and for the Future

Memorial Hermann Health System, a two-time HIMSS Davies Award winner, implemented a sepsis program built on a standard order set and decision support guidance. In addition to measuring for reporting compliance, Memorial Hermann’s informatics team developed a set of measures focused on three key questions built around the Triple Aim designed to support iterative changes in the sepsis order set.

  1. How are clinicians responding to alerts and/or notifications?
  2. How are clinicians using the tools?
  3. Is care being provided to our patients in a timely manner?

Memorial Hermann made the decision to measure order set compliance and other measures which would impact clinicians. For example, they measured the number of times alerts for sepsis fired to determine the risk of alert fatigue. The informatics team shared that the number of alerts dropped significantly as the order sets slowly changed the way clinicians delivered care.
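Alert-volume measurement like Memorial Hermann's can be sketched as a simple tally over an alert event log. This is an illustrative sketch only, not their actual implementation; the log format and clinician IDs are hypothetical.

```python
from collections import Counter
from datetime import date

def alerts_per_day(alert_log):
    """Tally alert firings per calendar day from an event log of
    (alert_date, clinician_id) tuples. A sustained or rising daily
    count is one signal of alert-fatigue risk; a falling count after
    an order set change suggests the new workflow is taking hold."""
    return Counter(alert_date for alert_date, _ in alert_log)

# Hypothetical log: three firings on day one, one on day two
log = [(date(2019, 1, 1), "rn_1"),
       (date(2019, 1, 1), "rn_2"),
       (date(2019, 1, 1), "md_1"),
       (date(2019, 1, 2), "md_1")]
print(alerts_per_day(log)[date(2019, 1, 1)])  # 3
```

Trending this count over time, per unit or per order set version, is what lets an informatics team see whether workflow changes are actually reducing alert burden.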

In most cases, the initial investment in analytics should be focused on adherence to widely adopted model practices. By demonstrating a clear gap in care, clinical leadership can establish buy-in for measurement and corresponding workflow changes.

2. Plan for Resistance to Change

Overcoming resistance to long-held methods of care delivery can be the most significant barrier to creating a robust, analytics-driven learning health system.

Physicians heavily leverage their clinical intuition for decision-making. Reviewing performance data often demonstrates that intuition or long-held methods of care delivery do not produce significant quality care improvements, and often result in significantly higher costs.

RELATED: Process Improvement Should Be Led by People, Supported by Technology

Here’s an example. Peer-reviewed research demonstrated that prescribing bivalirudin, a blood thinner used to treat patients receiving a percutaneous coronary intervention (PCI), was 300 times more costly than a less expensive alternative, with no discernible improvement in quality. Despite this, physicians at UNC Health continued to prescribe bivalirudin.

Why was there resistance to adopting an appropriate model practice of using heparin, especially in a value-based care environment?

Catheter laboratory physicians had to see data to affirm that the change would not put patients at risk. According to UNC Health, the key components of success were to 1) pilot a heparin-first strategy during PCI procedures and 2) leverage a multidisciplinary team, including physician champions, to assess the change and share their findings with the rest of the enterprise.

Once the local data demonstrated heparin-first approaches were a safe model practice, UNC Health leadership launched an educational campaign indicating that prescribing heparin was the standard of care for PCI at UNC Health and that a PCI dashboard would be created to monitor compliance.

3. Get Clinicians On Board

Watch a Davies recipient talk with HIMSSTV about how it got all of its nursing and home care facilities documenting quality and utilization directly into its EHR to determine the best outcomes and give patients solutions to make care decisions.

There are several critical elements to gaining physician and other end-user buy-in around quality care measures. Davies recipient Open Door Family Medical Center stated in their use case on colorectal cancer screening, “Nothing sinks a quality improvement project faster than having disengaged or mistrustful clinicians.” To build that trust, physicians and nurses must be confident that the selected measures are accurate reflections of the quality of care being delivered and that performance is attributed to the correct provider.

Data validation focused on the accuracy of the data and patient attribution – ensuring that a physician is not accountable for a patient when they are not the primary clinician responsible for the patient’s care – is a critical activity to establish buy-in at the launch of an analytics-driven quality improvement project.

Several Davies recipients reported that challenges with patient attribution are a major barrier to success. Once Open Door validated their data and made clinicians partners in that process and evaluation, clinicians “wanted that data and wanted to improve on their performance on those metrics,” shared Open Door chief medical officer, Darren Wu, MD. Once data was validated, Open Door established policies ensuring that providers were only accountable for the performance metrics associated with the patients on their panel.

Organizations varied in the methodology used to attribute patients. For Open Door, patients were attributed to their primary care provider (PCP). In order to be considered a patient’s PCP, the PCP had to have seen the patient at least twice within the last 12 months. Establishing a threshold where the organization can demonstrate repeated encounters with a single patient establishes credibility for the attribution methodology.
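A visit-count attribution rule like Open Door's can be expressed in a few lines. This is a minimal sketch under assumed inputs (a list of provider/date visit pairs), not Open Door's actual system; the threshold and window are parameters an organization would tune to its own policy.

```python
from datetime import date, timedelta

def attribute_pcp(visits, today, min_visits=2, window_days=365):
    """Attribute a patient to the provider they have seen at least
    `min_visits` times in the trailing `window_days` window.
    `visits` is a list of (provider_id, visit_date) tuples.
    Returns None if no provider meets the threshold."""
    cutoff = today - timedelta(days=window_days)
    counts = {}
    for provider_id, visit_date in visits:
        if visit_date >= cutoff:
            counts[provider_id] = counts.get(provider_id, 0) + 1
    eligible = {p: n for p, n in counts.items() if n >= min_visits}
    if not eligible:
        return None  # patient remains unattributed
    # If several providers qualify, pick the one seen most often
    return max(eligible, key=eligible.get)

visits = [("dr_a", date(2019, 1, 10)),
          ("dr_a", date(2019, 3, 5)),
          ("dr_b", date(2019, 4, 1))]
print(attribute_pcp(visits, today=date(2019, 5, 1)))  # dr_a
```

Returning `None` rather than guessing matters here: leaving a patient unattributed is safer for clinician trust than holding the wrong provider accountable.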

Second, required data elements for selected key performance indicators and measures must be accurately and efficiently gathered in the healthcare provider workflow. The EHR or other technology should use data elements already collected as part of the care process.

Typical EHR-enabled workflows utilize structured data fields for orders and clinical exclusion criteria. Making structured data fields easy to navigate is critical for success.

In addition, clinical quality and performance data must be deemed meaningful by clinicians. Clinicians must believe that the collected data will be used to identify gaps in care, conduct workflow analysis and root cause analysis for performance outcomes, and trigger change management to adjust workflows and best practice guidance that will drive improved outcomes. Selecting analytics tools which make data visual and meaningful to drive improvement is critical in encouraging providers to adopt new methods of delivering quality care.

4. Develop an Interface that Makes Data Actionable for Quality Care

Once a strategy for assessment has buy in from the clinical team, the organization must select or create the appropriate data visualization interface. This will enable the clinical team to leverage the data to address care gaps and improve care delivery.

RELATED: Time for Alignment: Bringing Healthcare Data, Analytics and Protection Together

Data visualization tools must deliver information in as close to real time as possible. Early interventions generate better patient outcomes and delays in getting performance data to providers eliminate windows of opportunity to identify and address outliers and quality care gaps.

Clinicians should be able to parse out data at the patient level for any patient where the measure is attributed to their care. This allows the clinician both to catch mistakes in patient attribution and to quickly identify gaps in care.

Davies organizations we see achieving the most significant improvements utilize technologies that promote a culture where clinicians access and review their own data on a regular basis.

5. Create Accountability by Removing Blindfolds

The opioid epidemic is one of the most significant public health crises facing North America. Multiple Davies recipients submitted use cases around model practices for establishing improved opioid stewardship within their organizations. Ochsner Health System, for example, incorporated analytics to showcase opioid prescribing data. These recipients identified making the data identifiable as a critical component of success.

Ochsner Health started their opioid stewardship program by creating a reporting process where their informatics team looked at the number of opioid prescriptions written in emergency rooms. “We began this in a blinded fashion,” said Ochsner Health CMIO Todd Burstain.

After several months, Ochsner Health unblinded the data on the emergency department performance dashboards to show each individual provider compared to other providers for the number of prescriptions per day. Ochsner combined the dashboard with appropriate use guidance hard coded into the emergency department workflow, making it easy for a clinician to follow model practice.

Ochsner then spread the dashboard throughout the system and saw a 40 percent reduction in opioid scripts in the first year following the unblinding of the data. To date, 26,000 fewer Ochsner patients have been prescribed opioids. The average dosing strength of scripts which did meet appropriate use guidance also decreased significantly. Visualization of data changed practice.

RELATED: I’m in a PDMP State of Mind

The surprising benefit is that unblinding the data promotes collaboration on identifying best practices across the enterprise. “Being able to share practice patterns and understand cost/quality among peers resulted in positive change in behavior,” said Joel Schneider, MD, FACC, an interventional cardiologist with UNC REX.

In the ambulatory space, successful population health management was largely driven by immediate access to unblinded performance data. Care teams in small, federally qualified health centers like Open Door and Petaluma leveraged their data visualization tools at the point of care to identify and address gaps during scheduled visits, and to bring patients who were out of compliance in for a visit.

In larger health systems, data visualization drives both improved performance and helps direct patients to new services to address gaps in care and improve population health management. Ochsner Health ambulatory providers could see their own performance as well as the performance of their peers in both their own clinic and across the system. Data is presented graphically and is trended over time. Color-coding is used to make analysis of the data very easy: red means the provider is performing worse than the previous measurement period, yellow means they are not meeting goal, and green means they are meeting goal. Reds and yellows can have an impact on reimbursement.
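The red/yellow/green logic described above reduces to a simple ordered rule. This sketch assumes a measure where higher values are better (e.g., a screening rate); it illustrates the color-coding convention, not Ochsner's actual dashboard code.

```python
def status_color(current, previous, goal):
    """Map a provider's measure to a dashboard color:
    red    = performing worse than the previous measurement period
    yellow = not worse, but not meeting goal
    green  = meeting goal
    Assumes higher values are better (e.g., screening rates)."""
    if current < previous:
        return "red"
    if current < goal:
        return "yellow"
    return "green"

# Provider improved from 60% to 70% but goal is 80%: yellow
print(status_color(current=0.70, previous=0.60, goal=0.80))  # yellow
```

Checking trend before goal means a provider who slips always shows red, even if still above goal, which is what makes the display useful for spotting early deterioration rather than only absolute underperformance.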

Accountability is the top driver of adoption of measurement as the basis for analytics-driven, iterative quality care improvement. Regardless of setting, the attributed providers of care at organizations which have demonstrated the most significant improvements in quality of care have a clear understanding that their reimbursement, incentives and professional evaluations are directly tied to measurement.


Davies Award use cases reflect replicable model practices proven to improve both adherence to best clinical practices and improve patient care outcomes. Read previous recipients' stories.

The 2019 Davies application cycle is open through May 30. Learn more about the Davies Awards program and how to apply.