In our academic medical center, where the component activities of analytics are carried out by different groups across the enterprise, our aim in governing these activities is tripartite. Prioritization ensures that the analytics or data objects built are what users actually need; reuse promotes scalability and efficiency; and standards support reuse and maintain consistency across teams. Enhanced inter-team collaboration yields desirable effects such as fewer instances of different answers being reported to the same operational or strategic question, and clearer provenance of metric components.
This distributed analytics environment requires what Weill and Ross1 term “federal” governance, a hierarchical model with some central and some distributed elements. Centrally, in our Analytics Center of Excellence (ACE), we rely on a tiered structure of stakeholder groups to prioritize analytics requests nominated by the stakeholders themselves. Stakeholders are grouped roughly by clinical service line within the health system, with single stakeholder groups serving finance operations and our research community.
We rely on this governance structure as a rate-limiter, ensuring that analyst work queues are full but not distractingly over-full. Providing transparency into each request’s development progress helps to ensure that stakeholders clearly understand capacity, further reinforcing the importance of their prioritization. That transparency, which comes in the form of real-time status updates on the request intranet portal, also reassures the user community that the effort they are funding (ACE operates on a cost-recovery basis) is aligned with their strategic goals.
In the distributed analytics teams outside ACE, where embedding analysts in the business workflow is paramount, resources may be assigned directly to specific projects—a flatter governance model. The downsides to distributing analytics talent throughout the organization include reduced communication and the attendant promulgation of redundant data objects and discrepant metric definitions. To address these sub-optimizations, some department-dedicated teams are housed within ACE, and standards and best practices are also promulgated from the center. The distributed teams participate in the development and observance of standards in exchange for ongoing access to enterprise data and leveraging of shared technology. The standards are primarily focused on documentation of work and reuse of solutions already built. While neither of these aims comes naturally to most analysts, we have found that resources with readily available, clear documentation are more often reused. The payoff for analysts is increased time available to solve new problems—time that would otherwise have been wasted reinventing extant solutions.
After moving from a host of best-of-breed systems to a common EHR, our organization struggled for years to recover the alignment of analytics development with operational and strategic priorities. When an overarching, univocal approach to analytics governance failed to take root, we found ourselves at risk of becoming an organization with a shared medical record environment but fragmented and stove-piped analytics and data management. The hybrid analytics governance described here ensures that our clinical data are efficiently leveraged from the point of care through to clinical decision support and research findings.
1Weill, Peter; Ross, Jeanne W. IT Governance: How Top Performers Manage IT Decision Rights for Superior Results. Harvard Business Review Press; 2004.
About the author: Dr. Blackwelder leads the Enterprise Data Warehouse, Business Intelligence, Clinical and Operational Reporting, and Analytics Support for Clinical and Translational Research at Duke, a role that is all about transforming data into actionable knowledge.