Big Data and the Humans That Use It

At the heart of the University of Texas Health Intelligence Platform (UT-HIP) are two things: data and conversation.

By Daniel Oppenheimer
Editor, Texas Health Journal

At the heart of the University of Texas Health Intelligence Platform (UT-HIP) are two things: data and conversation. Every two weeks, the Chief Quality Officers from the University of Texas health and medical institutions get on a conference call, or gather in a conference room in Austin, and do what quality officers do. They look at the data from their hospitals to see how they’re doing, and they talk about ways to do things better.

Why are heart failure readmission rates lower, for instance, at The University of Texas Medical Branch in Galveston (UTMB) than at UT Southwestern in Dallas, or vice versa? If the rates have gone way down at UTHealth in Houston, why? Can other institutions replicate what’s gone right there? How do they compare to other university health systems around the country?

Then they talk again two weeks later, and two weeks after that, and so on. Parallel to these calls, the clinical and informatics leaders from the institutions are talking as well. They’re looking at health data at a more granular level, using electronic medical records.

To round it out, the folks from each of the institutions who deal with program governance―with how to legally, and effectively, share private health data across institutions―are meeting too. They’re going through the same kind of process, but they measure progress in contracts signed, interoperability achieved, and privacy and security milestones met. They’re making it possible, as they go, for the quality, clinical and informatics leaders to share more data without compromising patient privacy.

Flowing through all of these meetings is a kind of ongoing seminar in medical innovation, with presentations from faculty at the institutions who are finding new ways to improve care for their patients and lower costs for their hospitals.

The goal of all of this is simple: build trust and use big data to improve health throughout the system. Getting there is anything but.

“Historically, the health institutions of the UT System have operated as linked but operationally independent organizations,” says Dr. Ray Greenberg, Executive Vice Chancellor of Health for UT System. “Over the past few years, under the leadership of the six health presidents, we have made tremendous progress in operating in a more coordinated and integrated manner. An essential element of true joint effort is the ability to share information and use it to guide decision-making and evaluation. This is the role that UT-HIP can play.”

With that objective in mind, the UT Board of Regents allocated $12.4 million to develop a comprehensive platform to better understand, and derive insight from, the more than 6.78 million outpatient visits and 1.38 million inpatient hospital days that University of Texas System-owned and affiliated hospitals and clinics handle annually.

“The UT System has a wealth of data,” says ShuRon Green, Executive Director of UT-HIP. “Technology solutions are available to support programs like UT-HIP, and these platforms show continued advancements. The challenge is how to leverage the data and technology to drive improvement.”

The Human Challenge

When Zain Kazmi joined UT System in 2016 to drive the development of UT-HIP, his goal was to think as carefully as possible both about the possibilities for using data to drive improvement and about the ways in which such efforts can go wrong.

“Often what’s most complex is what happens after you find the relevant, interpretable data, when you’re trying to deploy the data to drive improvement on the ground,” says Kazmi, Assistant Vice Chancellor for Health Analytics and Chief Analytics Officer for UT System.

One of the biggest mistakes that big data enthusiasts can make, says Kazmi, is focusing too exclusively on the technical challenges of integrating and analyzing data, ignoring the human challenges that face any large organization trying to change. That human and organizational complexity is even greater in the case of an institution like UT System, which is more of a federated group of mostly autonomous institutions than a truly centralized, single entity.

What has emerged from the planning process, which Kazmi and Greenberg have led along with an executive steering committee of representatives from each of the six health institutions and two medical schools, is an approach that focuses as much on the people and institutions that are being asked to integrate as it does on the data itself.

The three subcommittees, each with representatives from all of the institutions, are one part of that approach, along with the executive steering committee. They’re helping to define the problems, set the goals, identify the lowest-hanging fruit for improvement, finesse the differences, and advocate for solutions within each of their organizations.

The other part of the job is developing the technical infrastructure for meaningfully sharing and comparing data across institutions. That has run along two tracks, both of them deeply informed by the committees and the conversations they’ve been having.

The quality data from the institutions is being organized and compared through Vizient Clinical Data Base, an analytics platform that was already being used by all but two of the UT institutions prior to UT-HIP. It allows for monthly tracking of not just health-related measures like mortality, length of stay, complication and readmission rates, and hospital-acquired conditions, but cost and efficiency measures as well.

Data from electronic medical records will be analyzed and compared through Epic Constellation, software dedicated to aggregating information from health records from multiple organizations.

The choice to adapt existing commercial products, says Kazmi, was a pivotal one.

“We could have created a system from whole cloth, but it didn’t make sense when our institutions were already, in most cases, using both of these platforms,” says Kazmi. “The smarter play was to take solutions that were already working, for individual institutions, and find ways to make them work for us as a collective.”

The Vizient Way

The thing to understand, says Dr. Bob Murphy, is that academic medical centers are not like the average hospital.

“We provide safety net care for our communities,” says Murphy, Associate Dean for Applied Informatics at The University of Texas Health Science Center at Houston (UTHealth). “We are also at the cutting edge of medical research and care. So we treat the sickest patients, but we are doing so with the best providers and the most advanced technology in the state and nation. It is not easy to compare our data to the average hospital.”

It was substantially in response to that fact that the University HealthSystem Consortium, which was recently folded into Vizient, came into existence. It was developed by the nation’s leading academic medical centers in order to allow each institution to track its quality metrics, and progress, against its peers. Each institution that is part of Vizient is responsible for entering data on key quality measures. In return, each institution is provided tools both to track its own outcomes over time and to compare its performance to that of analogous institutions.

What UT-HIP has brought to the table, working with Vizient, is an even more refined platform for comparison. In addition to the traditional dashboard, which enables the user to see only a limited amount of data from its peers, each UT institution can now access a systemwide dashboard, which provides greater depth of detail from its fellow UT institutions.

“We can also provide national benchmarks for each of our institutions,” says Murphy, who leads the Vizient project. “So that might include comparisons between MD Anderson and other cancer hospitals, or between Level 1 Trauma centers with more than 1 helicopter, like Memorial Hermann, or pulmonary transplant centers like UT Southwestern. We can benchmark the whole system against the University of California system in terms of size.”

What Murphy and the chief quality officers have decided to do with this capacity is launch a series of six-week, rapid cycle collaboratives, in which they take a specific measure from the quality data, analyze the data across institutions for opportunities for improvement, hear from the institutions that are doing it best, and then implement new practices when appropriate. So far, they have looked at patterns in post-surgical venous thromboembolisms, overall mortality, surgical site infections from colon surgeries, and heart failure readmission rates.

The UT-HIP staff supports these efforts by analyzing the data, developing custom visualizations, continually refining the dashboards, and presenting the findings to the quality team.

“The institutions are already engaged in data-driven quality improvement projects on their own,” says Murphy. “Each institution is doing many things well, and each has opportunities for improvement. Our goal is to support them as they set their priorities. And what we have learned is that by working together, it helps each of us individually. We learn from our colleagues. We are exposed to new methods and practices. We push each other. And the data is enabling us to build a longer term, more coherent, shared narrative of what is happening at each institution and across the system.”

The Walk of Analytic Maturity

Early in his career as an informatics guy, Dr. John Frenzel ran an experiment on his fellow anesthesiologists.

He began by giving a lecture to his colleagues on best practices for reducing the risk of post-operative nausea and vomiting. He also told them he’d be tracking their compliance with those practices going forward. Over the next few months he saw what’s called the “Hawthorne effect”: an initial change in behavior, followed by a fairly quick return to baseline.

Six months after the original lecture, he gave another presentation on the treatment of post-operative nausea, and provided each of his subjects a report card on their individual degree of compliance. Then he watched the data again. It led to a modest, sustained increase, from about 50 to 55 percent of doctors implementing best practices.

Then, six months after that, he and his team gave each doctor a different report card. This one ranked their performance relative to their colleagues in the department. The new report came out quarterly.

“We weren’t publicly shaming them,” says Frenzel. “No one else in the department was seeing anyone’s ranking other than their own. It was a blinded report so they could see how they ranked versus the rest of the providers. Even so, the impact on their compliance was profound. We went from 55 percent compliance to 90 percent.”

For Frenzel, it was an example of the power of data to influence behavior, and also the necessity of the right kind of data, delivered in the right way, over a sustained period of time. He believes this is the key to what UT-HIP is already doing right, and what the project will have to continue to do, over the long term, to see meaningful improvements.

“From the time an improvement is first introduced into clinical practice to the time it is widely adopted in clinical practice is usually a generation of training,” says Frenzel. “That’s not a data problem. That’s a human problem. The data gives people a focus. It helps bring them together around solving common problems, and then provides continual feedback on progress.”

The vehicle that Frenzel’s group is deploying to facilitate this process is Epic Constellation, which is data management and analytic software from Epic, one of the two dominant players in the electronic health record (EHR) market.

UT-HIP chose this software, says Frenzel, for two primary reasons. One is that, because of its history as a privately owned company with a culture of internal software development and rapid innovation, Epic has a single, unified code base for all its software. The other is that four of the six UT health institutions were already using Epic software for their electronic health records when UT-HIP was launched, with another already planning to switch over to it.

This means that some of the most profound challenges facing fragmented systems trying to unify their data management and analysis are already solved. The individual institutions already have experience navigating Epic software. Because data elements are standardized within Epic, the institutions are already using common definitions, so comparisons between institutions will be apples-to-apples. And because Epic Constellation was built from the same code base as the EHR software the institutions were already using, importing records from multiple institutions into a single warehouse is technically fairly straightforward.

“We are going to extract between a million and a million and a half patient records out of the systems into Constellation,” says Frenzel. “That means 450 data elements across a million and a half patients. The potential is extraordinary.”

Frenzel anticipates that the Epic Constellation database will be populated with that data by the spring of 2019. Then his team will go to work analyzing it.

In the interim, he and his team will cut their teeth on a large, though somewhat smaller, data set from UT Select, the health plan for employees of all the UT System institutions. They will help build software that enables the Office of Employee Benefits, which manages the plan, to analyze claims files totaling more than $1 billion in expenditures from the plan’s roughly 250,000 members. The result will be a series of dashboards, built with cutting-edge analysis and visualization tools, that help the office better understand its member population and its claims history.

The hope with the UT Select data, as with the EHR data from the institutions, is that it will anchor what Frenzel likes to call “the walk of analytic maturity.”

“You use the data to bring people together,” he explains. “Through conversation you decide what you want to measure, and then you can figure out how to integrate data to produce that measure. Then you show people the measures and follow from there. I call it the walk of analytic maturity. You have to walk with your partners. As they mature in their understanding of data, you mature in your understanding of their needs.”

For Greenberg, the hope is not just that this process will occur in the context of UT-HIP, but that success with UT-HIP will help build and sustain a larger process of collaboration and coordination across the system.

“This is an opportunity for us to lead the way, in Texas and nationally, in the use of big data,” says Greenberg. “It is also an opportunity for us to evolve as a system, and to lay the groundwork of trust that will enable future collaboratives that may have nothing to do with data.”