Good Read: What’s Your Data Strategy?

Leandro DalleMule and Thomas H. Davenport, Review By: Angela Fieler

How do you define a “good read”? Does the material have to be engaging or entertaining? Do you have to feel compelled to do something with the ideas presented? Do you have to learn something new, or does thought-provoking material fit the bill? I am an action-oriented person, and it has become my habit to ask, “How can I put this information to use in my professional or personal life?” If I can’t find a practical use for what I’ve read, I tend to move on to something else and not give the material another thought.

I was drawn to an article in the May–June 2017 Harvard Business Review, “What’s Your Data Strategy?” by Leandro DalleMule and Thomas H. Davenport, for no other reason than my personal, lifelong love of data analytics. I read the brief and decided to proceed with caution. The authors promised to provide a framework that addresses two key issues: clarifying the purpose of the data and providing guidance on strategic data management. The promise was interesting enough to keep me reading, but it left me with some uncertainty regarding my “practical use” test.

I have since read the article at least five times. Why? Because it was so thought-provoking that it caused me to alter my own definition of a “good read.” While the material itself did not contain any real practical applications for healthcare, the concepts presented made me think about the relationship between data, data management, and data analytics in a new way. Two startling facts presented by the authors really got me thinking: on average, less than half of an organization’s structured data is actively used in making decisions, and 80% of analysts’ time is spent simply discovering and preparing data. I have spent a good portion of the last fifteen years of my career playing with patient survey data, so the more I thought about these two facts, the more I kept coming back to what I’ve done, and seen done, with patient survey data. How much energy is your organization spending on what can only be described as spitting out graphs and tables from your patient survey vendor’s reporting tools? How much time do you spend improving the appearance of those graphs and tables rather than thinking about or discussing what the numbers are telling you? How many times do you show the same graphs and tables to different groups with different interests and different purposes? Do you spend more time debating the validity of the data or discussing the message the data is conveying? At the end of the day, what is anyone really doing with the feedback you are receiving from your customers?

When I think about the framework presented in light of these questions, I can see some basic, practical applications for what I read. The framework basically says that you start with a single source of truth and from it develop multiple versions of the truth. That sounds terrible, I know; the old adage “Figures lie and liars figure” comes to mind. But the authors are not suggesting that you manipulate the data to stretch the truth. Instead, they are saying that data can be used for multiple purposes and that, once you have a single source of truth (well-governed, standardized, valid data), you can organize that data in multiple ways to serve multiple purposes. How does this framework help us make better use of patient survey data? Here are my thoughts:

Create your own operational definition of your single source of truth when it comes to patient survey data. At a minimum, I would recommend including all survey scores (that is, all question scores and all composite scores, reported either as a mean score or as a top-box rating, however your vendor reports the results) at the organization level and at the patient-type level (inpatient, emergency, outpatient, physician office, and so on), by discharge month, using only complete data (that is, monthly data points where the preponderance of likely returns is already in). Some organizations might opt to include percentile rank in addition to scores.
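If it helps to make that definition concrete, here is a minimal sketch, in Python with pandas, of what assembling such a table might look like. The file names, the column names (discharge_date, patient_type, measure, score), and the cutoff used to decide which months are “complete” are hypothetical placeholders, not anything prescribed by the article or by any particular survey vendor.

```python
import pandas as pd

# Hypothetical vendor export: one row per returned survey, with a score for
# each question or composite measure on the vendor's reporting scale.
raw = pd.read_csv("vendor_survey_export.csv", parse_dates=["discharge_date"])
raw["discharge_month"] = raw["discharge_date"].dt.to_period("M")

# Keep only discharge months whose return window has closed, so each monthly
# data point reflects the preponderance of likely returns (cutoff is made up).
last_complete_month = pd.Period("2017-03", freq="M")
raw = raw[raw["discharge_month"] <= last_complete_month]

# Single source of truth: one score per measure, per patient type
# (inpatient, emergency, outpatient, physician office), per discharge month.
single_source = (
    raw.groupby(["discharge_month", "patient_type", "measure"], as_index=False)["score"]
       .mean()
)

single_source.to_csv("patient_survey_single_source_of_truth.csv", index=False)
```

An organization-level roll-up, or percentile ranks, could be added to the same table as additional rows or columns if your team decides to include them.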

Put together a small group of patient survey data experts and make them responsible for creating your single source of truth. These folks don’t necessarily have to be formally trained in data analytics or be dedicated to this work on a full-time basis. What matters most is that they have in-depth knowledge of your patient survey process and the inner workings of your data collection process and reporting tools. In addition to creating the single source of truth, this group should be identifying positive and negative trends, as well as recommending areas of focus for your improvement efforts. At Baird Group, we call this group the Measurement Team.

Assign one person the responsibility for identifying which groups will benefit from looking at detailed patient survey data and which data would benefit each group the most. For example, your Medical Executive Committee might look at composite- and question-level data related to physician or provider interactions with patients and family members and then charter an improvement team to focus on one aspect of the patient experience that the data suggests needs improvement. If your Quality Committee is tasked with managing improvement efforts, that group should be looking at the composite- or question-level data associated with any improvement teams you have chartered. These various reports make up your multiple versions of the truth.

Set the expectation that each manager will pull their own version of the truth from the reporting portal and use that data to drive improvements in their own area as well. Train your managers on how to pull the data, and then create accountability mechanisms to ensure that they are not only pulling the data but also sharing the results and driving change.

The important point is to use the data your survey vendor is providing. I have experience using the reporting portals of all the major survey vendors, and they all have the ability to produce trending charts by month. You don’t need a degree or fancy color coding to look at 12 consecutive months of data and see whether scores are going up, going down, or pretty much staying the same. Use your resources wisely by putting your energy into actually using more of your patient survey data across your organization to drive real improvement in the patient experience.

To learn more about the Baird Model for Service Excellence, employee engagement or leadership development workshops, or to sign up for the FREE newsletter, write to info@baird-group.com.
