Transforming the Patient Experience with Epic, BI, and Press Ganey

Over the last six months, I have been developing two BI solutions for ProHealth Care in Waukesha, WI. ProHealth Care is a healthcare organization running an Epic Cogito data warehouse on a Microsoft SQL Server 2012 database. Over the past year we had an aggressive project schedule to bring this warehouse online and integrate reporting within a Microsoft SharePoint 2013 BI Center. This would be the second phase of that year-long project. On deck was the task of improving workflow through the Patient Experience program. The Patient Experience program is an internal program, common to any healthcare system, that focuses on patient satisfaction and quality. Surveys are a key component of the data-gathering process many healthcare institutions use to manage patient experience.
Press Ganey is a survey company that, according to its website, provides survey results and statistics for roughly 10,000 healthcare providers. By providing surveys and data services to that many organizations, it can compile a national database of questions, answers, and patient satisfaction statistics. This data is used to evaluate each participating organization's current performance, with a percentile ranking against every other facility. The survey results are an industry benchmark against which many organizations measure their level of service.
Needless to say, it is critical information for any healthcare organization that wants to improve its overall quality and performance. Our goal was to automate, integrate, and distribute two data feeds from Press Ganey via the SharePoint BI Center, keeping a self-service model as a design goal, all within a six-month time period. No problem.
The problem: ProHealth Care's situation could best be summed up as “data overload”. A lot of hours were being spent every month downloading reports from a website and then creating a multitude of Excel worksheets to do the number crunching required to calculate high-level performance metrics.
Another problem was that the data was being stored in an unstructured format. There was no simple way to relate the top-level “scorecard” numbers back to the source data. Who was the doctor? Where and when did the patient visit? What was their diagnosis? These are all questions anyone in charge of making sure patients are happy with their visit would be interested in.
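To make that concrete, here is a minimal sketch of the kind of drill-down query that becomes trivial once the survey results live in a structured model. The table and column names are illustrative only, not the actual Cogito or ProHealth Care schema:

```sql
-- Illustrative drill-down: relate a survey answer back to the provider,
-- the visit, and the diagnosis. All names here are hypothetical.
SELECT  p.ProviderName,
        e.EncounterDate,
        dx.DiagnosisName,
        q.QuestionText,
        f.AnswerValue          -- Press Ganey answers are on a 1-5 scale
FROM    SurveyAnswerFact AS f
JOIN    ProviderDim      AS p  ON p.ProviderKey   = f.ProviderKey
JOIN    EncounterDim     AS e  ON e.EncounterKey  = f.EncounterKey
JOIN    DiagnosisDim     AS dx ON dx.DiagnosisKey = f.DiagnosisKey
JOIN    QuestionDim      AS q  ON q.QuestionKey   = f.QuestionKey
WHERE   f.AnswerValue <= 2;    -- e.g. isolate the low-scoring surveys
```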

Press Ganey offers its clients two different types of data feeds:
1. A standard XML export, provided free of charge. The XML export provides a wealth of information, including patient sentiment results and actual patient comments. It also includes the questions that were on each patient survey and, of course, the numeric (1-5) answers that are Press Ganey's standard format.
2. A custom data feed. Missing from the XML export are the Ranking, Sample Size, and Mean scores that Press Ganey provides in its online reports. For this data, you will need the custom data feed service. It is exactly what its name implies: a data export service where the client defines which data and which time periods they are interested in receiving.
We chose a different approach for each of the two Press Ganey data feeds.
The XML export. For this solution we decided to import directly into the existing data warehouse. Cogito provides a basic module for survey data, but we had to extend it quite significantly to meet our needs. Several new dimensions were created, along with one new fact table, but the basic Cogito model provided a starting point that handled the basics of the survey and survey-question entities.
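Roughly, the extension was a classic star-schema addition. The sketch below is illustrative only; the names are hypothetical and the real Cogito model is more involved:

```sql
-- Hypothetical sketch of the new fact table; not the actual Cogito
-- or ProHealth Care schema.
CREATE TABLE SurveyAnswerFact (
    SurveyAnswerKey  BIGINT IDENTITY(1,1) PRIMARY KEY,
    SurveyKey        INT NOT NULL,        -- FK to the base SurveyDim
    QuestionKey      INT NOT NULL,        -- FK to the base QuestionDim
    DateKey          INT NOT NULL,        -- FK to DateDim (survey date)
    ProviderKey      INT NOT NULL,        -- new dimension added for the project
    PatientKey       INT NOT NULL,
    AnswerValue      TINYINT NULL,        -- 1-5 Press Ganey scale
    CommentText      NVARCHAR(MAX) NULL   -- free-text patient comments
);
```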
We developed two cubes from the resulting data models. SSAS was chosen for this solution so that we could aggregate totals, compute averages and sums easily, and provide for future growth of the data. I believe the scalability of an OLAP cube outweighed a tabular model in this case, since the data has the potential to eventually grow quite large.
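For a rough idea of what the cubes serve interactively, here is the same kind of rollup expressed as plain T-SQL against the hypothetical fact table sketched above (the cube pre-aggregates and caches this sort of result):

```sql
-- Mean score and sample size per question per month.
-- DateDim.YearMonth and QuestionDim.QuestionText are assumed columns.
SELECT  d.YearMonth,
        q.QuestionText,
        AVG(CAST(f.AnswerValue AS DECIMAL(4,2))) AS MeanScore,
        COUNT(*)                                 AS SampleSize
FROM    SurveyAnswerFact AS f
JOIN    DateDim          AS d ON d.DateKey     = f.DateKey
JOIN    QuestionDim      AS q ON q.QuestionKey = f.QuestionKey
GROUP BY d.YearMonth, q.QuestionText;
```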
On to the second solution: the custom data feed. This is a paid service that Press Ganey provides to its customers. ProHealth Care is interested in knowing how it performs at a national level and how it ranks against every other hospital providing the same types of services. To see this, it receives Rank, Mean, and Sample Size scores from Press Ganey every month. This information was being provided via an HTML reporting application, but we needed an automated integration to eliminate the manual downloading and processing of this data every month.
This solution ended up being much easier to build in terms of the source data: it is a simple CSV file containing exactly what you need. Nothing too fancy in terms of transforming the data; basically, just scoop it up and bring it into a data model (a sketch of that load pattern follows below). This data model had no equivalent in the Cogito product, but we were able to leverage some standard dimensions such as DateDim, SurveyDim, and QuestionDim. The data part of this solution was nothing new to a decent ETL developer; the interesting part was the Power Pivot and Power View model rendered in Microsoft's SharePoint BI Center.
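For illustration, the “scoop it up” step can be sketched in plain T-SQL; in practice it would more likely live in an SSIS package, and the file path, staging table, and column list here are all assumptions:

```sql
-- Stage the monthly Press Ganey CSV, then land it in a rank/mean fact table.
-- Staging_PGCustomFeed, PGRankFact, and the join columns are hypothetical.
CREATE TABLE Staging_PGCustomFeed (
    QuestionCode   VARCHAR(20),
    PeriodStart    DATE,
    MeanScore      DECIMAL(5,2),
    PercentileRank INT,
    SampleSize     INT
);

BULK INSERT Staging_PGCustomFeed
FROM 'D:\feeds\pressganey\custom_feed.csv'        -- assumed FTP drop location
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

INSERT INTO PGRankFact (QuestionKey, DateKey, MeanScore, PercentileRank, SampleSize)
SELECT  q.QuestionKey, d.DateKey, s.MeanScore, s.PercentileRank, s.SampleSize
FROM    Staging_PGCustomFeed AS s
JOIN    QuestionDim AS q ON q.QuestionCode = s.QuestionCode
JOIN    DateDim     AS d ON d.[Date]       = s.PeriodStart;
```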
The Power Pivot/Power View combo proved very effective for this particular set of data. We created one Power Pivot model and several Power View reports over it. No more digging through directories filled with Excel worksheet snapshots from all of the past months and years; a few filterable scorecards, trend lines, and bar charts replaced most of them. It is now quite easy to filter on any time period the end user wants and see the results.
Power View is a wonderful tool. It seems like every person you hand it to will create something different, even when using the same set of data. It really does let you use your imagination and get creative with an underlying data set, while remaining friendly to someone who is not a programmer. Its learning curve is not steep at all for anyone familiar with basic pivot tables and Excel charts.
Both solutions ended up being distributed through Microsoft's BI Center on SharePoint 2013. SharePoint provides a great portal and management interface for the SSAS cubes and the Power Pivot and Power View objects, and an easy way to share the information throughout the organization. Developing the BI Center was the side project that ran parallel to several lines of work in this phase. ProHealth Care's web team did a fantastic job of getting the BI Center developed and running. Don't underestimate the SharePoint developer involvement necessary to get all of the content secured, distributed, and presented in a usable manner.
One of the major difficulties we encountered in this project was simply agreeing on which data we would receive in the custom data feed. This wasn't so much a technical problem as a matter of discussing, agreeing (or disagreeing), and of course budgeting for what we were going to buy.
The feed itself worked great, thanks to some very helpful and competent admins at ProHealth Care and Press Ganey (hi!), but it would have been nice to use a web service, an Azure Marketplace feed, or something a bit more advanced than an FTP inbox/outbox architecture. I will look into this for any future Press Ganey integrations.
The possibilities of the XML data from Press Ganey are many, especially if you have the back-end data warehouse to tie the information to. The XML file is a bit difficult to deal with, as there is no XSD associated with it, but Press Ganey does provide an example import script that will get you most of the way there. Be ready for a few SQL XML shredders and some XQuery! It is not the easiest file to work with, but it's what's inside that counts. As a free export, it can add tremendous value to any healthcare organization that wants to drill down and dig into its Press Ganey data.
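For anyone starting down this path, the shredding pattern looks roughly like the following. The element and attribute names are invented for illustration, since the real export has no published XSD; inspect the file (or Press Ganey's sample script) before committing to a structure:

```sql
-- Load the XML file as a blob, then shred it with .nodes() and XQuery .value().
-- Element/attribute names and Staging_SurveyAnswer are hypothetical.
DECLARE @xml XML;

SELECT @xml = CAST(BulkColumn AS XML)
FROM   OPENROWSET(BULK 'D:\feeds\pressganey\export.xml', SINGLE_BLOB) AS src;

INSERT INTO Staging_SurveyAnswer (SurveyId, QuestionCode, AnswerValue, CommentText)
SELECT  s.x.value('@surveyId',           'INT'),
        a.x.value('@questionCode',       'VARCHAR(20)'),
        a.x.value('@value',              'TINYINT'),
        a.x.value('(comment/text())[1]', 'NVARCHAR(MAX)')
FROM    @xml.nodes('/export/survey')     AS s(x)
CROSS APPLY s.x.nodes('answers/answer')  AS a(x);
```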
One major potential “gotcha” with the XML file is that, without a data warehouse to relate it to, you would have to do a lot of extra work building dimensions. In our case, the export from Press Ganey contained foreign keys to Cogito, which was a very important link between the XML export and the warehouse. It allowed us to relate the survey data to just about anything in the data warehouse. Without this link, the integration would have been a lot more difficult.
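In practice, that link meant a single join from the shredded survey rows into the warehouse's billing fact, and from there to everything else. A minimal sketch, with column names that are assumptions about the Cogito schema:

```sql
-- Follow the embedded Press Ganey key into the warehouse, then out to
-- any dimension. Column names here are assumed, not the real schema.
SELECT  pat.PatientName,
        q.QuestionText,
        f.AnswerValue
FROM    SurveyAnswerFact       AS f
JOIN    BillingTransactionfact AS bt  ON bt.BillingTransactionKey = f.BillingTransactionKey
JOIN    PatientDim             AS pat ON pat.PatientKey           = bt.PatientKey
JOIN    QuestionDim            AS q   ON q.QuestionKey            = f.QuestionKey;
```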
In summary, I was very happy with the end result. Going forward, ProHealth Care now has a powerful tool to help them manage and investigate their Patient Experience program. The end product, consisting of two SSAS cubes and a Power Pivot/Power View model, will provide a lot of information to end users via the Microsoft self-service BI model. Any organization using Press Ganey survey information would benefit from a project like this.
 

Thoughts on “Transforming the Patient Experience with Epic, BI, and Press Ganey”

  1. Very cool. I will begin to develop my cube this week as I just finished automating the XML feed into my SQL DB. Any helpful hints would be greatly appreciated! Have a good one!

  2. Thanks for writing this up. Very interesting!
    We have been starting to think about how we could pull and use the Press Ganey data differently.

  3. Does anybody know if it is possible to obtain a patient key in the data export such that the hospital can connect back to the actual patient?

  4. If you are using Cogito, then I believe you should be able to obtain it from BillingTransactionfact; that is where I obtained foreign keys to most of the rest of the warehouse. We were sending the FK from PG in our initial export from Clarity, so when we received the XML data from PG, it was a natural key/business key in the Cogito table BillingTransactionfact. From there you can link directly to PatientDim. Hope that helps!

  5. My two biggest tips: leverage as much as you can from BillingTransactionfact (it has FKs to almost the entire rest of the warehouse), and read up on the XML shredding process. If it's not something you do on a daily basis, it can be a bit of a time sink to learn. PG will provide you with a basic script; I turned it into a stored procedure and then used ~5 staging tables to do the initial “XMLStaging”. Once the data is in relational form, it is a lot easier to work with.

  6. Thanks for the input!!
    What sort of documentation is available for the PG XML format and process? I can’t find anything. Do I need to be an active customer of theirs to get access to documentation?


Greg Seemann
