

Implementing and Extending Epic’s Cogito

Introduction
Over the years as a consultant in the IT field, I’ve worked on quite a few health care-related projects.  None have been as large or interesting as my current engagement.
I am part of a team working with ProHealth to stand up and extend a new third-party data warehouse offering called Cogito.  We have delivered Phase I of our project and are scoping out Phase II right now.  Additionally, our work was recently spotlighted for making ProHealth the first enterprise to deliver production reports via the Cogito platform (http://www.bizjournals.com/milwaukee/news/2013/11/25/prohealth-adds-epic-population.html).
This blog post will talk about Cogito and some of the things we’ve learned bringing Phase I to completion.
What is Cogito?
Cogito is a new data warehouse offering from Epic that is part of a suite of healthcare-focused applications.  It offers health care systems and their providers a central data repository for their clinical, financial and business data.   Out of the box, Cogito contains over 100 fact and dimension tables and the ETL packages to populate them – all specifically tailored to the health care industry.
Initial Business Case
ProHealth wanted to implement a series of new metrics and reports to facilitate physician “pay for performance”.  This meant integrating data from many different subject areas, like claims, billing transactions, clinical outcomes, physicians and patient surgical histories.  The data domain spanned several different data sources from within the enterprise.
ProHealth decided to purchase the new Cogito data warehouse to accelerate their growing data and reporting needs.  Several reasons drove that decision: their existing relationship with Epic, their need for a central “hub” for their changing data requirements, and Cogito’s reliance on the Clarity application – which they were already using in production.  All of these factors led to Cogito becoming the solution of choice.
System Overview
ProHealth has invested in the latest Microsoft technologies including SQL Server and SharePoint, so Cogito is running on premise on the Microsoft SQL Server 2012 database platform.
Our Phase I deliverables included the Cogito data warehouse, an extended data model (more about this in a bit), an SSIS solution to load the custom extension tables, more than 50 complex physician care KPIs, and several SSRS dashboard reports delivered via SharePoint 2013.
Best Practices from Initial Implementation
Out of the box, Cogito implements many good industry best practices, including but not limited to:

  • Fairly strict adherence to a star-schema dimensional model.
  • Good documentation: an easy-to-read data model and an informative data dictionary.
  • Clear and consistent naming conventions.
  • Job-run auditing, with results accessible via a web-based portal.

Extending Cogito
As noted above, we were implementing many new complex metrics.  While Cogito contained a large portion of the data required (via the Clarity source), we still had to bring in data from several other sources.  In some cases, the data from non-Epic systems integrated rather easily via an import process built into Cogito.  But in other cases, the data needed for Phase I reporting did not fit nicely into the out-of-the-box data model.  To support the business needs, we had to build custom data structures within the Cogito DW and the ETL packages to load them.
Extension Approach and Lessons Learned
Because modifying out-of-the-box Cogito database structures (tables, views, etc.) is not supported, we had to choose how we were going to implement additions to the database.
Our basic approach was to use database schemas to segment custom tables from the out-of-the-box objects.  Epic uses dbo and a few other schemas, and we wanted to isolate our structures from theirs while still residing in the same database.  This gave us several advantages (a sketch of the pattern follows the list below):

  • Isolating our extension tables and views in custom schemas will protect us during future upgrades.
  • We can apply database security to all of our tables in one place instead of object by object, without affecting the out-of-the-box structures.
  • It makes identification of core versus custom objects very simple, which makes transition and maintenance easier.

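A minimal T-SQL sketch of that schema-isolation pattern is below.  The ext schema, the ext.PatientRiskScore table, and the ReportingReaders role are illustrative names only, not objects from the actual implementation.

  -- Custom extension objects live in their own schema, never in dbo (all names are hypothetical).
  CREATE SCHEMA ext AUTHORIZATION dbo;
  GO

  CREATE TABLE ext.PatientRiskScore (
      PatientRiskScoreKey INT IDENTITY(1,1) PRIMARY KEY,
      PatientKey          BIGINT       NOT NULL,  -- illustrative reference to a core patient dimension key
      RiskModelName       VARCHAR(50)  NOT NULL,
      RiskScore           DECIMAL(9,4) NOT NULL,
      ScoreDate           DATE         NOT NULL
  );
  GO

  -- Security is granted once at the schema level rather than object by object,
  -- and never touches the out-of-the-box dbo structures.
  GRANT SELECT ON SCHEMA::ext TO ReportingReaders;

Because the core tables stay untouched, a Cogito upgrade can alter or rebuild dbo objects without colliding with anything in the extension schema.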
We also designed a small reporting data mart to derive and store our physician KPIs.  The logic was complex enough that we pushed the heavy lifting into the ETL jobs, which facilitated the rapid reporting the business required.  In the future, OLAP and tabular Analysis Services models will most likely be built on top of the Cogito platform as well.
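As a rough illustration of that “derive and store” approach, here is a minimal sketch of what a precomputed KPI table in such a mart might look like; the kpi schema, table, and columns are hypothetical stand-ins, not the actual design.  An ETL step would fill the numerator and denominator columns nightly so the SSRS reports only read stored values.

  -- Hypothetical KPI mart table; all names are illustrative only.
  CREATE SCHEMA kpi AUTHORIZATION dbo;
  GO
  CREATE TABLE kpi.PhysicianMonthlyKpi (
      ProviderKey     BIGINT        NOT NULL,  -- illustrative reference to a provider dimension key
      KpiName         VARCHAR(100)  NOT NULL,
      ReportingMonth  DATE          NOT NULL,
      Numerator       DECIMAL(18,4) NOT NULL,
      Denominator     DECIMAL(18,4) NOT NULL,
      -- Computed at read time; the heavy aggregation happens in the ETL
      -- that populates Numerator and Denominator.
      KpiValue AS (CASE WHEN Denominator = 0 THEN NULL
                        ELSE Numerator / Denominator END),
      CONSTRAINT PK_PhysicianMonthlyKpi
          PRIMARY KEY (ProviderKey, KpiName, ReportingMonth)
  );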
While working through design and development, we learned many other useful tidbits for working with Cogito, including:

  • Epic does not support the creation of FK constraints or new indexes on core tables; however, their indexing strategy is sufficient in most cases.
  • We were able to use table constraints and indexes on any of the objects we built in separate schemas (see the index and constraint sketch after this list).
  • While Cogito’s ETL engine does support the addition of custom packages to its master jobs, it didn’t make sense for us to use this feature.  As designed, custom packages can be attached to the jobs for specific tables, but we needed a way to run many packages after all of the core tables were done.  So we built an entire SSIS solution and scheduled it as a completely separate job in SQL Server Agent (a scheduling sketch appears after this list).
  • Timing of the jobs that load Clarity, Cogito and the custom tables is critical.  Clarity and Cogito both have substantial run times, and we had to get creative to make sure everything executed in the proper sequence and on time.
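
As an example of the second point above, here is a hedged sketch of adding an index and a check constraint to the hypothetical ext.PatientRiskScore table from the earlier sketch; the core dbo tables are never altered.

  -- Index and constraint live entirely on the custom-schema table (hypothetical names).
  CREATE NONCLUSTERED INDEX IX_PatientRiskScore_ScoreDate
      ON ext.PatientRiskScore (ScoreDate)
      INCLUDE (PatientKey, RiskScore);

  ALTER TABLE ext.PatientRiskScore
      ADD CONSTRAINT CK_PatientRiskScore_NonNegative
      CHECK (RiskScore >= 0);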

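For the separate SSIS job, the sketch below shows one way such a stand-alone SQL Server Agent job could be registered.  The job name, schedule, and SSIS catalog path are placeholders, not the actual configuration, and the /ISSERVER command string is simplified.

  USE msdb;
  GO
  -- Create a stand-alone Agent job for the custom extension load (name is a placeholder).
  EXEC dbo.sp_add_job @job_name = N'Cogito Extension Load';

  -- Single step that runs the SSIS master package from the SSIS catalog;
  -- the command string would need the real catalog path and server name.
  EXEC dbo.sp_add_jobstep
      @job_name  = N'Cogito Extension Load',
      @step_name = N'Run extension master package',
      @subsystem = N'SSIS',
      @command   = N'/ISSERVER "\SSISDB\Extensions\CogitoExt\Master.dtsx" /SERVER "."';

  -- Schedule it daily, late enough that the Clarity and Cogito loads have finished.
  EXEC dbo.sp_add_schedule
      @schedule_name = N'Nightly extension load',
      @freq_type = 4,              -- daily
      @freq_interval = 1,
      @active_start_time = 40000;  -- 04:00:00
  EXEC dbo.sp_attach_schedule
      @job_name = N'Cogito Extension Load',
      @schedule_name = N'Nightly extension load';

  -- Target the local server so the job actually runs.
  EXEC dbo.sp_add_jobserver @job_name = N'Cogito Extension Load';
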
Summary
Cogito has proven so far to be a good foundation for ProHealth’s data warehousing needs.  Implementing it on the Microsoft SQL Server platform gives us the flexibility to easily extend and scale as needed.
Special thanks to Greg Seemann for contributing to this post, and for being a key contributor to the Perficient/ProHealth team!  Greg is a Lead Technical Consultant with Perficient.
