Perficient Healthcare Solutions Blog


Posts Tagged ‘HL7’

It is time for Interoperability to catch fire! (FHIR® that is)

One of my healthcare consulting friends once said that interoperability is difficult because healthcare is interactional, not transactional.  The interactive nature of the relationship between the healthcare organization and the patient foretells the complexity of integrating and sharing the information that is so critical to reducing costs, increasing patient safety and streamlining productivity.  The challenge is inertia – so many healthcare applications and integration engines are stuck on the older HL7 version 2.x as the means of implementing interoperability.  Uptake of the proposed HL7 version 3.0 has been very, very slow due to limited support from EMR vendors, which are focused on bigger problems like Meaningful Use and ICD-10 support.  In response, HL7 is hoping to set interoperability on fire with a new approach called FHIR® – Fast Healthcare Interoperability Resources – a next-generation standards framework created by HL7.  FHIR combines the best features of HL7’s Version 2, Version 3 and CDA® product lines while leveraging the latest web standards and making ease of implementation a top priority.

The key to the fast implementation speed of FHIR® is flexibility!  FHIR solutions are built from a set of modular components called “Resources.”  These resources can easily be assembled into working systems that solve real-world clinical and administrative problems quickly and with a minimal amount of development, sometimes in as little as a day.  FHIR is designed to meet modern integration demands in a wide variety of contexts – social media on mobile phones, cloud communications, EHR-based data sharing, server communication in large institutional healthcare providers, and many other scenarios.
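For a flavor of what “leveraging the latest web standards” means in practice, reading a FHIR resource is just an HTTP GET against a RESTful endpoint.  Here is a minimal Java sketch; the server URL and patient ID are hypothetical placeholders, and the exact JSON media type to request depends on the FHIR version your server implements:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class FhirRead {
    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; FHIR servers follow the same
        // [base]/[resourceType]/[id] RESTful pattern.
        URL url = new URL("https://fhir.example.org/base/Patient/12345");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // Ask for the JSON representation of the Patient resource.
        conn.setRequestProperty("Accept", "application/fhir+json");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // the Patient resource as JSON
            }
        }
    }
}
```

The same pattern, with POST and PUT, covers creating and updating resources – which is part of why a working FHIR prototype can come together so quickly.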

Read the rest of this post »

Quality Reporting Data Architecture (QRDA) Primer

One of the key ways to improve productivity in healthcare is to become more efficient at interoperability, both within a healthcare organization and between healthcare organizations.  Sharing quality reporting results is a good example of a healthcare area facing challenges in interoperability and efficiency.  According to the Agency for Healthcare Research & Quality, the United States healthcare system faces challenges in using data for quality performance measurement, including but not limited to:

  • Time-consuming and problematic operations for data acquisition from electronic systems
  • Multiple and disparate systems within health care organizations complicate data mining and coordination of efforts
  • Resource-intensive data mapping efforts to link systems and performance measurement data requirements
  • Conflicts or differences between administrative data sets
  • Physicians and providers struggle to meet increasing demands for performance data

A solution is in sight to reduce or eliminate these problems: the Quality Reporting Document Architecture, or QRDA for short.  The purpose of the QRDA is to give healthcare information systems a standard way to communicate quality measurement data across disparate systems.  The QRDA supports the efficient collection, aggregation and reporting of quality measurement information for sharing among providers within a healthcare system or across different healthcare systems.  The architecture will support the exchange of quality data between providers and requestors of that information (e.g., QIOs, payers, accrediting organizations).

Read the rest of this post »

EHRs, Analytics, Utilization and Population Health

Spurred on by Meaningful Use, there has been an explosion in the implementation of EHRs over the last several years.  This tidal wave has been sweeping through the healthcare community, consuming much of the bandwidth organizations have for dealing with change of this magnitude.  The effect is really no different from what other industries have been through over the last couple of decades, beginning with the emergence of ERP systems in the late ’80s and early ’90s.  Organizations setting up EHRs have the opportunity to look back at the experiences of those industries and glean lessons learned.  One of the biggest lessons is that there will be a second wave, which we are already starting to see.  This second wave is driven by the desire for information and knowledge.  Folks realize that the installation of technology to support operating standards, policies and business procedures via EHRs provides a great source of transactional data – data that is just waiting to be warehoused, given meaning, aggregated, sliced, diced and analyzed.  The challenge, and a trap that many fall into, is that the data can seem so close at hand, accessible and, on a small scale, manipulable, that the cost and effort of deploying analytics solutions to get at it appear modest.  Invariably, after much investment and frustration at the inability to get all of the data, many realize that what they initially focused on was just the tip of the iceberg, and that managing and distributing a large amount of information and knowledge across a large organization requires a great deal of planning, time, people and investment.  While not quite as invasive as the rollout of the EHR, the investment in analytics is substantial and must be planned and executed over a period of time.

Avoid the Trap

There are a couple of tell-tale signs that you’ve fallen into the trap.  The first is the 80/20 rule: you end up spending 80% of your time collecting, cleaning, organizing and making data available, leaving only a small amount of time to analyze and act upon it.  The second sign is the executive dashboard: a large number of people spend a great deal of time every month sourcing data from the new EHR and other transactional platforms, then aggregating, calculating and publishing it, with very little automation, to a select few (i.e., the senior management team).  It is a dashboard that others in the organization don’t have access to and, due to its highly aggregated level, wouldn’t find much value in, although I’m sure it has been the source of many “fire-drills.”  The fire-drill is painful because the lengthy, manual process by which a particular dashboard measure is derived must be dissected just to determine whether there is really an issue or it is an artifact of the calculation and aggregation process.  Then, if there is an issue, where?  Typically, you’re already 45-90 days out from the occurrence of the negative event.

It’s Not Just About the Transactional System

What can health organizations do about this?  First, they must realize that the implementation of the EHR creates both a great source of data and a need within the organization to aggregate that data, combining it with other information from across the organization and from third parties.  Second, with this awareness, the EHR effort should be shadowed by one focused on developing a strategy, objectives and milestone-backed plans to deploy analytics in a controlled and deliberate manner.  In doing so, it will quickly be realized that there are dependencies to address: the need for data governance, the inclusion of any master data management activities already underway, and the need for an infrastructure that enables transactional, analytical and other systems and devices to access and exchange data, whether an HL7 transaction, an outgoing X12 batch file, an EHR feeding the analytics store or a patient portal via SOA.  Third are awareness, education and training.  Analytics unleashed upon the employee population all at once can be like drinking from the fire-hose.  The effective use of analytics is driven by the ability of the organization, departments, teams and individuals to clearly articulate a specific need for information, putting it into the context of particular business processes, activities and tasks.  Ideally, analytics do two things for us: 1) reinforce that we’re meeting or exceeding the desired performance level, as we all need periodic feedback that everything is ok; and 2) flag exceptions, where we’ve defined what normal operation looks like and an event has arisen that is outside the box, so the appropriate people are alerted and can drill down into the abnormal event to immediately begin identifying and resolving the issue.

What Does It Mean to Me?

How does all of this relate to utilization and population health?  Over the last few months, there has been a noticeable increase in activity among health systems around the desire to understand more about the dynamics of the marketplace they do business in and the population they serve.  They are more aggressively pursuing sources of information outside the organization that can be combined with internal information to begin to paint a picture of not only the morbidity of the local population they serve, but also the usage patterns that population follows in seeking out care.  Seeking care isn’t as consumer-friendly as many would hope, and most health coverage leaves the choice of access to the consumer.  Health systems can begin to identify and track patterns of utilization, situations of network leakage and repeat visits, stratify the local population for risk, and predict demand on facilities and the impact on case-mix.  To the extent the health system is pursuing community outreach and educational programs, this information can feed the design of those programs as well as provide a way to measure their impact.  The outreach and education can occur in conjunction with the PCPs and, potentially, the health insurance companies serving that same membership.  The unspoken objective of all is to better understand and improve the outcomes of care.

The ABCs of the CCD – Part I of III

CCD is an acronym for “Continuity of Care Document.”  The CCD is a file in Extensible Markup Language (XML) format that can have one of three different structure levels; I will explain the various structure levels in Part III of this blog series.  A CCD contains patient-related information that can be electronically exchanged between healthcare providers as well as shared with the patients themselves.

The CCD template is derived from the American Society for Testing and Materials (ASTM) Continuity of Care Record template, the ASTM E2369-05 Standard Specification, or simply the CCR.  The CCD is constrained by the HL7 (Health Level Seven) Clinical Document Architecture (CDA), and the CDA adheres to the HL7 V3.0 Reference Information Model (RIM).

The ASTM CCR was created to provide a snapshot in time containing a summary of a patient’s relevant and pertinent encounter information (e.g., demographic, clinical, financial).

Health Level Seven International partnered with ASTM to create an HL7 version of the CCR for institutions that preferred using the CDA model, hence the birth of the CCD.  The CCD maps the CCR elements into the CDA structure.

The CCD is a template based on the principles of the HL7 CDA.  The characteristics of a clinical document based on the CDA are the following:

  • Persistence
  • Stewardship
  • Potential for authentication
  • Context
  • Wholeness
  • Human readability

Although one of the characteristics of a CCD is human readability, this does not mean that no tool is involved in reading it.  A CCD can be rendered with a simple web browser in order to satisfy the human-readability qualification.

The CCD is structured as a CDA document.  For those of you familiar with XML documents, the following line-by-line depiction will be easy to follow:

  • Document
    • Header
      • Body
        • Sections
          • Optional narrative block
            • Entries

A CCD includes the following 16 sections:

  1. Family history
  2. Social history
  3. Functional status
  4. Allergies
  5. Immunizations
  6. Medications
  7. Vital signs
  8. Medical equipment
  9. Support
  10. Encounters
  11. Problems
  12. Procedures
  13. Results
  14. Plan of care
  15. Payers
  16. Advance directives

Many Electronic Health Record (EHR) vendors are starting to implement the CCD to share patient information across Health Information Exchanges (HIEs), outpatient centers and other clinical providers.

The CCD is not alone.  There are many other CDA-based templates:

  • Discharge Summary
  • History and Physical (H&P)
  • Procedure Note
  • Progress Note
  • Operative Note
  • Consultation Note
  • Diagnostic Imaging Report

In Part II of this series we will explore a “real-world” example of a CCD.

Is it time for Open Source in Healthcare?

From time to time it is a good idea to re-evaluate potential IT architectures, especially given the cost-reduction pressures in healthcare IT.  The growing maturity of several key players in the open source software arena is gaining the attention and respect of healthcare IT decision-makers and is worth evaluating as a lower-cost alternative.  The ability to set up a complete, top-to-bottom architectural stack in open source is very close to a reality; the remaining challenge is finding an organization able to integrate that stack.  What are these maturing key components of an open source stack for healthcare?  Here are the candidates, from the bottom of the stack to the top: Mirth for HL7 integration, Drools for a business rules engine, Mule for SOA, Pentaho for business intelligence, and Liferay for the portal.

Mirth for HL7 message integration

Starting with the integration layer, Mirth Connect has developed into the cost-effective alternative at a time when commercial HL7 integration engines are charging by the number of interfaces and driving up costs.  Mirth is designed from the ground up for healthcare HL7 message integration and provides the necessary tools for developing, testing, deploying, and monitoring interfaces.  Mirth products address one of the most difficult problems in healthcare – interoperability.  With some hospitals using nearly 200 applications each, the application integration challenges can be complex and numerous.  Mirth is a comprehensive integration solution that can transform and route healthcare data, and here is the real plus – because it is open source, a healthcare organization can share and reuse its interfaces with other organizations.

One key new offering from Mirth is Mirth Appliances.  The Mirth Appliance provides a ready-to-run healthcare messaging platform that is stable, secure, and scalable. With the need to create Health Information Exchanges at many larger healthcare organizations, the Mirth Appliance installed at each individual hospital or large physician practice can make interoperability more affordable with full commercial support and a simple management control panel. There are no per-interface fees or per-message charges on a Mirth Appliance to assist in keeping down IT costs.

Drools for a Business Rules Engine

Assisting in transforming healthcare messages and supporting business rules is the open source product called “Drools.”  No, it isn’t about leaking saliva; it is a fast-maturing business rules engine arriving at a time when healthcare needs to separate business logic from reams of old legacy source code, especially to meet changing performance measurements or to calculate metrics for Meaningful Use.  According to Wikipedia, Drools is a business rule management system (BRMS) with a forward-chaining, inference-based rules engine – more correctly known as a production rule system – using an enhanced implementation of the Rete algorithm.  Drools supports the JSR-94 standard for its business rule engine and provides an enterprise framework for the construction, maintenance, and enforcement of business policies in an organization, application, or service.  Drools is a key component for implementing a flexible Service Oriented Architecture (SOA).  JBoss Rules is the commercial version of the open source Drools project.

The key sign of this open source business rules engine’s growing maturity is the addition of a business rules manager called Drools Guvnor, a centralized repository for creating knowledge bases.  In addition, Drools Fusion provides complex event processing, including time-based decision-making – another key need in healthcare environments.  One example of the use of a Drools business rules engine would be a clinical event monitoring system that provides proactive alerts when messages indicate a need for human intervention in a business process.
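For a flavor of how application code drives such an engine, here is a minimal Java sketch using the Drools session API; the fact class, session name and alert threshold are hypothetical, and the rule itself would live in a separate DRL file on the classpath:

```java
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class ClinicalAlertRunner {

    // Hypothetical fact representing a lab result pulled from an HL7 feed.
    public static class LabResult {
        private final String patientId;
        private final double potassiumLevel;
        public LabResult(String patientId, double potassiumLevel) {
            this.patientId = patientId;
            this.potassiumLevel = potassiumLevel;
        }
        public String getPatientId() { return patientId; }
        public double getPotassiumLevel() { return potassiumLevel; }
    }

    public static void main(String[] args) {
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        // "clinical-alerts" is a hypothetical session name defined in
        // kmodule.xml; the matching DRL rule might fire on, say,
        // potassiumLevel > 6.0 and publish an alert.
        KieSession session = container.newKieSession("clinical-alerts");
        session.insert(new LabResult("MRN-0042", 6.3));
        session.fireAllRules(); // every rule matching the inserted fact runs
        session.dispose();
    }
}
```

The business logic itself – the threshold, the exceptions, the escalation path – stays in the rule file where analysts can maintain it, which is exactly the separation from legacy source code described above.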

Mule for a Service Oriented Architecture

While all of the major commercial software vendors have SOA offerings, the cost to install, implement and maintain SOA environments is out of reach for a typical healthcare IT budget.  A fast-growing community is implementing the open source SOA Enterprise Service Bus product called Mule from Mulesoft.  Mule is used for SOA by many Fortune 500 companies and is a real alternative for cash-strapped healthcare IT shops needing to move to a message-based architecture.  Mule ESB is a lightweight Java-based enterprise service bus (ESB) and integration platform that allows developers to connect applications together quickly and easily, enabling them to exchange data.  Mule ESB enables easy integration of existing systems, regardless of the different technologies the applications use, including JMS, Web Services, JDBC, HTTP, and more.  Combined with the Mirth Appliance, Mule is capable of supporting the HIE needs of a healthcare integrated delivery network.  Mule is also a strong SOA alternative to the current state of hub-and-spoke, single-point-of-failure integration engines used in healthcare today.

Mule has excellent scalability for large healthcare organizations with complex enterprise integration needs.  Mule’s staged event-driven architecture (SEDA) makes it highly scalable: a major airline processes over 10,000 business transactions per second with Mule, and H&R Block uses 13,000 Mule servers to support its highly distributed SOA environment.  Clearly, this open source solution has matured and gained widespread acceptance, but Mulesoft isn’t resting on its success and is developing a SOA repository management solution for its Enterprise version.

Pentaho for Business Intelligence

With the very high level of interest in business intelligence and performance metrics in healthcare due to Meaningful Use and Accountable Care, the next open source stack component to review is Pentaho.  What is interesting about Pentaho is how they describe themselves: “Pentaho was born out of the desire to achieve positive, disruptive change in the business analytics market, dominated by bureaucratic mega vendors offering eye-wateringly expensive heavy-weight products built on outdated technology platforms, and who had become focused on integration with the rest of their enterprise application suites – at the expense of innovation of their BI capabilities.”  Pentaho’s maturity is demonstrated by Forrester’s recent recognition of it as a strong vendor with the richest functionality and the most extensive integration with Hadoop for big data.

Pentaho as an open source solution has compelling capabilities for the healthcare environment.  One of the more important ones is the option to take data in-memory to speed up analytics.  For quick, near real-time analytics supporting clinical decision-making, this feature is important.  The challenge with other BI solution sets is that they may require a customer to bring all data into memory before analysis, resulting in memory challenges on the hardware platform.  Pentaho supports capabilities to manage in-memory analytics with very large data sets.

Pentaho is a BI solution that can also provide persistent caching to improve the speed of advanced analytics.  A typical healthcare organization will not want to load reference data and prime the cache every time the analytics server restarts; depending on the size of the data set, this can take a long time and create significant delays in making the system available to end users.  Reference data and some master data loaded in memory or in a persistent cache can really speed up queries in a healthcare setting, especially for by-facility or by-physician views of detailed information.  Pentaho uses a distributed caching system to scale out, spreading queries across a pool of shared memory to meet concurrency requirements and avoid cache bottlenecks.

Liferay for the Portal

Last, but definitely not least, is the open source portal software called Liferay.  Liferay has matured in recent years to become a complete portal solution that includes:

  • Content & Document Management with Microsoft Office® integration
  • Web Publishing and Shared Workspaces
  • Enterprise Collaboration
  • Social Networking and Mash-ups
  • Enterprise Portals and Identity Management

For healthcare, portals are the key information delivery mechanism, including patient portals, physician portals, intranets and project management portals.  Liferay is an independent portal vendor that doesn’t insist on controlling the rest of the architectural stack in order to work.  For large healthcare organizations with thousands of employees, Liferay has particular strength in developing self-service portals that include knowledge-sharing workspaces and Web 2.0 capabilities.

Consider Integrating the Stack

In summary, the big building blocks of an open source architecture are available today and are reaching a level of maturity that is worthy of consideration by large healthcare organizations.  It is important to note that open source software is not free and requires staff commitment to succeed, generally with good to excellent Java skills.  The commercial versions of the open source products are usually well documented and have excellent product support.

One key aspect of these open source products is that they are excellent for developing proof of concept (POC) projects before committing to purchasing the full commercial versions.  This POC approach allows for testing not only the functionality of these architectural components but also how they behave in your existing IT environments.  If your IT team hasn’t taken a recent look at open source, you might be surprised at which big organizations are adopting it and how much money it could save your organization.

Facing and Overcoming the 2012 #HealthIT Challenges Amidst the End of the World – Part 2 of 2

In the first post of this two-part blog we explored the big demands that ARRA HITECH and other compliance and regulatory impositions have placed on healthcare IT: HIPAA’s Version 5010 conversion, ICD-10 migration, Meaningful Use of EHRs and its attestation, and Accountable Care Organizations.  We also briefly touched on the popular topic of the imminent end of the world in 2012 according to the Mayan calendar prediction.

If you read carefully, you will have noticed that my predictions have some small glitches now and then – or you may call them “bugs,” owing to my software developer background.  So at the end of this blog we’ll have to revisit the end-of-the-world prediction.  Sorry, folks.

As for the ICD-10 deadline: last week HHS Secretary Kathleen G. Sebelius announced the intent to delay the compliance date.  Hopefully the delay will not be such that it has a big impact on healthcare interoperability projects.  ICD-10 will improve the way healthcare data is stored and exchanged between systems.  One of the drawbacks of ICD-9 is that, because it lacked codes to describe many diagnoses and procedures, clinicians and related clerical staff would use the code that most closely matched the expected reimbursement amount.  Proactive healthcare organizations should move forward with their ICD-10 conversion projects, since ICD-10 solves many problems inherent in their data that hinder interoperability in a meaningful way.

Data Aggregation and Mining for Successful Quality Measurement Reporting and Performance Improvement Requirements

Going back to the topics where we left off in our previous post, I would like to dive a little into data aggregation.  Healthcare data is contained in many source systems inside a hospital organization, and increasingly it can be found outside the organization as well.  I have been on several projects where I had to aggregate data located in three different US states!

If your organization plans to successfully meet Meaningful Use stages 1, 2 and 3, then getting control of your data is of paramount importance.  Meaningful Use stage 1 may appear trivial to many organizations, but don’t let this mislead you as to the growing complexity of stages 2 and 3.  While we don’t yet know the details of the stage 2 requirements, which are to be announced shortly, what we do know is that they will require more data from the different source systems.

Health BI, as an aggregation platform, can receive healthcare data from myriad sources; whether it comes from the inpatient Health Information System (HIS), the outpatient Electronic Medical Record (EMR) or the Laboratory Information System (LIS), it can all come together in a single repository from which up to 600 Clinical Quality Measures can be reported!  Health BI is modeled after the HL7 v3.0 RIM.

Read the rest of this post »

The Healthcare Domain Model: Where to start?

Let’s say you are tasked with modernizing an age-old payer system that is business critical, has no documentation and runs to hundreds of thousands of lines of code: where would you start?  Evaluating the technology stack and infrastructure, assessing the available skill sets, cost, etc. are all part of standard operating procedure, but we’ll save that for another day.  The challenge is to rewrite an existing business-critical healthcare application, with its service-level agreements intact, on a foundation that makes it a highly scalable system built on the concepts of BPM, BRMS and SOA.  One approach would be to start with the data model and then build the application model around it.  Though this approach is typical, it leads to the classic impedance mismatch problem, compromising domain integrity in the very early stages of development.  It would be almost impossible to backtrack from this even though you are only halfway into the development.

Starting with your domain model simplifies the approach to building a state-of-the-art system on a solid framework.  For example, in a payer benefit system you would:

  1. Define your business entities, such as benefits, copay and coinsurance
  2. Where there are similarities, abstract the concepts into more generic structures; a Service, for example, would be a generic structure representing both an ER visit and an office visit
  3. Add attributes to the entities
  4. Define relationships between entities

Remember, you are only conceptualizing based on business entities, not letting the relational or object framework dictate your business model.  Now you can build your object or data model based on the conceptual model.  Though this seems very straightforward and standard operating procedure, it is not.  You need a strong domain SME who can articulate the domain as well as perform domain analysis of the business rules that have been built up over decades.  If you have a strong domain model to start with, no technology or architecture approach will send you back to the drawing board.  The concept of a copay will remain a copay forever, irrespective of SOA or *OA, 5010, ICD-10 or HL7.  Let’s get the basics right and be nimble… start with the domain!
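To illustrate those four steps, here is a minimal Java sketch of such a conceptual model; the entity names and attributes are hypothetical, chosen only to show the generic Service abstraction and the relationships between entities:

```java
import java.math.BigDecimal;

// Step 2: ER and office visits share a generic Service structure.
abstract class Service {
    private final String code;        // step 3: attributes on the entity
    private final String description;
    protected Service(String code, String description) {
        this.code = code;
        this.description = description;
    }
    public String getCode() { return code; }
    public String getDescription() { return description; }
}

class EmergencyRoomVisit extends Service {
    EmergencyRoomVisit() { super("ER", "Emergency room visit"); }
}

class OfficeVisit extends Service {
    OfficeVisit() { super("OV", "Office visit"); }
}

// Steps 1 and 4: a Benefit entity relating cost-sharing terms
// to a Service via a plain object reference.
class Benefit {
    private final Service service;
    private final BigDecimal copay;           // fixed dollar amount
    private final BigDecimal coinsuranceRate; // member's share, e.g. 0.20
    Benefit(Service service, BigDecimal copay, BigDecimal coinsuranceRate) {
        this.service = service;
        this.copay = copay;
        this.coinsuranceRate = coinsuranceRate;
    }
    // Member responsibility for a billed amount: copay plus coinsurance.
    BigDecimal memberCost(BigDecimal billedAmount) {
        return copay.add(billedAmount.multiply(coinsuranceRate));
    }
}

public class BenefitDemo {
    public static void main(String[] args) {
        Benefit er = new Benefit(new EmergencyRoomVisit(),
                new BigDecimal("150"), new BigDecimal("0.20"));
        // A $1,000 ER bill: $150 copay + $200 coinsurance = 350.00
        System.out.println(er.memberCost(new BigDecimal("1000")));
    }
}
```

Notice that nothing here knows about tables, XML or message formats – the copay stays a copay no matter what architecture eventually surrounds it.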

What processes have you found to be successful? How have you improved your healthcare domain model? 

Could EMR software make healthcare worse?

This post is based on an interview posted on the IBM Impact Blog.  Visit us in the Industry Zone at IZ-4 or check out our IBM Impact landing page.

Widespread electronic medical record (EMR) adoption hinges on the hope that social and economic benefits will be realized through the reduction of information silos in medical record data.  Unfortunately, without interoperability, EMR adoption will only further strengthen the information silos that exist in today’s paper-based medical files.  This will result in even greater proprietary control over health information and, with it, control over patients themselves, which will greatly limit the innovative promise of future healthcare systems.  Public efforts to support adoption of EMRs that are not interoperable are questionable, since stand-alone EMRs might not increase the healthcare consumer’s welfare.  In contrast, EMR adoption enhanced by interoperable information will increase consumers’ welfare through increased choice, portability, and control.

Let’s not think about interoperability only in today’s terms.  Innovations in telehealth will place enormous demand on health information exchange, and the benefits delivered by interoperability will be beyond question.  Streaming real-time video interactions among physicians and between physicians and patients will be integrated into the EMR in some form and will require profound broadband capacity.  Monitoring of live-feed data from the homes of elderly or ill patients will tell clinicians and family members about medication use, mobility, food consumption, and other aspects of daily living.  Wearable or implantable medical devices with wireless feeds will report patients’ physiological status to physicians.  And that is not to mention inputs about weight and reaction times from automobiles, ambient environmental sampling data linked to one’s location by wearable global positioning system (GPS) devices or smart phones, exercise data from wearable heart monitors, insulin pumps, and sensors at one’s desk for other kinds of metrics.  All of these data collection devices need to be tied together with bidirectional standardized messages.

The Future of Interoperability in Healthcare

The future of interoperability is to bind together a wide network of real-time, life-critical data that not only transforms healthcare but becomes the new way of providing healthcare.  Despite the benefits of interoperability, they may be difficult to realize.  The reason for this is threefold:

  1. Interoperability benefits are highly dispersed across many stakeholders – doctors, patients, device manufacturers, and software developers to name a few.  Some parties could lose from disruption of long-standing industry practices, particularly vendors who rely on custom integration of their products for revenue and who use the lack of interoperability as a customer retention strategy.
  2. Early adopters bear a cost penalty, making it difficult for interoperability to establish momentum across broad geographies. Just like the fax machine, the last to install an interoperable EMR benefits from everyone else’s prior investment, while the first to install bears most of the cost.
  3. Interoperability technology heat seekers have faced many barriers and challenges that have resulted in partial success, slow progress, and outright failure. Adopting standards for interoperability could mitigate these costs and barriers. Interoperability may be beneficial and reduce costs, but it is certainly not easy.

Interoperability Must Precede EMR Adoption

The key question about interoperability is how it should proceed relative to EMR adoption.  My point of view is that interoperability has to precede EMR use.  It is based on the belief that the ability to share information has to be designed into EMRs, and that the infrastructure and industry capacity for securely networking this information has to exist up front.  There is a very real risk that the widespread adoption of stand-alone EMRs without interoperability is simply a lost opportunity, and one that may lead irreversibly to the treatment of health information as a proprietary asset of delivery systems.  If interoperability standards are not solidified and built into EMRs now, many large healthcare investments in EMR software will be wasted.

There are IT professionals who will argue that interoperability will follow widespread EMR adoption.  These individuals believe that once health information is electronic and everyone is using EMRs, interoperability will naturally follow, since it is easier and cheaper than manual data sharing.  They view up-front requirements for interoperability as too restrictive and think that standards will naturally evolve from the point-of-care information infrastructure that the United States healthcare system is building.

Key Benefits from Interoperability Implementation

This “chicken and egg” debate needs to take place immediately, while incentive dollars are being spent to get EMRs implemented, especially in medium to large healthcare organizations.  There needs to be clear thinking on when interoperability standards are brought into play during the EMR implementation.  Let’s not underestimate the true benefits of interoperability for EMR adoption.  Here are three key benefits to be realized through interoperability; the first one is a key consideration for organizations starting or in the middle of EMR implementations:

  1. It makes your Electronic Medical Record (EMR) software easier to implement. One of the challenges of implementing large-scale EMR software (Cerner, EPIC) is that the healthcare organization has to continue to operate during the implementation. An interoperability solution can take information from your existing systems that produce HL7 or ANSI X12 transactions, laboratory systems, ancillary systems or newer medical devices and feed that data to the new application. As a result, the data entry or data collection process to start-up or parallel test the new EMR system can be reduced and, in some cases, eliminated. The cost and time savings will more than justify the cost of the interoperability solution.
  2. It can collect near real-time transaction information for improved operations. Simply capturing the HL7 transactions being created and transmitted between your applications can provide valuable insights into daily operations (see the sketch after this list). The near real-time capture of HL7 transactions allows the creation of event notification systems that react to operational needs, patient safety events or quality initiatives.
  3. An interoperability solution will integrate current healthcare applications and future software acquisitions. Most modern healthcare applications speak HL7 standard messages or they can be interfaced to create HL7 or ANSI X12 messages. This “common language” allows healthcare organizations to integrate the disparate applications in their IT environment including registration systems, lab systems, core measure tracking, surgical software suites, and newer medical devices. In addition, the common language of electronic transactions allows integration with external organizations – vendors, other healthcare organizations, providers and national level organizations.
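As a sketch of the second benefit, here is what capturing an HL7 v2 transaction and reacting to it might look like using the open source HAPI parser; the message content and the alert condition are hypothetical:

```java
import ca.uhn.hl7v2.HL7Exception;
import ca.uhn.hl7v2.model.Message;
import ca.uhn.hl7v2.parser.PipeParser;
import ca.uhn.hl7v2.util.Terser;

public class EventNotifier {
    public static void main(String[] args) throws HL7Exception {
        // A hypothetical ADT^A01 (admit) message, pipe-delimited per HL7 v2.
        String adt = "MSH|^~\\&|REG|HOSP|EHR|HOSP|201301011200||ADT^A01|MSG0001|P|2.3\r"
                   + "PID|1||MRN-0042||DOE^JANE\r"
                   + "PV1|1|E\r"; // PV1-2 patient class 'E' = emergency

        Message message = new PipeParser().parse(adt);

        // Terser addresses fields by segment and position.
        Terser terser = new Terser(message);
        if ("E".equals(terser.get("/.PV1-2"))) {
            // A real system would publish an alert or route the event onward.
            System.out.println("Emergency admit: " + terser.get("/.PID-5-1"));
        }
    }
}
```

Capturing every transaction this way, as it flows between applications, is what turns the interface engine into a near real-time operational sensor.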

If interoperability can gain center stage, clinicians could have the information they need at the point of care, consumers (patients) would have choice and portability, payers would save money, and researchers would have better data.  Many healthcare organizations already own the toolsets for interoperability, or could pay for them out of the savings from improved productivity and reduced errors.  Let’s grab the golden ring of interoperability while we are on the merry-go-round of going electronic.

3 Reasons for using a Managed Private Cloud for Interoperability

Cloud computing is a popular topic in IT circles today, and with this year’s Interoperability Showcase at HIMSS, the cloud’s impact on interoperability will be an interesting topic of discussion.  In healthcare circles, cloud computing conjures up fears about protecting private healthcare information, along with other security concerns.  Yet there is a business case for a special type of cloud computing for healthcare called a Managed Private Cloud.  A Managed Private Cloud could address the security concerns and deliver:

  1. cost reduction
  2. the ability to scale, and
  3. better utilization of IT resources.

Cost reduction in healthcare organizations is clearly at the top of the list.  The cost reductions derived from cloud computing don’t come from new technologies, but from a combination of existing ones.  Virtualization drives higher utilization of resources and thus lowers capital expenses.  Standardization also lowers capital and labor costs, and automation reduces management costs by eliminating many manual tasks, especially those related to system integration and interoperability, along with their associated costs.  The other key aspect of cloud computing is availability and stability, both of which improve end-user satisfaction and reduce lost productivity.

The concept of a Managed Private Cloud is different from the public cloud that many people know from Amazon and Google.  A Managed Private Cloud has the advantages of cloud computing but is owned by the enterprise (your healthcare organization), is capable of running mission-critical applications, handles packaged applications and offers high security compliance because it is controlled by your organization.  But there is one big difference – a Managed Private Cloud is typically third-party operated, removing the challenge of managing the virtual infrastructure from your organization.  This is a very different way of receiving and using compute resources, and it is especially important when there are needs to scale up and down depending on compute demand.

Let’s examine a key application for the use of a Managed Private Cloud – system integration and interoperability.  Most healthcare organizations tackle this challenge by provisioning their own hardware, wrestling with operating systems, loading software and meeting infrastructure management challenges on a day-to-day basis.  By contrast, a Managed Private Cloud maintains the hardware infrastructure, the operating systems, the storage management and the infrastructure support activities.  This utility allows the customer to focus on the real task at hand – creating the connections between disparate systems and external partners.

Scalability is an important feature of the Managed Private Cloud.  The Managed Private Cloud can scale up a development, testing or next-version production environment as needed, on demand.  More importantly, it can scale down as well.  As integrated EMR applications take over from the siloed software systems that are currently integrated, the managed infrastructure can be reduced along with its associated costs.  Time savings are significant as well: better utilization of resources avoids waiting on the acquisition of new hardware or the loading of operating systems (provisioning) just to get a project started.  This environment is ideal for organizations that prefer to prototype new applications and then scale them up to production.

The traditional approach of pulling hardware resources together and deploying them in support of a single business function’s workload, essentially one project at a time, actually contributes to the silos that make interoperability costly for many healthcare organizations.  Moving away from that approach is itself a win.  More importantly, the Managed Private Cloud is designed for the 24x7x365 nature of healthcare, especially with a high degree of system integration.  That is a task that is very challenging for an internal IT team, especially in geographic areas where IT infrastructure skills are scarce or costly due to competition for talent.

A Managed Private Cloud is a better use of IT resources when a healthcare organization is considering a health information exchange (HIE), particularly if the HIE involves multiple hospitals, private practices and laboratories.  Cloud technology has unique advantages to support this need: economies of scale, better resource utilization and the security of private organizational ownership.  The connections from the Managed Private Cloud to the various units of the organization would be secure, private and yet always available.  The burden of infrastructure provisioning and day-to-day management of the environments wouldn’t fall on the largest hospital or organizational unit.

In summary, the Managed Private Cloud is a prime-time idea for healthcare: by design it can address security, compliance and other cloud concerns while delivering cost reductions, the ability to scale, and better utilization of IT resources.  Interoperability projects, especially HIEs, have a great affinity for the cloud computing model, both technically and from a risk/reward perspective.  Just as good interoperability is about adopting standards, standardizing the provisioning for IT projects will lead to better economics.

If you would like to discuss this idea or other healthcare-related topics, please stop by Perficient’s booth (#3681) at HIMSS on Monday, February 21, 2011 from 1:30 p.m. to 3:00 p.m. or Tuesday, February 22, 2011 from 2:00 – 5:00 p.m.  See you there!

“Help! We are getting a late start on 4010 to 5010 migration . . .”

If your healthcare organization is getting a late start on 4010 to 5010 migration, you are not alone.  While you should already be testing the new 5010 transaction sets with trading partners, you may just now be getting budgetary funding to pursue a solution.  At Perficient, we understand the challenge of catching up to meet regulatory deadlines, and we have created a solution through our partnership with Edifecs.  The approach is to create a “wrapper” around your inbound and outbound electronic transactions with what we call the Step Up/Step Down solution.  This solution intercepts inbound 5010 transactions and steps them down to 4010 for processing by your current healthcare IT applications.  When those same applications generate outbound 4010 transactions, they are stepped up to 5010 before being sent to your trading partners.
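Conceptually, the wrapper inspects the X12 interchange header to decide which way to map.  Here is a minimal Java sketch of that routing decision; the actual transaction mapping is left as a hypothetical stub, since a real implementation would use a mapping tool such as Edifecs:

```java
public class StepUpStepDown {

    // The 12th element of the X12 ISA segment carries the interchange
    // version: "00501" for 5010, "00401" for 4010A1.
    static String interchangeVersion(String x12) {
        String[] isa = x12.split("\\*");
        return isa.length > 12 ? isa[12].trim() : "";
    }

    // Inbound routing: step a 5010 interchange down to 4010 so the
    // existing core systems can process it unchanged.
    static String inbound(String x12) {
        if (interchangeVersion(x12).startsWith("00501")) {
            return stepDownTo4010(x12);
        }
        return x12; // already 4010 – pass through
    }

    // Hypothetical stub; a full implementation remaps every changed
    // segment and element of the transaction set, not just the version.
    static String stepDownTo4010(String x12) {
        return x12.replace("00501", "00401"); // illustrative only
    }

    public static void main(String[] args) {
        String sample = "ISA*00*          *00*          *ZZ*SUBMITTER     "
                + "*ZZ*RECEIVER      *130101*1200*^*00501*000000001*0*P*:~";
        System.out.println(interchangeVersion(inbound(sample))); // 00401
    }
}
```

The outbound direction is the mirror image: the core systems emit 4010, and the wrapper steps it up to 5010 before it reaches trading partners.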

This approach has many advantages for organizations that are short on time to implement a full 5010 remediation effort or simply have too many competing IT projects, like implementing a new EMR or claims processing system.  One of the key advantages is the ability to buy your organization time to remediate internal IT systems in a phased or systematic fashion, especially if you have limited IT personnel or budgets.  Another advantage is that the Step Up/Step Down solution will accept either 4010 or 5010 transactions inbound or outbound.  This capability provides flexibility as your trading partners work to meet regulatory deadlines as well.  The third advantage is that this solution keeps all of the 4010 and 5010 transactions in a repository within the tool.  This data repository allows the creation of reports to make sure that billing and accounts receivable are correct and balanced despite the migration.  We believe this key advantage helps the CFO sleep at night.

If your organization would benefit from any of these advantages in your 4010 to 5010 migration, please attend this week’s Perficient webinar entitled “HIPAA 4010A1 to 5010 Migration: Rapid Compliance with a Step-up/Step-Down Approach” on Thursday, January 27, 2011 12:00 PM – 1:00 PM CST.  We plan to discuss the details of this solution, the pros/cons and provide a live demonstration of the solution.  Don’t miss it – it could save your organization stress, time and money.  Register at: www.perficient.com/webinars


Posted in Interoperability

Is HIPAA 5010 Compliance making you feel like this guy?

Join us Thursday, January 27, 2011, 12:00 PM CT for HIPAA 4010A1 to 5010 Migration: Rapid Compliance with a Step-up/Step-Down Approach

Register at: www.perficient.com/webinars

As part of the final rule for HIPAA 5010 implementations, CMS has defined specific milestones that help healthcare organizations evaluate their progress towards compliance.  Many organizations have just now begun the work needed to achieve Level 1 compliance — the ability of an organization to test externally by January 1, 2011.  This entails an organization sending and receiving HIPAA electronic transactions without impacting payments to providers.

Healthcare organizations are looking for ways to achieve HIPAA 5010 compliance while controlling scope, cost and risk. A “Step-Up/Step-Down” strategy is a viable approach to achieving this. With a step approach, organizations accept a valid inbound 5010 transaction, map to their version of the 4010A1 (the “Step-Down”) and allow the current healthcare applications to process the transaction as they do today. For outbound transactions, the solution will take the file as processed by the core systems and map back to the 5010 version (the “Step-Up”), leveraging data from the original transaction.

Join Perficient as we discuss how a Step-Up/Step-Down implementation approach can help meet urgent migration needs:

  • The business case for Step-Up/Step-Down 5010 migration as a cost-effective solution that allows application remediation for ICD-10 in controlled phases
  • An assessment of a market-leading EDI 5010 software product that supports Step-Up/Step-Down for rapid implementation
  • Justification for the investment, with examples of how EDI 5010 can address other needs within the healthcare organization

Presenter Martin Sizemore is a Senior Solutions Architect of Healthcare Solutions for Perficient and has been a trusted advisor to CEOs, COOs, CIOs and senior managers for global multi-national companies and healthcare organizations.

Register at: www.perficient.com/webinars


Posted in Interoperability, News

The Digital Nervous System for Healthcare Providers

Each of us has stubbed a toe and waited for the signal from the nerve in the toe to reach the brain for the final “ouch.”  That analogy helps us understand the nature of a near real-time system of messages that helps us avoid pain in daily operations.  The typical healthcare provider has dozens of healthcare applications, outside vendors and even medical devices that produce HL7 messages.  Those HL7 messages are the key to developing a digital nervous system for healthcare providers and creating a real interoperability backbone between multiple clinical systems and the decision-making process.

The challenge is developing the nervous system that carries these messages from the point of origination to all of the places the whole organization needs them.  Keeping with our analogy, we need a spinal cord to carry the messages to the brain for decisions.  In addition, a nervous system needs feedback loops for critical messages.  The brain has little to do with the decision to immediately draw back a hand touching a hot stove – it is a quick reflex.  In a healthcare environment, there are often messages that need that same type of immediate response to avoid an adverse reaction and protect patient safety.

Read the rest of this post »