Perficient Healthcare Solutions Blog


Mike Lynch


Give me agility or give me death!

Today’s enterprise application ecosystems have evolved over many years, often resulting in fragmented, disjointed application portfolios and systems. IT is responsible for maintaining the existing business applications, functionality and infrastructure that support daily operations while also addressing the evolution of future business needs. To meet timelines and business initiatives quickly while managing resource contention and financial constraints, solutions are often cobbled together. Delivery managers are under great pressure to deliver on business needs or face becoming irrelevant.

As the velocity of change increases, IT organizations must embrace change. To understand this change, one need look no further than Facebook, Google and Apple. Social media, data and mobile applications are changing the way we develop and maintain relationships in our personal lives and are now driving the way businesses deliver products, services and information. New collaborative ecosystems that drive agility are the norm in modern consumer applications. Applications seamlessly connect to other applications to create new ones. For example, take Google Maps: a standalone user application that also provides an interface allowing other developers to leverage the platform to create new location-based applications. Applications built upon other applications create a web of composite applications called ‘mashups.’

Key enablers of mashups include infrastructure commoditization, service standardization, service reuse and outsourcing. All of this allows economies of scale to be realized and ultimately exploited. At the foundation are cloud infrastructure providers like Amazon and Rackspace. When Amazon first introduced its cloud platform, the Elastic Compute Cloud (EC2), in August of 2006, I immediately realized this was a game changer. While this aspect of the new technology paradigm is not as consumer-recognizable as Android or the iPhone, it is one of the key technology enablers of today’s app-centric world.

Cloud services give small, innovative startups access to world-class infrastructure and platform scalability without having to invest in hardware. If you have an idea, you can build it, deploy it and scale it on demand, paying only for what you use. If you have spikes in usage, say, during business hours, you pay only for the resources you actually consume. In a typical datacenter, maximizing hardware utilization is a difficult challenge: infrastructure must be sized to handle peak loads but sits underutilized during non-peak hours, consuming datacenter power and space. In addition, if you own it, you have to maintain it, manage it and eventually replace it at end of life. With cloud service providers, you no longer need to maintain a physical data center.

Expectations of agility, flexibility and speed of delivery are driven by how pervasive this new technology has become in our personal lives. These new platforms are fun and cool, but most importantly they are productive. We use these app-driven devices in our personal lives, and eventually business will demand that IT deliver to the same standards. For example, take the RIM BlackBerry and the Apple iPhone: five years ago, when the first iPhone was rolled out, IT was hesitant to support it and the BlackBerry was king.

Similarly, look at the new iPad and Android tablets. Microsoft rules the desktop, but how much longer will the desktop be the prevailing platform? On a side note, I find it interesting that we have spent the better part of the last 20 years moving off the mainframe and now we are moving back toward a similar model. The back-end architectures are not the same, but the paradigms are similar.

Ultimately, IT must evolve and embrace this new reality because, at the end of the day, business survival likely depends on delivering technology services to clients, service providers and partners. If in-house IT cannot enable business innovation, the business will look to external partners that can effectively enable business agility.

Protecting patient data

As our healthcare systems become increasingly connected and interdependent, protecting the privacy and integrity of patient data is critical. As health information exchanges (HIE), regional HIEs (RHIE) and health information service providers (HISP) become more prevalent, following security best practices when exchanging data with external partners should be a key objective. Public Key Infrastructure, or PKI, is the technology used to ensure that healthcare data is protected while being transported between partners over the Internet.

PKI encryption/decryption services provide three functions for the secure exchange of healthcare data:

  1. Encryption/decryption of the document being exchanged – This is the encryption of the actual data being exchanged, which could be a Word document, a text note, an HL7 message or a JPG image.
  2. Encryption/decryption of the transport – The transport is essentially the pipe the data is sent over. This layer carries login information, cookies and other environment information that would otherwise be exposed if the transport were not secure.
  3. Verification that data received was not altered in transit – The sender generates what is called a ‘hash,’ included with the encrypted document, that allows the receiver to verify the document was not altered.
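As a toy sketch of the third function, an integrity tag can be generated and checked with Python’s standard hmac and hashlib modules. This is a simplification: real PKI exchanges use public-key signatures rather than a shared secret key, and the function names and sample payload here are illustrative only.

```python
import hashlib
import hmac
import secrets

def tag_document(document: bytes, key: bytes) -> bytes:
    """Produce a keyed hash (HMAC-SHA256) that travels with the document."""
    return hmac.new(key, document, hashlib.sha256).digest()

def verify_document(document: bytes, tag: bytes, key: bytes) -> bool:
    """Recompute the keyed hash on receipt and compare in constant time."""
    expected = hmac.new(key, document, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

key = secrets.token_bytes(32)        # shared secret, for illustration only
doc = b"HL7 message payload"
tag = tag_document(doc, key)

assert verify_document(doc, tag, key)             # unaltered: check passes
assert not verify_document(doc + b"x", tag, key)  # altered in transit: fails
```

The constant-time comparison (`hmac.compare_digest`) matters in practice: a naive `==` comparison can leak timing information to an attacker.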

PKI is the technology that provides the security foundation for internet-based services such as online banking and shopping. Prior to the adoption of PKI, most EDI transactions were sent through a value-added network (VAN). In the “old days” before the internet, partners exchanged X12 EDI data with each other over dial-up connections through a VAN intermediary. As the internet came of age and security technology and practices evolved, the VANs faded away much like the vinyl record, and partners began connecting directly over the internet.

All of this is based on the PKI “trust” model. If Alice wants to exchange electronic documents with Bob, both parties need to establish their identities with a third party called a certificate authority, or CA. The CA establishes that both Alice and Bob are who they claim to be. In addition, Alice and Bob trust the CA, so by extension they trust each other. Now when Alice sends a document to Bob, she includes her credentials along with those of her CA. Bob’s system verifies the validity of both Alice’s and the CA’s credentials to determine that the document was in fact sent by Alice.
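The chain of trust can be sketched as a toy model in Python. The dictionary “certificates” and keyed-hash “signatures” below are simplified stand-ins for real X.509 certificates and public-key signatures (in real PKI, Bob verifies the CA’s signature with the CA’s public key, not a shared secret); all names and keys are illustrative.

```python
import hashlib
import hmac

CA_KEY = b"ca-signing-key"  # stand-in for the CA's real key pair

def ca_issue(subject: str, subject_key: str) -> dict:
    """The CA vouches for a subject by signing its name and key together."""
    payload = f"{subject}:{subject_key}".encode()
    signature = hmac.new(CA_KEY, payload, hashlib.sha256).hexdigest()
    return {"subject": subject, "key": subject_key, "ca_signature": signature}

def verify_cert(cert: dict) -> bool:
    """A relying party checks that the CA's signature matches the contents."""
    payload = f"{cert['subject']}:{cert['key']}".encode()
    expected = hmac.new(CA_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert["ca_signature"])

alice_cert = ca_issue("Alice", "alice-public-key")
assert verify_cert(alice_cert)   # Bob trusts the CA, so he trusts Alice

tampered = dict(alice_cert, subject="Mallory")
assert not verify_cert(tampered) # a forged identity fails verification
```

The point of the sketch is the indirection: Bob never has to verify Alice directly; he only has to check that a CA he already trusts vouched for her.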

When Alice sends a document to Bob, she first generates a “hash” that is mathematically linked to the document being sent. The document and hash are encrypted and sent to Bob via an encrypted transport over the internet. When Bob receives the document and hash, he decrypts the document, then computes the hash himself and compares it to the one received. If the hash calculated by Bob matches the one provided with the document sent by Alice, the document has not been altered.
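Bob’s hash check can be sketched with Python’s standard hashlib module. SHA-256 is chosen here as an illustrative algorithm, and the sample document text is made up; the decryption step is omitted to keep the sketch focused on the comparison.

```python
import hashlib

def document_hash(document: bytes) -> str:
    """Compute a fixed-length fingerprint mathematically linked to the document."""
    return hashlib.sha256(document).hexdigest()

# Alice's side: compute the hash and send it along with the document.
document = b"Discharge summary, encounter 12345"
sent_hash = document_hash(document)

# Bob's side: recompute the hash over what was received and compare.
received = document
assert document_hash(received) == sent_hash   # match: not altered

altered = b"Discharge summary, encounter 99999"
assert document_hash(altered) != sent_hash    # mismatch: altered in transit
```

Even a one-byte change in the document produces a completely different hash, which is what makes the comparison a reliable tamper check.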

To deliver the encrypted document from Alice to Bob, many different mechanisms could be used. For instance, the encrypted file could be sent via secure FTP, secure HTTP (HTTPS) or over a VPN connection. The important point is that the encrypted data, when transmitted over the internet, must be sent over an encrypted transport. Each of the transports referenced above provides encryption of the data in transit. This protects any sensitive information the sender must provide for authentication and authorization on the remote system, such as passwords, user names and IDs.
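Setting up the client side of such an encrypted transport can be sketched with Python’s standard ssl module. No connection is made here, and the commented-out host name is a made-up example; the sketch just shows that the default TLS context already enforces the checks described above.

```python
import ssl

# Build a TLS client context using the platform's trusted CA certificates.
context = ssl.create_default_context()

# The defaults verify the server's certificate and its host name, so
# credentials (passwords, user IDs) are only ever sent to the intended peer.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# To use it, wrap a TCP socket before sending anything sensitive, e.g.:
#   with socket.create_connection(("partner.example.org", 443)) as sock:
#       with context.wrap_socket(sock, server_hostname="partner.example.org") as tls:
#           tls.sendall(encrypted_document)
```

Lowering either setting (e.g. `verify_mode = ssl.CERT_NONE`) still encrypts the pipe but removes the identity check, which is exactly the false sense of security the trust model is meant to prevent.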

The three key PKI elements required for secure healthcare data exchange over the internet are described above and should provide a basic understanding of the core principles of PKI. While this is a high-level description of encryption/decryption, a basic understanding of these components is important: if any one of the three is omitted, the security and integrity of the healthcare data being protected could be jeopardized. It can be very easy in today’s fast-paced, technology-driven environment to be lulled into a false sense of security by implementing a technical solution that fails to address the complete scope of the necessary requirements. At a minimum, the risks of excluding a necessary component should be understood so that the appropriate business decision can be made.

Interoperability Infrastructures: Enabling healthcare quality measures

There is no doubt that the healthcare industry is going through a tremendous paradigm shift in an attempt to get a grasp on rising costs and inefficiencies. From ACOs and pay-for-performance to Meaningful Use, quality measures and patient access to electronic health records, the fundamental enabler of this vision is an effective interoperability infrastructure.

One thing that is certain: the demand for sharing data across enterprises, partners, government agencies and patients will increase. Significant effort has been made in recent years by labs and pharmacies to integrate with EMRs ahead of meaningful use incentives and state health information exchange (HIE) implementations. Meaningful use, government regulation, reporting, the shift to outcome-based payment, implementation of new EMRs, managing costs, as well as mergers and acquisitions, will place a much higher demand on IS departments to provide greater interoperability capabilities to both internal and external customers.

The Path to Success

Achieving success in realizing actionable informatics and making a significant difference in managing costs will in part be driven by the ability of an organization to implement a robust, agile interoperability infrastructure. Assessing all the aspects of implementing a comprehensive infrastructure can be overwhelming. While the destination should be well understood from the outset, the journey should be more forgiving, allowing the organization to grow its understanding of the challenges of implementing enterprise interoperability solutions.

For organizations implementing enterprise-level interoperability solutions, it is beneficial to take an evolution-versus-revolution approach. From the outset, a clear long-term strategic direction should be understood and defined. The path toward achieving the strategic goal should be tracked closely, allowing for necessary adjustments over time. While the strategic direction is significant, it is the short-term tactical projects that will provide value today by addressing current enterprise needs for providing services and system capabilities to internal and external users.