Microsoft Enterprise Technologies
Perficient is proud to be partnered with Microsoft

Webinar: Ensuring a Successful SharePoint Migration to the Cloud
Thu, 24 Jul 2014 12:15:37 +0000

Last week, at Microsoft’s Worldwide Partner Conference, it was mentioned that there are one billion Microsoft Windows and Office users today, and that Office 365 is the fastest-growing business in Microsoft’s history. At Perficient, we are definitely seeing this. Our customers are using Office 365 more than ever, from SharePoint to Exchange and Lync to Yammer and Power BI.

That being said, if you are considering moving your on-premises SharePoint environment to the cloud, you aren’t alone. There are plenty of reasons to make the move: with Office 365, SharePoint is easy to manage, has enhanced security, and is accessible from just about anywhere. Beyond that, you get OneDrive for Business, you can extend the collaborative nature of SharePoint with Yammer, and you can find tons of SharePoint apps in the Office Store to extend functionality.

If your company is considering such a move, and you want to learn more, join us on Thursday, August 14, at 1 p.m. CT for a webinar, Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud. During the session, Jason Bell, senior solution architect and SharePoint guru at Perficient, will show you how to make the migration process less daunting, including key details needed for a successful migration to Office 365. He’ll also cover:

  • Top reasons to move from on-premises to SharePoint Online
  • Challenges and technical considerations when migrating to the cloud
  • Options for migrating to Office 365/SharePoint Online
  • Best practices for secure cloud computing with SharePoint Online

In the meantime, our team has been hard at work on some comprehensive blog posts on the topic of migrations to Office 365. You can find them at our Microsoft blog.

To register for the webinar, click here.
Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud
Thursday, August 14, 2014
1:00 p.m. CT

Partner Spotlight – AvePoint Online Services for Office 365
Tue, 22 Jul 2014 21:23:36 +0000

Perficient has many great partners that support our development and deployment of the best-of-breed solutions we provide for our clients. This post is the first in a series highlighting some of the products available from our partners. Today, I’ll be presenting AvePoint and their Online Services for Office 365.

Requiring no installation and no agents, AvePoint Online Services is the industry’s first and only 100% Microsoft Azure-based Software-as-a-Service (SaaS) platform for Office 365. With simplified Office 365 administration, compliance, and governance, AvePoint Online Services empowers organizations to extend cloud computing as dictated by specific business needs.

There are three main components to the solution:

Office 365 Management

Offering granular content protection for SharePoint Online, OneDrive for Business, and Exchange Online, as well as comprehensive configuration and audit reporting, DocAve Online provides administrators enhanced management and control of Office 365 users, permissions, and content.

Manage. Protect. Report.

  • Enhance management with DocAve Online’s unmatched controls for Office 365 security, configurations, and content.
  • Ensure data loss protection with quick, granular restore and easily reorganize content to accommodate enterprise scaling.
  • Optimize platform performance with real-time publication and synchronization. Gain valuable insights with customizable reports, allowing for comprehensive understanding of SharePoint Online environments.

More information

Office 365 Compliance

AvePoint Compliance Guardian Online helps compliance and security officers minimize risk of privacy violations, malicious threats, or unintended data leakage for web applications, websites, and cloud platforms.

Privacy. Accessibility. Risk.

  • Enforce compliance and ensure privacy of your organization’s web applications and cloud initiatives with an effective risk management lifecycle.
  • Proactively protect your cloud environments from information leaks, contamination, or misuse by implementing and automating the enforcement of governance policies.
  • Ensure compliance with Compliance Guardian Online monitoring and incident tracking.
  • Mitigate risk across your cloud information gateways to maintain content and system privacy, security, quality, and accessibility.

More information

Office 365 Data Governance

DocAve Governance Automation Online enables data governance by providing your SharePoint Online users with a business-centric, role-based service catalog to request configurable services.

Provisioning. Lifecycle Management. Security.

  • Enable proactive data governance with DocAve Governance Automation.
  • Provide your SharePoint Online users with a business-centric, role-based service catalog to request configurable services.
  • Maintain control of SharePoint sites, ensuring all actions or changes to Office 365 – SharePoint Online environments fall within your organization’s defined governance policies.

More information

The Avenue to Microsoft’s Cloud: All Microsoft, All the Way

  • One platform. Unified services for information workers, decision makers, and IT administrators
  • 100% Microsoft Azure-based Software-as-a-Service (SaaS) platform for Office 365
  • Requires no installation. Requires no agents.

More information



For more information, feel free to contact AvePoint directly. Or contact us here at Perficient and we can help you with your Office 365 implementation.


Partner Spotlight – Metalogix Diagnostic Manager
Tue, 22 Jul 2014 21:21:22 +0000

Perficient has many great partners that support our development and deployment of the best-of-breed solutions we provide for our clients. This post is the second in a series highlighting some of the products available from our partners. Today, I’ll be presenting Metalogix and their Diagnostic Manager for SharePoint.

Metalogix Diagnostic Manager ensures SharePoint performance and availability at all times. It monitors SharePoint content and servers from a single console, quickly identifying, diagnosing, and resolving problems before users even notice. It also provides critical planning information by storing size, count, and performance data for both servers and stored objects. Diagnostic Manager also monitors SharePoint in real time, enabling administrators to quickly pinpoint performance and availability issues with servers, resources, HTML controls, web parts, and web controls.

Key Features

  • Preventative Care, Not Just Emergency Care

Diagnostic Manager provides constant insight into your server and content performance. It continuously monitors, diagnoses, discovers, analyzes, and resolves SharePoint performance and availability issues.

  • Diagnose Without Delay

Access historical data, alerts, performance information, and more, even when SharePoint is down. Metalogix Diagnostic Manager eliminates time lost waiting for more information and lets administrators start work as soon as a problem occurs. It can also send alerts via email to keep administrators aware of problems, even offsite.

  • Take control of your SharePoint health

Continuously monitor, diagnose, discover, analyze, and fix SharePoint performance and availability issues efficiently. Quickly understand why your SharePoint servers are not performing well or why pages are taking so long to load.

  • Out-Of-The-Box Monitoring

Instant and automatic monitoring of all SharePoint farm servers without the need to develop any kind of script. Built-in defaults can start monitoring your SharePoint servers immediately.

Highlights and Benefits

  • Continuous SharePoint monitoring: 24×7 SharePoint farm monitoring. Performance data is captured for key SharePoint pages and servers and stored in a database for historical analysis.
  • One console, many farms: Multiple SharePoint farms monitored from a single console with complete server and configuration details such as CPU, disk activity, IIS, memory, and page utilization.
  • SQL Server monitoring: Detailed configuration options, database fragmentation, and capacity planning data.
  • Full web page coverage: Granular, server-side performance data analysis for all SharePoint publishing and web part pages, including server controls, web parts, images, and scripts.
  • Unlimited pages: No restriction on the number of pages monitored.
  • Top server and top pages reporting: Metalogix Diagnostic Manager enables you to quickly identify the worst-performing pages (Top Pages) or servers (Top Servers).
  • Farm content summary: Farm content reporting includes counts, sizes, and utilization of objects.
  • Capacity planning data: Rich, graphical analyses of patterns and trends.
  • Easy-to-use interface: A powerful and easy-to-use console enables complex analyses to be displayed and understood in seconds, without the need to be a SharePoint expert.
  • Dashboard views: Metalogix’s intuitive dashboard UI provides a quick view of SharePoint server health, along with diagnostic and resolution information.


Comprehensive Reporting

Metalogix Diagnostic Manager provides reporting at the summary, server and page levels to provide rapid status on the SharePoint environment. Summary and detail performance metrics provide a comprehensive view at the server, content, and administrative levels.

Complete SQL Server Monitoring

Diagnostic Manager provides full SQL Server insight, including details on database configuration, fragmentation, and allocation. It also provides critical capacity planning using data and log file growth trends.

Clean Data For True Testing

Clear server-side performance metrics for accurate test data and diagnosis, without interference from browser, geo, OS, or connection.

Collects Component-level results

Powerful collection of page control elements that clearly differentiates page loading, execution, or rendering of individual page elements. Identify the worst performing HTML controls, web parts and web controls by load time, size and type.

Get a view into the data stored in SharePoint

Diagnostic Manager’s farm content reporting captures object size, count, and growth information, enabling capacity planning and effective forecasting of trouble spots.

Threshold Customization

Customized server level thresholds such as CPU, Memory, Disk, IIS, page HTML, web parts and web control alerting to meet unique performance requirements.

Granular Alert Controls

Set up rules to control the response to alerts. Deploy alert emails only to those with responsibility for particular farms, servers or performance metrics or apply business notification criteria.

SNMP Support

Diagnostic Manager’s ability to generate SNMP traps in response to alerts allows integration with enterprise monitoring solutions such as Microsoft SCOM.

For more information, contact Metalogix directly, or contact us here at Perficient and we can help you with your SharePoint and Metalogix deployments.

Readmissions Analysis using Epic’s Cogito and Microsoft Tools
Tue, 22 Jul 2014 14:04:37 +0000

One of the myriad new requirements tucked inside the Affordable Care Act is for healthcare service providers to implement strategies to reduce the number of inpatient readmissions, which in many cases are deemed to be costly and indicative of poor quality of care.

One way to drive such a reduction strategy is to enable analysts and providers with business intelligence tools that put various readmissions metrics at their fingertips.  Additional value is garnered when those metrics can be filtered, sliced, diced and compared against a number of useful dimensional attributes.  Developing and automating such tools helps business users avoid having to write monotonous queries, piece together disparate data from various sources, and manually compile things like month end readmission rates.

To accomplish this goal at a recent client engagement, as a member of a larger Perficient consulting team, I chose to build a Microsoft SSAS Tabular Model, a new feature of SQL Server 2012, paired with Power View to enable a self-service BI visualization layer.  Additionally, this particular client chose to leverage the fairly new Epic Cogito Data Warehouse (CDW), and thus the semantic and visualization layers were built on top of that existing data model.

A tabular model was chosen in lieu of an OLAP cube for a few reasons. The engine that runs tabular models is columnar-based and fully in-memory; in short, this means queries execute extremely fast. Additionally, tabular models tend to be simpler and faster to develop than cubes, which is good for future maintenance and extensibility. Finally, tabular models offer the bulk of features expected from a multidimensional data source.

Early client conversations were organized into three main topics:

  • Measures: all the various types of calculations needed regarding readmissions
  • Dimensional attributes: all the ways the client would like to filter, slice, and aggregate those calculations
  • Security: the various audiences that will access this data, and what their level of granularity should be, especially regarding HIPAA compliance, patient- and provider-level detail, etc.

Once I felt confident with the requirements I’d gathered, I started reverse engineering the Cogito DW to figure out which fact and dimension tables I would need to leverage.

Inside the tabular model, I decided to go with two customized fact tables: one at the encounter/admission grain, and one at the readmission grain. This second fact table involved self-joining encounters back onto themselves on patient ID and building some date logic to include only those patients readmitted within 30 days. From there, I developed around 40 DAX calculations that performed rollups in different ways. Examples include various permutations of admission counts, discharge counts, distinct patient counts, all-cause readmissions, cause readmissions, readmission rates, non-readmitting discharges, and readmission percent of total.
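That self-join and 30-day date logic can be sketched in ordinary code; the record layout and names below are illustrative, not the actual Cogito DW schema or the DAX used on the project:

```python
from datetime import date

# Hypothetical encounter records: (patient_id, admit_date, discharge_date).
encounters = [
    ("P1", date(2014, 1, 5),  date(2014, 1, 10)),
    ("P1", date(2014, 2, 1),  date(2014, 2, 4)),   # 22 days after discharge
    ("P2", date(2014, 1, 15), date(2014, 1, 20)),
    ("P2", date(2014, 4, 1),  date(2014, 4, 3)),   # 71 days: outside the window
]

def readmissions_within(encounters, days=30):
    """Self-join encounters on patient ID, keeping each later admission
    that falls within `days` of an earlier discharge."""
    pairs = []
    for pid, _, discharge in encounters:
        for pid2, admit2, _ in encounters:
            if pid == pid2 and admit2 > discharge and (admit2 - discharge).days <= days:
                pairs.append((pid, discharge, admit2))
    return pairs

print(readmissions_within(encounters))   # one qualifying readmission, for P1
```

In the real model, this pair list becomes the readmission-grain fact table, and the DAX rollups (counts, rates, percent of total) aggregate over it.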

For dimensional attributes, many existed as part of Cogito DW natively, and it was just a matter of trimming records down to inpatients and newborns only. Examples of such dimensions include date, department, patient, provider, coverage, primary diagnosis, DRG, billing account service profile, and admission profile.

Some dimensional attributes were trickier, however. Some were not part of the native CDW and therefore had to be added as extension tables in the data warehouse first. Some, such as all discharging diagnoses, had many-to-many relationships to fact records and therefore had to be massaged into comma-delimited lists that became one-to-many related. Finally, some attributes incorporated extensive business logic, for example Unit, which was based on an AdmissionDischargeTransfer fact table not part of the native CDW.
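The comma-delimited flattening can be sketched like this (encounter IDs and diagnosis codes are made up for illustration):

```python
# Collapse a many-to-many encounter/diagnosis bridge into one
# comma-delimited attribute per encounter, making it 1-to-many related.
diagnosis_rows = [
    (1001, "I50.9"),
    (1001, "E11.9"),
    (1002, "J18.9"),
]

def flatten_diagnoses(rows):
    grouped = {}
    for encounter_id, code in rows:
        grouped.setdefault(encounter_id, []).append(code)
    return {enc: ", ".join(sorted(codes)) for enc, codes in grouped.items()}

print(flatten_diagnoses(diagnosis_rows))
# {1001: 'E11.9, I50.9', 1002: 'J18.9'}
```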

The final security implementation could likely be an entire blog article on its own, but in short, the strategy involved slightly customizing fact tables for different audiences, creating limited attribute dimension tables, varying relationships per audience, limiting certain calculations, and leveraging perspectives for an overall clean user experience.

The final readmissions tabular model was leveraged via SharePoint in multiple ways: Power View dashboards we developed (along with the “self-service” capability for business users to create their own), SSRS integrated reports, and direct model browsing in Excel with classic pivot tables. The platform was very positively received by our client, and I delivered a series of training sessions to enable members of their internal IT team to build tabular models on their own.



Finally, here’s a screenshot of the final tabular model, resembling a general star schema:

Readmissions Tabular Model 2014-05-16

Kapow to Sitecore Migration: Part 2
Thu, 17 Jul 2014 15:08:11 +0000

In my previous Kapow migration post, I gave an overview of the tool. In this post, I’ll give a short technical explanation of the migration process I developed. Keep in mind that my upload target was Sitecore, so some of my setup was Sitecore-specific.

First, an inventory of all the current pages in the site must be taken. For our site, these pages were grouped according to Sitecore template, and the URL of each page was loaded into spreadsheets. So I had eight spreadsheets with names like “FAQ”, “Video”, and “LandingPage”, correlating with similarly named Sitecore templates. My spreadsheets had the following layout:

  • First column: the page URL of the current site, which Kapow reads to load the page I wanted to extract from (Kapow crawls HTML to extract data)
  • Second column: the new site’s URL, so I knew where to load the finished data
  • Third column: the name of the left menu associated with the new page, so that I could associate the correct menu with each page

Second comes the data extraction. The first step here is setting up the data structure. In my opinion, this is very much like designing database tables for any project: recurring elements are grouped into their own tables. The tables in Kapow are called types. This is one of my types, a recurring element that had an image, text, and a URL:


A few things that I think are key to setting up usable types:

  • Certain common fields are indispensable. All my types had a SourceURL (the old page), and a TargetURL (the new page). This helped me trace back to the old page when I was verifying data or troubleshooting.
  • Another common field was a MigrationStatus. This helped me track where I was in the process – extraction, transformation, or upload.
  • It helps to plan ahead. I added the ItemGuid to this type even though it had nothing to do with migration until the very last step. This field contained the unique Sitecore identifier for a piece of content that wasn’t populated until the clean data was loaded into Sitecore in the upload step. This proved invaluable in troubleshooting.
  • For any field that needs to be cleaned up, I included two fields – the original field and the transformed field. (Note the ChicletText and ChicletTextTransformed fields, above.) This allowed me to check a transformed field against its source. It also allowed me to rerun just my transformation process, since I still had the original field from the extraction.

One difficult issue will be familiar to anyone who has designed a database – which data to break out into separate tables and how to link them. Because Kapow doesn’t give direct control over SQL Server updates, I found this trickier than usual. Once again, an example:

Many different types of pages had image banners at the top. It made sense to have a PageBanner type. But I needed a way to link that PageBanner record back to its parent page. Kapow does have the concept of a foreign key, but because of the complexity of our data, I opted to use the SourceURL to link parent and children. This worked well and I would do it that way again. Kapow also provides an Iterator as a built-in variable type for a loop command, so if there were multiple children that had to be placed in order (think of a slideshow, for instance), using this Iterator to sequence the data worked well.

Once the data structure is defined, the robot-making process can begin. A robot is a series of actions chained together. At its simplest, it looks like this:

  • Do a Load page action. Read the URL from the spreadsheet and load the associated page into Kapow’s built-in browser.
  • Perform an extract action. Determine what you need to extract. If it’s a single field, like text on a page, right-click it and load it directly into the type you’ve defined. If you need to loop through data (like FAQs, e.g.), you can use one of Kapow’s looping constructs. In all cases, you need to determine how Kapow will find your data reliably – whether it’s by a named div, a unique CSS class, or position in a table.
  • Do a Store in Database action. Kapow uses the type to create a table if necessary and stores a record in the database.
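Kapow builds this loop visually, but as a rough sketch in code, the load → extract → store cycle looks something like the following. The CSV layout matches the spreadsheet described earlier; `extract_page` and the table name are stand-ins, not Kapow APIs:

```python
import csv
import sqlite3

def run_robot(csv_path, extract_page, db_path):
    """Read (source URL, target URL, menu) rows from the inventory
    spreadsheet, extract each page, and store one record per page."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS page
                   (source_url TEXT, target_url TEXT, menu TEXT, body TEXT)""")
    with open(csv_path, newline="") as f:
        for source_url, target_url, menu in csv.reader(f):
            body = extract_page(source_url)   # the Load Page + extract actions
            con.execute("INSERT INTO page VALUES (?, ?, ?, ?)",
                        (source_url, target_url, menu, body))
    con.commit()
    con.close()
```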

Run the robot and Kapow loops through all the rows in the spreadsheet, extracting data and storing it in the database. The above example is simplistic, of course. Kapow has many actions, from assigning a variable to performing a test to storing in a database. Here is a partial list:


For anyone familiar with the basics of programming, configuring an action is really a matter of figuring out Kapow’s method of using a given construct. Following is an example of an if/else action, which won’t seem foreign at all to a developer:


The third step is Transformation – cleanup of the data. There are really two parts to this process: figuring out what transformations need to occur and then implementing those transformations.

Even the most careful analysis does not always uncover every transformation that must occur. For example, we know about the following two path transformations:

/sites/oldsite/BannerImageRotatorImageLibrary to /newsite/Images/Banners

/sites/oldsite/FileLibrary to /newsite/Files

We build our transformations. Now our analyst comes along and apologetically explains that one more transformation has been discovered in some dark corner of the site. Sigh. We have to go back and change all of our transformation robots to include the new cleanup item. But there is a solution to this problem: it’s called a snippet. A snippet is a set of steps that can be set up once and reused throughout other robots. If all of my path transformations are included in a snippet, adding one more to the snippet updates those transformations throughout the site. The snippet is highlighted in the image below.
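To make the snippet idea concrete, here is a rough code equivalent: a single shared mapping that every transformation robot applies, so a newly discovered path only has to be added in one place. The two paths are from the example above; the function name is mine:

```python
# Shared path transformations: the code analogue of a Kapow snippet.
PATH_MAP = {
    "/sites/oldsite/BannerImageRotatorImageLibrary": "/newsite/Images/Banners",
    "/sites/oldsite/FileLibrary": "/newsite/Files",
}

def transform_paths(text, path_map=PATH_MAP):
    """Apply every known path transformation to a block of page text."""
    for old_path, new_path in path_map.items():
        text = text.replace(old_path, new_path)
    return text

print(transform_paths('<img src="/sites/oldsite/FileLibrary/a.pdf">'))
# <img src="/newsite/Files/a.pdf">
```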


Kapow makes the process of transformation easier through the use of regular expressions. My personal favorite, however, is the Data Converter. This allows chaining of commands, passing in the output of one command as the input of the next command. A simple example follows:
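As a rough sketch in code, the chaining idea (the output of each converter feeding the next) looks like this; the converter functions here are illustrative, not Kapow’s built-in converters:

```python
def chain(value, *converters):
    """Pass a value through a pipeline of converters, Data Converter style."""
    for convert in converters:
        value = convert(value)
    return value

print(chain("  /Sites/OldSite/page.ASPX  ",
            str.strip,
            str.lower,
            lambda s: s.replace("/sites/oldsite", "/newsite")))
# /newsite/page.aspx
```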


When we’re finished, we have clean extracted data in database tables. The only thing left to do is upload it to our new Sitecore site. Stay tuned.

Perficient wins Microsoft Partner Awards in all 3 US Regions!
Wed, 16 Jul 2014 20:34:40 +0000

The Perficient team is out in force in Washington, DC this week attending Microsoft’s Worldwide Partner Conference (WPC14). At the conference, the team was honored to receive Microsoft Partner of the Year awards in every Microsoft US region. This was a big honor, building on last year’s US Partner of the Year and US Healthcare Provider Partner of the Year awards. Thank you, Microsoft; we appreciate the partnership and value the recognition! From the news release:

Members of the Perficient team getting ready to accept Partner of the Year awards in every Microsoft US region

For the second year in a row, Perficient was named both the East Region NSI Partner of the Year and the Central Region Enterprise Office 365 Partner of the Year. Additionally, the company was declared the West Region Compete Partner of the Year. These awards highlight Perficient’s capabilities in and successful implementations of Microsoft technology solutions, including cloud computing technologies like Office 365, Microsoft Azure, Lync Online, Yammer, SharePoint Online, Intune, and Dynamics CRM.

“Microsoft’s enterprise offerings have grown increasingly cloud-based as companies move to adopt this innovative, efficient and secure technology,” said Mike Gersten, vice president of Perficient’s Microsoft national business group. “Cloud computing lowers operating costs and provides agility and scalability options unavailable on limited legacy infrastructure. We are honored to receive these three Partner awards, which reflect the strength of Perficient’s Microsoft cloud consultation and delivery expertise at work across the country.”

Perficient has helped clients across multiple industries implement cloud solutions like Office 365 and Microsoft Azure. The company has activated more than one million Office 365 seats, which is more than any other National Systems Integrator.

Highlights of recent Microsoft implementations include:

  • Working with a multinational firm to create a custom MVC application utilizing many Azure components, including SQL Azure, Web Roles, Worker Roles, and BLOB Storage. The solution replaces previous spreadsheet-style reporting with dashboards and data visualization, and is used to identify potential hazards and recognize exemplary employees.
  • Partnering with a leading transportation operator to plan and develop a global cloud-based employee portal solution utilizing Office 365 and, specifically, SharePoint Online. With a responsive design and support of multiple devices, the portal offers users improved search capabilities and better ease of use.
  • Collaborating with a large health plan provider to supply an integrated digital experience solution leveraging Sitecore and the Microsoft Server Stack. Basing all of the client’s sites on the same core set of components and a single framework, Perficient delivered a common user experience, independent of device, to all.

Through its partnerships with leading technology innovators, Perficient provides clients in all industries business-driven technology solutions and support in a wide range of practice areas. Perficient’s Microsoft consultants specialize in several practice areas including unified communications, social collaboration, business intelligence and cloud computing to provide digital marketing, portals, mobile and customer relationship management solutions to many of the most complex organizations in the country.

Microsoft’s WPC14 continues through Thursday.

Kapow to Sitecore Migration: Part 1
Wed, 16 Jul 2014 13:04:17 +0000

In my many years of building Web Content Management sites, a number of clients have discussed migrating content from an old site into a new site via some kind of automatic migration, but they always ended up doing a manual migration. This past spring, we finally had a client who decided to use Kapow as the migration tool to move content from their current SharePoint site into their new Sitecore site.

In Part 1, I’ll give an overview of Kapow by asking and answering questions about its use. In Parts 2 and 3, we’ll dip into more technical topics.

What is Kapow?

Kapow is a migration/integration tool that can extract data from many different sources, transform that data, and move it to a new platform. In my case, I extracted data from a SharePoint site, adjusted link and image paths, and inserted the transformed data into our Sitecore system.

When should I consider using Kapow?

As a rule of thumb, you should consider using Kapow when you have more than 10,000 pages to migrate. However, this decision is ultimately up to the client. Costs of the software and the setup of the migration process have to be weighed against the time involved in a manual migration and the extended migration period and content freeze involved in a manual process. I should also note that Kapow isn’t necessarily just for one-time migrations. It can also be used on an ongoing basis whenever there are multiple disparate data sources. A good example of this is a monthly report with data that must be gathered from several different sources.

How is Kapow installed?

An MSI is downloaded from Kapow’s site and installed. Although Kapow comes with a development database (an Apache Derby-based database), we were using SQL Server, so that had to be configured. At this point, the Management Console service is started; this checks your license and allows access to Kapow’s suite of tools. Overall, a very easy install.

How do I use Kapow?

It depends on your needs. Kapow has an extensive suite of tools. My needs on this project were limited, so I used only the Design Studio tool, and occasionally the Management Console. Design Studio is used to develop, debug, and run robots, which extract and transform content. It has a powerful interface, a little reminiscent of Visual Studio.


Robots can also be uploaded to the Management Console and run from there. This provides scheduling and automation capabilities. Other tools in the suite give further automation and flexibility (RoboServer) and allow user input into processes (Kapplets).

What are the steps in a site-to-site migration?

Inventory. Determine which pages will be migrated. Classify them by type. One migration robot is written for each type of page. In our case, we loaded the URLs of each page type into different spreadsheets.

Extraction. Kapow works by crawling the HTML of each page. It reads the URL into a browser and provides extraction tools to parse through the data and save it to a database table. An example is helpful here. I have a plain FAQ page. I can use Kapow’s looping mechanism to iterate the questions and answers, saving each pair into a database table. Locating the questions and answers within the page can be done by finding some constant in the HTML, e.g. a CSS class associated with the question/answer pair.
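Kapow does this through its visual tools, but the underlying idea, walking the HTML and collecting elements by a CSS class, can be sketched with Python’s standard-library parser (the class names here are hypothetical):

```python
from html.parser import HTMLParser

class FaqExtractor(HTMLParser):
    """Collect (question, answer) pairs from elements tagged with
    hypothetical 'faq-question' / 'faq-answer' CSS classes."""

    def __init__(self):
        super().__init__()
        self.pairs = []
        self._current = None
        self._question = None

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("faq-question", "faq-answer"):
            self._current = cls

    def handle_data(self, data):
        if self._current == "faq-question":
            self._question = data.strip()
        elif self._current == "faq-answer" and self._question is not None:
            self.pairs.append((self._question, data.strip()))
            self._question = None
        self._current = None

parser = FaqExtractor()
parser.feed('<div class="faq-question">What is Kapow?</div>'
            '<div class="faq-answer">A migration/integration tool.</div>')
print(parser.pairs)
# [('What is Kapow?', 'A migration/integration tool.')]
```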

Transformation. Data from an old site cannot usually be imported into a new site without some cleanup. Link and image paths are obvious examples of content that will most likely need to be changed. Again, a simple example will help. For this site, videos needed to be changed from their old .f4v format to their new .mp4 format. So one step in my transformation process looked like this:


The first step is a loop that extracts URLs. I then tested each URL for the .f4v extension, transformed it to the .mp4 extension, and substituted the new URL back into the page text. The cleaned-up data is placed back into the database.
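A rough code equivalent of that step (the extension swap is from the example; the function name and regex are mine, not part of Kapow):

```python
import re

def convert_video_links(page_text):
    """Rewrite legacy .f4v video URLs to the new .mp4 format."""
    return re.sub(r"\.f4v\b", ".mp4", page_text)

print(convert_video_links('<a href="/newsite/Files/intro.f4v">Watch</a>'))
# <a href="/newsite/Files/intro.mp4">Watch</a>
```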

Upload. This step takes the data and loads it into the new site. My target site was Sitecore, so I was advised to use the Sitecore Item Web API to upload the data. If my target site had been extremely simple, this might have worked, but because it was a fairly complex site, this Web API didn’t come close to answering the need. See Part 3, coming shortly, for my solution.

Did you find any “gotchas” in using Kapow?

Overall, I found the tools I used to be more than enough for getting the job done. I don’t think I scratched the surface in what could be done. I did find a bit of occasional flakiness in the extraction process. Especially when extracting images and files or complex pages, I’d find that Kapow “missed” some extractions, even after I had run the extraction multiple times to take advantage of caching. I worked with Kapow Support on this and received good advice, but never achieved 100% correct extraction. That’s why I’m adding one more step to the previous question:

Verification. Check the target site to be sure extraction, transformation, and uploading were done correctly.

Transforming the Patient Experience with Epic, BI, and Press Ganey
Tue, 15 Jul 2014 20:40:58 +0000

During my project over the last six months, I have spent my time developing two BI solutions for ProHealth Care in Waukesha, WI. ProHealth Care is a healthcare organization using an Epic Cogito data warehouse on a Microsoft SQL Server 2012 database. Over the last year, we had an aggressive project schedule to bring this warehouse online and integrate reporting within a Microsoft SharePoint 2013 BI Center. This would be the second phase of this year-long project. On deck was the task of improving workflow through the Patient Experience program, an internal program common to any healthcare system that focuses on patient satisfaction and quality. Surveys are a key component of the data-gathering processes many healthcare institutions use to manage patient experience.

Press Ganey is a survey company that provides survey results and statistics for approximately 10,000 healthcare providers, according to its website. By providing surveys and data services to a large number of organizations, it is able to compile a national database of questions, answers, and patient-satisfaction statistics. This data is used to evaluate any participating organization’s current performance, with a percentile ranking against every other facility. The survey results are an industry benchmark against which many organizations measure their level of service.

Needless to say, this is critical information for any health care organization that wants to improve its overall quality and performance levels. Our goal was to automate, integrate, and distribute two data feeds from Press Ganey via the SharePoint BI Center, keeping a self-service model as a design goal, all within a six-month time period. No problem.

The problem: ProHealth Care’s situation could best be summed up as “data overload.” Many hours were being spent every month downloading reports from a website and then creating a multitude of Excel worksheets to do the number crunching needed to calculate high-level performance metrics.

Another problem was that the data was being stored in an unstructured format. There was no simple way to relate the top-level “scorecard” numbers back to the source data. Who was the doctor? Where and when did the patient visit? What was the diagnosis? These are all questions anyone in charge of making sure patients are happy with their visits would want answered.

Press Ganey offers its clients two different types of data feeds:

1. A standard XML export, provided free of charge. The XML export provides a wealth of information, including patient sentiment results and actual patient comments. It also includes the questions that were on each patient survey and, of course, the numeric (1-5) answers, per Press Ganey’s format.

2. A custom data feed service. Missing from the XML export are the Ranking, Sample Size, and Mean scores that Press Ganey provides in its online reports. For this data, you will need the custom data feed service. It is exactly what its name implies: a data export service where the client defines what data and which time periods they are interested in receiving.

We chose two different approaches to deal with the two different Press Ganey data feeds.

The XML export. For this solution we decided to import directly into the existing data warehouse. Cogito does provide a basic module for survey data, but it was necessary to extend and enhance it quite significantly to meet our needs. Several new dimensions were created, as well as one new fact table, but the basic Cogito model provided a starting point, handling the basics of the Survey and Survey Question entities.

We developed two cubes from the resulting data models. SSAS was chosen for this solution so that we could aggregate totals, compute averages and sums easily, and provide for future growth of the data. I believe the scalability of an OLAP cube outweighed that of a tabular model in this case, as the data set has the potential to eventually become quite large.

On to the second solution: the custom data feed. This is a paid service that Press Ganey provides to its customers. ProHealth Care is interested in knowing how it is performing on a national level and how it ranks versus every other hospital providing the same types of services. To see this, they receive Rank, Mean, and Sample Size scores from Press Ganey every month. This information was being provided via an HTML reporting application, but we needed an automated integration to eliminate the manual download and processing of this data every month.

This solution ended up being much easier to build in terms of the source data: it is a simple CSV file containing exactly what you need. Nothing too fancy in terms of transforming the data; basically, just scoop it up and bring it into a data model. This data model did not have any equivalent in the Cogito product, but we were able to leverage some standard dimensions such as DateDim, SurveyDim, and QuestionDim. The data part of this solution was nothing new to a decent ETL developer; the interesting part was the Power Pivot and Power View model rendered in Microsoft’s SharePoint BI Center.
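As a rough illustration of how little transformation the custom feed needs, here is a sketch that loads a feed-shaped CSV using nothing but the standard library. The column names (Period, QuestionID, Mean, Rank, SampleSize) are hypothetical; the real layout is defined when the feed is ordered:

```python
# Minimal sketch of ingesting a custom-data-feed CSV into typed rows
# ready for a data model. Column names are invented for illustration.
import csv, io

SAMPLE = """Period,QuestionID,Mean,Rank,SampleSize
2014-05,Q101,91.2,78,412
2014-05,Q102,88.4,65,407
2014-06,Q101,92.0,81,398
"""

def load_feed(text):
    rows = []
    for r in csv.DictReader(io.StringIO(text)):
        rows.append({"period": r["Period"], "question": r["QuestionID"],
                     "mean": float(r["Mean"]), "rank": int(r["Rank"]),
                     "n": int(r["SampleSize"])})
    return rows

rows = load_feed(SAMPLE)
# e.g. trend one question across periods for a scorecard
q101 = [r for r in rows if r["question"] == "Q101"]
```

In practice these rows land in fact/dimension tables keyed by the shared DateDim, SurveyDim, and QuestionDim dimensions, and the Power Pivot model is built on top of that.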

The Power Pivot/Power View combo proved very effective for this particular set of data. We created one Power Pivot model and several Power View reports of this data. No more digging through directories filled with Excel worksheet snapshots from all of the past months and years; a few filterable scorecards, trend lines, and bar charts replaced many of them. It is now quite easy to filter on any time period the end user wants and see the results.

Power View is a wonderful tool. It seems like every person you hand it to will create something different, even when using the same set of data. It really does allow you to use your imagination and be creative with an underlying data set, while remaining user friendly to someone who is not a programmer. Its learning curve is not steep at all for anyone familiar with basic pivot table and Excel charting functionality.

Both solutions ended up being distributed through the BI Center on SharePoint 2013. SharePoint provides a great portal and management interface for access to the SSAS cubes and the Power Pivot and Power View objects, and an easy way to share the information throughout the organization. Developing the BI Center was a side project that ran parallel to several lines of work in this phase of the project. ProHealth Care’s web team did a fantastic job of getting the BI Center up and running. Don’t underestimate the SharePoint developer involvement necessary to get all of the content secured, distributed, and presented in a usable manner.

One of the major difficulties we encountered in this project was simply agreeing on which data we would receive in the custom data feed. This wasn’t so much a technical problem as a matter of discussing, agreeing (or disagreeing), and, of course, budgeting for what we were going to buy.

The feed itself worked great, thanks to some very helpful and competent admins at ProHealth Care and Press Ganey (Hi!), but it would have been nice to use a web service, an Azure Marketplace feed, or something a little more advanced than an FTP inbox/outbox architecture. I will look into this for any future Press Ganey integrations.

The possibilities opened up by the XML data from Press Ganey are many, especially if you have a back-end data warehouse to tie this information to. The XML file is a bit difficult to deal with, as there is no XSD associated with it, but Press Ganey does provide an example import script that will get you most of the way there. Be ready for a few SQL XML shredders and some XQuery! It is not the easiest file to work with, but it’s what’s inside that counts. As a free export, it can add tremendous value to any healthcare organization that wants to drill down and dig into its Press Ganey data.
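To give a feel for the shredding involved, here is a hedged sketch in Python rather than T-SQL/XQuery, against an invented survey-XML shape; the real Press Ganey layout should be taken from their example import script.

```python
# Sketch: "shred" a nested survey XML export into flat answer rows.
# Element and attribute names here are hypothetical stand-ins.
import xml.etree.ElementTree as ET

SAMPLE = """<surveys>
  <survey id="S1" visitKey="VK100">
    <answer questionId="Q101" value="5"/>
    <answer questionId="Q102" value="4"/>
    <comment>Great care from the whole team.</comment>
  </survey>
</surveys>"""

def shred(xml_text):
    rows = []
    for s in ET.fromstring(xml_text).iter("survey"):
        for a in s.iter("answer"):
            rows.append({"survey": s.get("id"),
                         "visit_key": s.get("visitKey"),  # FK back to the warehouse
                         "question": a.get("questionId"),
                         "score": int(a.get("value"))})
    return rows

rows = shred(SAMPLE)
```

The `visit_key` column stands in for the kind of foreign key discussed below: once each answer row carries a key the warehouse understands, the survey data can join to encounters, providers, and diagnoses.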

One major potential “gotcha” with the XML file is that without a data warehouse to relate it to, you would have to do a lot of extra work building dimensions. In our case, the export from Press Ganey contained foreign keys to Cogito, which provided a very important link between the XML export and the warehouse and allowed us to relate the survey data to just about anything in it. Without this link, the integration would have been much more difficult.

In summary, I was very happy with the end result. Going forward, ProHealth Care has a powerful tool to help it manage and investigate its Patient Experience program. The end product, consisting of two SSAS cubes and a Power Pivot/Power View model, will provide a lot of information to end users via the Microsoft self-service BI model. Any organization using Press Ganey survey information would benefit from a project like this.


Surprise! Microsoft’s Future Is Dependent on Data
Fri, 11 Jul 2014 20:40:35 +0000

In his July 10 email to employees, Microsoft CEO Satya Nadella mentions the word “data” no fewer than 15 times. This simple fact highlights how dealing with data is a foundational part of Microsoft’s future strategy. When he describes a “mobile-first and cloud-first world,” Mr. Nadella is describing a world where data is ubiquitous in “the background of our lives.” He wants to position Microsoft at the twin apexes of both producing and consuming all that data.

The keystone to that strategy is Microsoft’s hyper-scale public cloud platform, Azure. Azure is positioned to serve as a cloud data storage hub, offering NoSQL-style BLOB storage as well as traditional relational storage with Azure SQL Database. The HDInsight service leverages Azure BLOB storage to offer a Big Data option in the form of a full-blown Hadoop installation in the cloud. And virtualized SQL Servers can be spun up for purposes including cloud-based BI and analytics.

Beyond the cloud, the newly re-branded Microsoft Analytics Platform System is a combination of a SQL Server PDW (Parallel Data Warehouse) appliance with a local installation of HDInsight. Microsoft’s breakthrough PolyBase technology allows integration between the two, letting SQL users query Big Data directly. And of course SQL Server 2014 joins the in-memory database market, while still providing traditional SQL Server value and power on-premises.

So, that sums up the Producing side. But what about Consuming?

Working from a position of some strength — and frankly also trying to ignore a traditional weakness — Microsoft has ordained that Excel is really the ultimate front-end for their BI platform. Power BI is the branding for this collection of services, and the so-called “Power Tools” themselves (Power Pivot, Power View, and Power Query) are the baseline components, available as plugins for desktop Excel and natively in Office 365 Excel.

Office 365 is truly the focus of most of the evolution of the BI delivery platform right now. In addition to the three basic Power Tools mentioned above, Office 365 also provides the geospatial analytics tool Power Map (currently also available in preview for desktop users). And the coup de grâce comes in the form of Power BI Sites, an app for SharePoint Online that brings collaboration, mobile, and natural-language query functionality to the table.

All of these options combine to form Microsoft’s platform for pervasive data. As this strategy matures, I think we can expect to see tools merge, and even go away to be replaced by others. But the fact remains that Microsoft is positioning its data platform to serve both cloud and on-premises, to be scalable, and to support Nadella’s stated goal to “reinvent productivity to empower every person and every organization on the planet to do more and achieve more.”

Microsoft, the productivity & platform company for a mobile world
Thu, 10 Jul 2014 20:08:20 +0000

This morning, Microsoft CEO Satya Nadella sent his employees an email, and a pretty important one at that (read it here). July marks the beginning of FY15 for Microsoft, and it’s a time to reflect on the previous year and plan for the future. For Nadella, this means determining where to focus as the company forges ahead in an industry deeply rooted in innovation.

Nadella is Microsoft’s third CEO, leading the company since February, and has been with Microsoft for over two decades. It’s safe to say he’s seen a lot change during that time, having held a variety of roles and positions, most recently as executive vice president of Microsoft’s Cloud and Enterprise group.

With his background running the division responsible for the technology powering Microsoft’s cloud-centric services, and the fact that Microsoft has led with the cloud for several years now, the choice to make Nadella chief executive fits right in with Microsoft’s transition. This cloud focus has been evident in recent partnerships with Oracle and Salesforce, both of which serve to grow Azure, Microsoft’s cloud hosting platform, by bringing popular applications to it.

In today’s email, Nadella wrote:

We live in a mobile-first and cloud-first world. Computing is ubiquitous and experiences span devices and exhibit ambient intelligence. Billions of sensors, screens and devices – in conference rooms, living rooms, cities, cars, phones, PCs – are forming a vast network and streams of data that simply disappear into the background of our lives. This computing power will digitize nearly everything around us and will derive insights from all of the data being generated by interactions among people and between people and machines.

He goes on to describe how the many devices, combined with cloud services, create a unique opportunity for Microsoft. And Microsoft’s passion? To enable people to thrive in our mobile-first, cloud-first world. And while only officially announced today, the company has been heading in a more multi-platform direction for some time now (even releasing Office for iPad a few months ago).

It seems Microsoft is saying what so many have been looking for – the device doesn’t matter.

So, if the device doesn’t matter, how can “devices and services” be Microsoft’s mantra? That’s changing too. From Microsoft’s inception, one constant has been how Microsoft products and services help us get things accomplished, all of us. Whether you are talking about a high school student, a grandmother, a small mom-and-pop store, a non-profit organization, or a large global company with complex business processes… Microsoft is there, behind the scenes, enabling us to be more productive. Per Nadella:

At our core, Microsoft is the productivity and platform company for the mobile-first and cloud-first world. We will reinvent productivity to empower every person and every organization on the planet to do more and achieve more.

When it comes to digital work and life experiences in a mobile-first, cloud-first world, Nadella brought up the notion of ambient intelligence. This includes Delve, the first experience powered by the Office Graph (announced just yesterday), and Cortana, along with the ability to ask questions and have them answered with Power Q&A, an aspect of Power BI, the self-service analytics offering available through Office 365. He went on to describe how apps will be reinvented in this new world, and the shift from Office as a product to Office as a cloud-based service.

Apps will be designed as dual use with the intelligence to partition data between work and life and with the respect for each person’s privacy choices. All of these apps will be explicitly engineered so anybody can find, try and then buy them in friction-free ways. They will be built for other ecosystems so as people move from device to device, so will their content and the richness of their services – it’s one way we keep people, not devices, at the center. This transformation is well underway as we moved Office from the desktop to a service with Office 365 and our solutions from individual productivity to group productivity tools – both to the delight of our customers. We’ll push forward and evolve the world-class productivity, collaboration and business process tools people know and love today, including Skype, OneDrive, OneNote, Outlook, Word, Excel, PowerPoint, Bing and Dynamics.

As a Microsoft partner, we are excited about the opportunity ahead of Microsoft, and we look forward to being part of it, helping our customers with their own transition to a mobile-first and cloud-first world.

Note: I highly recommend reading Satya Nadella’s email in full, as there’s a lot of great information, some of which I did not touch on.
