
Kapow to Sitecore Migration: Part 2

In my previous Kapow migration post, I gave an overview of the tool. In this post, I’ll give a short technical explanation of the migration process I developed. Keep in mind that my upload target was Sitecore, so some of my setup was Sitecore-specific.

First, you need an inventory of all the current pages in the site. For our site, these pages were grouped by Sitecore template, and the URL of each page was loaded into spreadsheets. So I had eight spreadsheets with names like “FAQ”, “Video”, and “LandingPage”, corresponding to similarly named Sitecore templates. My spreadsheets had the following layout:

Read the rest of this post »

Perficient wins Microsoft Partner Awards in all 3 US Regions!

The Perficient team is out in force in Washington, D.C. this week attending Microsoft’s Worldwide Partner Conference (WPC14). At the conference, the team was honored to receive Microsoft Partner of the Year awards in every Microsoft US region. This was a big honor, building on last year’s US Partner of the Year and US Healthcare Provider Partner of the Year awards. Thank you, Microsoft; we appreciate the partnership and value the recognition! From the news release:

Members of the Perficient team getting ready to accept Partner of the Year awards in every Microsoft US region

For the second year in a row, Perficient was named both the East Region NSI Partner of the Year and the Central Region Enterprise Office 365 Partner of the Year. Additionally, the company was named the West Region Compete Partner of the Year. These awards highlight Perficient’s capabilities in, and successful implementations of, Microsoft technology solutions, including cloud computing technologies like Office 365, Microsoft Azure, Lync Online, Yammer, SharePoint Online, Intune and Dynamics CRM.

“Microsoft’s enterprise offerings have grown increasingly cloud-based as companies move to adopt this innovative, efficient and secure technology,” said Mike Gersten, vice president of Perficient’s Microsoft national business group. “Cloud computing lowers operating costs and provides agility and scalability options unavailable on limited legacy infrastructure. We are honored to receive these three Partner awards, which reflect the strength of Perficient’s Microsoft cloud consultation and delivery expertise at work across the country.”

Perficient has helped clients across multiple industries implement cloud solutions like Office 365 and Microsoft Azure. The company has activated more than one million Office 365 seats, which is more than any other National Systems Integrator.

Highlights of recent Microsoft implementations include:

  • Working with a multinational firm to create a custom MVC application utilizing many Azure components, including SQL Azure, Web Roles, Worker Roles, and BLOB Storage. The solution replaces previous spreadsheet-style reporting with dashboards and data visualization, and is used to identify potential hazards and recognize exemplary employees.
  • Partnering with a leading transportation operator to plan and develop a global cloud-based employee portal solution utilizing Office 365 and, specifically, SharePoint Online. With a responsive design and support for multiple devices, the portal offers users improved search capabilities and better ease of use.
  • Collaborating with a large health plan provider to supply an integrated digital experience solution leveraging Sitecore and the Microsoft Server Stack. Basing all of the client’s sites on the same core set of components and a single framework, Perficient delivered a common user experience, independent of device, to all.

Through its partnerships with leading technology innovators, Perficient provides clients in all industries with business-driven technology solutions and support across a wide range of practice areas. Perficient’s Microsoft consultants specialize in practice areas including unified communications, social collaboration, business intelligence, and cloud computing, providing digital marketing, portal, mobile, and customer relationship management solutions to many of the most complex organizations in the country.

Microsoft’s WPC14 continues through Thursday.

Kapow to Sitecore Migration: Part 1

In my many years of building Web Content Management sites, a number of clients have discussed migrating content from an old site into a new one via some kind of automatic migration, but they always ended up doing a manual migration. This past spring, we finally had a client who decided to use Kapow as the migration tool to move content from their current SharePoint site into their new Sitecore site.

In Part 1, I’ll give an overview of Kapow by asking and answering questions about its use. In Parts 2 and 3, we’ll dip into more technical topics.

What is Kapow?

Kapow is a migration/integration tool that can extract data from many different sources, transform that data, and move it to a new platform. In my case, I extracted data from a SharePoint site, adjusted link and image paths, and inserted the transformed data into our Sitecore system.
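Kapow itself handles this transform step through its visual tooling, so none of the project’s actual logic appears here. Purely as an illustration of the kind of path adjustment involved, here is a hypothetical Python sketch; the path mappings and sample markup are invented for the example.

    # Hypothetical illustration of the transform step: rewrite old SharePoint
    # link and image paths so they resolve in the new Sitecore media tree.
    # The mappings below are invented; Kapow performs this step visually.
    import re

    PATH_MAP = {
        "/sites/oldsite/PublishingImages/": "/-/media/images/",
        "/sites/oldsite/Pages/": "/",
    }

    def rewrite_paths(html: str) -> str:
        """Swap old path prefixes for new ones, then drop .aspx extensions."""
        for old_prefix, new_prefix in PATH_MAP.items():
            html = html.replace(old_prefix, new_prefix)
        # SharePoint page URLs end in .aspx; Sitecore item URLs do not.
        return re.sub(r'(href="[^"]*?)\.aspx(")', r"\1\2", html)

    print(rewrite_paths('<a href="/sites/oldsite/Pages/faq.aspx">FAQ</a>'))
    # -> <a href="/faq">FAQ</a>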

Read the rest of this post »

Transforming the Patient Experience with Epic, BI, and Press Ganey

Over the last six months, I have been developing two BI solutions for ProHealth Care in Waukesha, WI. ProHealth Care is a health care organization using an Epic Cogito data warehouse on a Microsoft SQL Server 2012 database. Over the last year, we had an aggressive project schedule to bring this warehouse online and integrate reporting within a Microsoft SharePoint 2013 BI center. This would be the second phase of the year-long project. On deck was the task of improving workflow through the Patient Experience program, an internal program common to any healthcare system that focuses on patient satisfaction and quality. Surveys are a key component of the data-gathering processes many healthcare institutions use to manage patient experience.

Press Ganey is a survey company that provides survey results and statistics for approximately 10,000 healthcare providers, according to its website. By providing surveys and data services to a large number of organizations, it is able to compile a national database of questions, answers, and patient satisfaction statistics. This data is used to evaluate any participating organization’s current performance with a percentile ranking against every other facility. The results from the surveys are an industry benchmark against which many organizations measure their level of service.

Needless to say, this is critical information for any health care organization that wants to improve its overall quality and performance levels. Our goal was to automate, integrate, and distribute two data feeds from Press Ganey via the SharePoint BI Center, keeping a self-service model as a design goal, all within a six-month time period. No problem.

The problem: ProHealth Care’s situation could best be summed up as “data overload”. A lot of hours were being spent every month downloading reports from a website and then creating a multitude of Excel worksheets to crunch the numbers and calculate high-level performance metrics.

Another problem was that the data was being stored in an unstructured format. There was no simple way to relate the top-level “score card” numbers back to the source data. Who was the doctor? Where and when did the patient visit? What was the diagnosis? These are all questions anyone in charge of making sure patients are happy with their visits would want answered.
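To picture the structured alternative, a few lines of Python show how a top-level metric stays linked to its source rows once survey results are stored relationally. The column names here are invented for illustration and are not ProHealth Care’s actual schema.

    # Hypothetical illustration: with survey answers in a relational shape,
    # every scorecard number can be traced back to doctor, visit, and diagnosis.
    import pandas as pd

    surveys = pd.DataFrame({
        "doctor":             ["Smith", "Smith", "Jones", "Jones"],
        "visit_date":         ["2014-01-03", "2014-01-10", "2014-01-05", "2014-01-12"],
        "diagnosis":          ["flu", "sprain", "flu", "migraine"],
        "satisfaction_score": [92, 78, 88, 95],
    })

    # High-level metric: average satisfaction per doctor...
    print(surveys.groupby("doctor")["satisfaction_score"].mean())

    # ...and a drill-down to the visits behind any number on the scorecard.
    print(surveys[surveys["doctor"] == "Smith"])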

Read the rest of this post »


Surprise! Microsoft’s Future Is Dependent on Data

In his July 10th email to employees, Microsoft CEO Satya Nadella mentions the word “data” no fewer than 15 times. This simple fact serves to highlight how dealing with data is a foundational part of Microsoft’s future strategy. When he describes a “mobile-first and cloud-first world”, Mr. Nadella is describing a world where data is ubiquitous in “the background of our lives”. He wants to position Microsoft at the twin apexes of both producing and consuming all that data.

The keystone of that strategy is Microsoft’s hyper-scale public cloud platform, Azure. Azure is positioned to serve as a cloud data storage hub, offering NoSQL-style BLOB storage as well as traditional relational storage with Azure SQL Database. The HDInsight service leverages Azure BLOB storage to offer a Big Data option in the form of a full-blown Hadoop installation in the cloud. And virtualized SQL Servers can also be spun up for purposes including cloud-based BI and analytics.

Beyond even the cloud, the newly re-branded Microsoft Analytics Platform System is a combination of the SQL Server PDW (Parallel Data Warehouse) appliance and a local installation of HDInsight. Microsoft’s breakthrough PolyBase technology allows integration between the two, letting SQL users query Big Data directly. And of course, SQL Server 2014 joins the In-Memory database market while still providing traditional SQL Server value and power in the on-premises market.

So, that sums up the Producing side. But what about Consuming?

Working from a position of some strength — and frankly also trying to ignore a traditional weakness — Microsoft has ordained that Excel is really the ultimate front-end for their BI platform. Power BI is the branding for this collection of services, and the so-called “Power Tools” themselves (Power Pivot, Power View, and Power Query) are the baseline components, available as plugins for desktop Excel and natively in Office 365 Excel.

Office 365 is truly the focus of most of the evolution of the BI delivery platform right now. In addition to the three basic Power Tools mentioned above, Office 365 also provides the geospatial analytics tool Power Map (currently also available in Preview for desktop users). And the coup de grâce comes in the form of Power BI Sites, an app available for SharePoint Online that brings collaboration, mobile access, and natural language query functionality to the table.

All of these options combine to form Microsoft’s platform for pervasive data. As this strategy matures, I think we can expect to see tools merge, and even go away to be replaced by others. But the fact remains that Microsoft is positioning its data platform to serve both cloud and on-premises customers, to be scalable, and to support its stated goal to “reinvent productivity to empower every person and every organization on the planet to do more and achieve more.”

Microsoft, the productivity & platform company for a mobile world

This morning, Microsoft CEO Satya Nadella sent his employees an email, and a pretty important one at that (Read it here). July marks the beginning of FY15 for Microsoft, and it’s a time to reflect on the previous year and plan for the future. For Nadella, this means determining where the focus lies as the company forges ahead in an industry deeply rooted in innovation.

Nadella is the third Microsoft CEO, leading the company since February, and has been with the company for over two decades. It’s safe to say he’s seen a lot change at Microsoft during that time, and he has held a variety of roles and positions – most recently, executive vice president of Microsoft’s Cloud and Enterprise group.

With his background running the division responsible for the technology powering Microsoft’s cloud-centric services, and with Microsoft having led with the cloud for several years now, the choice of Nadella as chief executive fits right in with Microsoft’s transition. This cloud focus has been evident in recent partnerships with Oracle and Salesforce, both of which serve to grow Azure, Microsoft’s cloud hosting platform, by providing popular application choices to use with it.

In today’s email, Nadella wrote:

We live in a mobile-first and cloud-first world. Computing is ubiquitous and experiences span devices and exhibit ambient intelligence. Billions of sensors, screens and devices – in conference rooms, living rooms, cities, cars, phones, PCs – are forming a vast network and streams of data that simply disappear into the background of our lives. This computing power will digitize nearly everything around us and will derive insights from all of the data being generated by interactions among people and between people and machines.

He goes on to describe how the many devices, combined with cloud services, create a unique opportunity for Microsoft. And Microsoft’s passion? Well, it’s to allow people to thrive in our mobile-first, cloud-first world. And while officially announced today, the company has been heading in a more multi-platform direction for some time now (even releasing Office for iPad a few months ago).

It seems Microsoft is saying what so many have been looking for – the device doesn’t matter.

Read the rest of this post »

Released – Official name for Oslo and New Office 365 SMB Plans

Oslo Renamed to Delve

Announced earlier this year under the codename Oslo, the product now has an official name: Delve. The first experience powered by the intelligence fabric Microsoft calls the Office Graph, Delve will be available to Office 365 customers later this year.

New SMB Plans

On October 1, 2014, Microsoft will release three new Office 365 plans tailored to the needs of small and midsized businesses (SMBs) with 1 to approximately 250 employees. The new plans are:

  • Office 365 Business – The full Office applications – Outlook, Word, Excel, PowerPoint, OneNote and Publisher – with 1 TB of OneDrive for Business cloud storage to access, edit and share your documents across your Windows PC, Mac, iPad, Windows tablet and smartphone.
  • Office 365 Business Essentials – The core cloud services for running your business – business-class email and calendaring, Office Online, online meetings, IM, video conferencing, cloud storage and file sharing and much more.
  • Office 365 Business Premium – Get everything from both the Office 365 Business and Business Essentials plans.

This new lineup will replace the current SMB plans over time – Small Business, Small Business Premium and Midsize Business (more on that here).


Virtualizing SharePoint 2013 Workloads

Most new SharePoint 2013 implementations these days run on virtual machines, and the question of whether to virtualize SQL Server has long been put to rest. Indeed, with the new Windows Server 2012 R2 Hyper-V VM specs of up to 64 vCPUs, 1 TB of RAM, and 64 TB virtual disks, it is hard to make a case for physical hardware.

Both Microsoft (for Hyper-V) and VMware have published recommendations for working with virtualized SharePoint farms. The list of recommendations is long (and somewhat tedious), so this cheat sheet aims to summarize the most important ones and provide real-world advice for SharePoint and virtualization architects.

  • When virtualizing SharePoint 2013, Microsoft recommends a minimum of 4 and a maximum of 8 CPU cores per VM. Start low (4) and scale up as needed. With multiprocessor virtual machines, the physical host needs to ensure enough physical CPU cores are available before scheduling that VM’s threads for execution. Therefore, in theory, the higher the number of vCPUs, the longer the potential wait times for that VM. In every version since 4.0, VMware has made improvements to the CPU scheduling algorithm to reduce the wait time for multiprocessor VMs using relaxed co-scheduling. Still, it’s wise to consult the documentation for your particular version and check the specific limitations and recommendations.


  • Ensure true high availability by using affinity rules. Your SharePoint admin should tell you which VM hosts which role, and you will need to keep VMs with the same role on separate physical hosts. For example, all VMs that host the web role should not end up on the same physical host, so your typical mid-size two-tier farm should look something like this:

[Diagram: VMAffinity – same-role VMs kept on separate physical hosts in a two-tier farm]

  • When powering down the farm, start with the web layer and work your way down to the database layer. When powering up, go in the opposite direction.


  • Do not oversubscribe or thin-provision PROD machines; do oversubscribe and thin-provision DEV and TEST workloads.


  • NUMA (non-uniform memory access) partition boundaries: The high-level recommendation from both Microsoft and VMware is not to cross NUMA boundaries. Different chip manufacturers have different definitions of NUMA, but the majority opinion seems to be that a NUMA node equals a physical CPU socket, not a CPU core. For example, for a physical host with 8 quad-core CPUs and 256 GB of RAM, a NUMA partition is 32 GB. Ensure that individual SharePoint VMs will fit into a single partition, i.e., will not be assigned more than 32 GB of RAM each (see the sizing sketch after this list).


  • Do not use dynamic memory: Certain SharePoint components, like search and distributed cache, use memory-cached objects extensively and are unable to dynamically resize their caches when the available memory changes. Therefore, dynamic memory mechanisms like minimum/maximum RAM, shares, and the ballooning driver will not work well with SharePoint 2013. Again, your SharePoint admin should provide a detailed design and advise which VM hosts which particular service.


  • Do not save VM state at shutdown or use snapshots in PROD: SharePoint is a transactional application, and saving VM state can lead to an inconsistent topology after the VM comes back up or is reverted to a previous snapshot.


  • Disable time synchronization between the host and the VM: Same reasoning as the previous point. All transaction events are time-stamped, and latency during time synchronization can cause an inconsistent topology. SharePoint VMs will use the domain synchronization mechanism to keep local clocks in sync.


  • Do not configure “always start machine automatically”: There may be cases where a SharePoint VM is shut down for a reason, and starting it automatically after a physical host reboot can cause problems.


  • TCP Chimney offload: Please refer to this VMware post for reasons why this setting may need to be disabled. This setting is not unique to SharePoint; unless it is standard practice for all web VMs or is part of the image, it should not be configured.


  • For disaster recovery, virtualization has been a godsend for quite some time. Using VM replication to a secondary site is by far the simplest SharePoint DR scenario to configure and maintain.


  • Other settings that are not SharePoint-specific: things like storage host multi-pathing, storage partition alignment, physical NIC teaming, and configuring shared storage for vMotion hold true for all VMware implementations.
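To make the NUMA arithmetic above concrete, here is a small Python helper. It is only a sketch, built on the simplifying assumption from the NUMA bullet that one NUMA node equals one physical CPU socket.

    # Sketch: check that a VM's RAM assignment fits inside one NUMA node,
    # assuming (as the NUMA bullet does) one NUMA node per physical socket.

    def numa_node_ram_gb(total_ram_gb: float, cpu_sockets: int) -> float:
        """RAM available per NUMA node on the physical host."""
        return total_ram_gb / cpu_sockets

    def fits_in_numa_node(vm_ram_gb: float, total_ram_gb: float, cpu_sockets: int) -> bool:
        """True if the VM's RAM stays within a single NUMA partition."""
        return vm_ram_gb <= numa_node_ram_gb(total_ram_gb, cpu_sockets)

    # The example from the bullet: 8 quad-core sockets and 256 GB of RAM.
    print(numa_node_ram_gb(256, 8))       # 32.0 GB per NUMA node
    print(fits_in_numa_node(32, 256, 8))  # True  -> fits
    print(fits_in_numa_node(48, 256, 8))  # False -> crosses a NUMA boundary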


Columnstore Indexes: When Should You Use Them?

When I speak to clients about In-Memory features in SQL Server, I find that Columnstore indexes just haven’t gained much traction as a marquee feature. The functionality itself is quite useful in the BI/DW realm as far as potentially boosting query performance by 1.5 to 10 times. But I think it gets overlooked because the use-case just isn’t very obviously derived from the typical description of it. The explanations I have read/heard/seen of Columnstore and how it works get tedious very quickly.

So I don’t want to cover the details of how Columnstore works in this post. I just want to clarify when it might be useful. Then, if it sounds like it fits your situation, you can dive into some links and have all the tedium you want.

So here are, to me, the most pertinent Columnstore facts to be aware of:

  • It stores data in a columnar data format, heavily compressed, and in-memory — so it’s FAST.
  • It is very focused on large result sets that need to be aggregated or grouped. If you are doing full table scans in your queries, you might be interested.
  • It pairs naturally with partitioning. If you have a large Fact table that is a candidate for partitioning, this again is potentially right up your alley.
  • Columnstore is not ideal for frequently updated tables. You will end up having to drop and re-create the index before/after data update operations, so a rapid incremental refresh environment is not an ideal fit (a minimal sketch of this pattern follows the list). UPDATE: I am reminded by a very helpful colleague that SQL Server 2014 removes this limitation and allows table updates/deletes/etc. (Thanks Andrew!)
  • Because it is an In-Memory feature, your capabilities and performance are dependent upon hardware and SQL Server memory configuration.
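Since the drop-and-re-create pattern in the fourth bullet is the part that trips people up, here is a minimal sketch of that load cycle. It assumes SQL Server 2012 behavior and uses Python with pyodbc; the connection string, table, and index names are invented for the example.

    # Minimal sketch of the SQL Server 2012-era columnstore load cycle:
    # drop the index, load the fact table, re-create the index.
    # Server, table, and index names below are invented.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydw;Trusted_Connection=yes"
    )
    cur = conn.cursor()

    # 1. Drop the columnstore index so the table becomes writable again.
    cur.execute("DROP INDEX csi_FactSales ON dbo.FactSales")

    # 2. Run the incremental load while the table is writable.
    cur.execute("""
        INSERT INTO dbo.FactSales (DateKey, ProductKey, SalesAmount)
        SELECT DateKey, ProductKey, SalesAmount FROM staging.FactSales
    """)

    # 3. Re-create the nonclustered columnstore index over the query columns.
    cur.execute("""
        CREATE NONCLUSTERED COLUMNSTORE INDEX csi_FactSales
        ON dbo.FactSales (DateKey, ProductKey, SalesAmount)
    """)
    conn.commit()
    conn.close()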

If you have large fact tables and query performance issues, and if SSAS is either not an option or itself has performance issues, columnstore is an option to investigate. Columnstore indexes have been shown to be faster than an Analysis Services cube in some instances! From my perspective, a couple of use case scenarios immediately come to mind:

  • Creation of specific fact structures for highly responsive reports/dashboards — especially in situations where Analysis Services is not an option, or is also not performing adequately
  • Improving cube processing performance (although the drop/rebuild time for the columnstore index will then likely take place during ETL — so net performance gain would have to be tested)

For further info, this article seems to be the granddaddy of all columnstore articles. It contains a massive and detailed FAQ, and includes the formula for determining memory capacity. More focused instructions and examples of creating and using a columnstore can be found here, on Microsoft’s TechNet site. Cheers!

How to Speed up a Slow People Picker in SharePoint

Manjeet Singh, Lead Technical Consultant at Perficient, recently wrote a blog post about issues relating to a slow People Picker in SharePoint.

Have you experienced problems with People Picker taking too long to find a user? Almost a minute, or maybe more? Does your SharePoint environment function with multiple domains?
One of the reasons for the sluggish behavior of People Picker is the property called “SearchActiveDirectoryDomains”, which usually scopes the entire AD, with its sub-trees and trusted ADs, while searching for user accounts.

In his post, Manjeet details the step-by-step process to fix this issue. You can read the entire blog post here.