Webinar Recap: Migrating to SharePoint Online with @jasonbell

There’s a lot of interest around moving to the cloud, and specifically, SharePoint Online. Because of that, we’ve held several webinars over the summer focused on SharePoint Online and SharePoint in a hybrid environment (you can view all our past Microsoft webinars here, beginning with the most recent).

Despite that interest, migrations can be a bit of a headache (or in some cases, a debilitating migraine). But, if you do your research and plan properly, the process can be a fairly smooth one – possibly even your last, since once in the cloud, you shouldn’t need to do intensive upgrades or migrations in the future.

Last week, we held another session around SharePoint Online, this time focusing on Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud. My colleague Jason Bell, a senior solution architect within our Microsoft practice, kicked off the webinar with the top reasons to move to SharePoint Online. He then shared a migration methodology, which covers the migration assessment, migration development, and the actual migration plan.

Next, Jason talked about the different migration approaches – manual, scripted, or the use of a third party tool like AvePoint, Metalogix, or Sharegate. He wrapped up with a discussion around secure cloud computing, including information rights management and the use of Office Web Apps.

For the full webinar replay, including an informative Q&A portion where Jason answered a ton of attendee questions, click here. You can also catch up with Jason on Twitter @jasonbell.

 

How to develop and deploy for SharePoint Online (Office 365)

With the rapidly evolving migration to the cloud, SharePoint teams are faced with a new challenge: how do we develop and deploy for SharePoint Online?

If your feet have been firmly planted in on-premises development for SharePoint, it can be a little daunting to move your process to the cloud. Where and how should we conduct development? How can we manage releases through development, quality assurance, and production?

This article aims to help you get started and is based on hands-on experience working with SharePoint 2013 Online over the past 18 months.

Perficient Model

(Fig: Development and deployment model for SharePoint Online)

Develop for the Service

Above all other recommendations, it is highly advisable to build new features for the service using SharePoint Online itself. Whether you are writing CSOM code, customizing a master page, or building an app, you should do this against the service and not in a local (on-premises) development environment. SharePoint Online offers a very rich and extensible API, but it can be extremely sobering to realize that the feature you just spent the last few weeks building relies upon a capability that is not available Online. If you are developing features for both Online and on-premises, you can always bring things back on-premises later.

With an MSDN subscription, developers can provision their own Office 365 tenant and begin development within a few minutes. How many hours would it have taken for the developer to build their own VM for on-premises development? If the developer does not have an MSDN subscription, they could always use a trial tenant on a temporary basis or pay for a single-user tenant for indefinite use. When provisioning any new tenant for development, ensure that it is under the same license as QA and Production (e.g. E3).

Once a developer is ready to deploy and review new features, they can do this on a separate Development (Integration) tenant accessible to the team. This Development environment is typically used for demonstrations of new features, in SCRUM Review meetings for example.

Tenant Isolation

Consistent with any mature software development practice, it is important to ensure that Development, QA, and Production are properly isolated and that permissions are configured accordingly. Developers will most likely have full administrative access to Development but only read access (or less) to QA and Production. Keeping your developers out of Production is a key principle for stability and ensures that good, consistent deployment techniques are employed. It also maintains the healthy disagreement between developers and administrators that is as old as time and keeps the project fun!
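As one hedged illustration of that permission split, the SharePoint Online Management Shell can add a developer to a QA site’s Visitors group so they only get read access; the URLs, account, and group name below are placeholders, not a prescribed setup.

```powershell
# A minimal sketch, assuming the SharePoint Online Management Shell is installed.
# Grant a developer read-only access to a QA site collection via its Visitors group.
Connect-SPOService -Url "https://contosoQA-admin.sharepoint.com"   # hypothetical QA admin endpoint

Add-SPOUser -Site "https://contosoQA.sharepoint.com/sites/intranet" `
            -LoginName "dev1@contosoqa.onmicrosoft.com" `
            -Group "Intranet Visitors"
```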

It helps to name tenants consistently. We usually use the convention:

  • https://<production tenant name>.sharepoint.com
  • https://<production tenant name>DEV.sharepoint.com
  • https://<production tenant name>QA.sharepoint.com
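A hedged sketch of how that naming convention can drive tooling: a small script can map an environment name to the matching tenant admin endpoint. The tenant name contoso and the use of the SharePoint Online Management Shell are assumptions for illustration only.

```powershell
# deploy-target.ps1 - a minimal sketch; "contoso" and the site listing are illustrative.
param([ValidateSet("DEV", "QA", "PROD")][string]$Environment = "DEV")

$tenant = switch ($Environment) {
    "PROD" { "contoso" }
    "QA"   { "contosoQA" }
    "DEV"  { "contosoDEV" }
}

# Connect to the tenant admin site that matches the chosen environment
Connect-SPOService -Url "https://$tenant-admin.sharepoint.com"
Get-SPOSite | Select-Object Url, StorageUsageCurrent
```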

A key consideration with this isolation is how to maintain accounts across all three environments. Most likely, the Production environment will have federated identities synchronized to the cloud with ADFS and DirSync or FIM. This allows us to work with corporate credentials in Production. However, a single domain can only be synchronized to one Office 365 tenant, so what should be configured for Development and QA? It is of course possible to build new domains (on-premises) and mirror the synchronization used for Production. This is the purest way of ensuring Development and QA are true representations of Production; however, it may be overkill for your development and testing needs.

It can be advantageous to use cloud accounts (onmicrosoft.com) in Development and QA; they are extremely lightweight and easy to manage as your team grows. Cloud accounts are particularly useful when working with professional services organizations, as they avoid what might otherwise be a lengthy account setup process. However, if your solution relies heavily on synchronized identities, then it may be necessary to have Development and QA domains which mirror Production.

Another key driver for isolating tenants in this way is that it ensures no global configuration changes during development can impact the production system. Consider the configuration of:

  • Content Types
  • Term Store Configuration
  • Tenant-level Search Managed Properties
  • Tenant-level Search Result Sources

One could argue that developing in a single Site Collection isolates development appropriately. However, the misconfiguration of these items alone could easily break a production system and take some time to recover from; for example, Search may need to re-crawl, or the Content Type Hub will need to wait for a scheduled push.

Scripted Deployment

This article will not fully elaborate on scripted deployment to SharePoint Online; I will write another article on this topic shortly. However, it is an important principle of this model. Automating any repeated task can be a productivity benefit, provided the time invested in developing the automation is less than the time spent repeating the task itself. Automation also significantly reduces the chance of human error. It is less obvious how to automate deployments for SharePoint Online, but the benefits are clear and have paid huge dividends for our teams working with the service.

What is scripted deployment? For SharePoint Online, this means writing PowerShell with XML configuration and using the SharePoint Server 2013 Client Components SDK (currently v16). The PowerShell is run locally on the developer’s or administrator’s machine but connects to SharePoint Online using the Client Object Model. Through this script we can deploy most things required for SharePoint Online customization, such as:

  • Master Pages
  • Page Layouts
  • Content Types
  • Display Templates
  • Term Store Terms

It has taken some investment to develop these PowerShell modules, but they become highly reusable across projects.

As developers work with their own tenant, they develop the deployment scripts required for their feature. Those familiar with SCRUM will relate to ‘Done Criteria’: ours include development of a feature and its scripted deployment to the Development (Integration) tenant, where it can be reviewed. There are some exceptions which cannot be achieved by this technique, but the Client Object Model does support a very wide range of common deployment and configuration needs. Where exceptions exist, they are documented in a deployment document for manual execution by an administrator.

Replication of Production Data

It is desirable to have recent data available in QA to ensure good and valid testing. For this replication, it is advisable to use a third-party migration tool like Metalogix Content Matrix. When selecting a tool for this purpose, ensure that it can migrate the data faithfully, but also that it can transform data as required. For example, if Production data uses synchronized identities but QA uses cloud accounts, it will be necessary to perform some transformation. For example:

chris.hines@contoso.com could be mapped to chris.hines@contosoqa.onmicrosoft.com
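If the mapping is purely mechanical, a small helper can generate the QA identities. This is a hedged sketch with a hypothetical function name and QA domain; it is not part of any migration tool.

```powershell
# Hypothetical helper that rewrites a production UPN to its QA cloud-account equivalent.
function Convert-ToQaUpn {
    param([Parameter(Mandatory)][string]$ProductionUpn)

    $alias = $ProductionUpn.Split('@')[0]
    return "$alias@contosoqa.onmicrosoft.com"   # placeholder QA tenant domain
}

Convert-ToQaUpn "chris.hines@contoso.com"   # -> chris.hines@contosoqa.onmicrosoft.com
```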

Happy development and deployment!

Advanced analytics in healthcare with Epic, SQL Server and Azure

Over the months we have released a lot of information on building analytic platforms in healthcare. Several members of my team have played key architectural roles in not only implementing the Cogito platform and performing readmission analysis with it, but also expanding the platform to include customer satisfaction data from Press Ganey.

These functions were deemed critical to the initial phases of these projects, but they are largely ‘back-end’ architectural projects. They do not address the ad-hoc analysis needs of the business, the delivery technologies available, much less the predictive capabilities that can be added to the platforms.

Fortunately there are a lot of new technologies in the Microsoft stack to address these needs.

As part of our advisory services to help our clients understand what new capabilities they have with their new platforms, we regularly build concept visualizations. The following videos are examples of out-of-the-box capabilities we built for one of our clients utilizing:

  • Self-service analytics with Power Pivot and Power View
  • 3D visualizations with Power Map
  • Natural language query processing in the cloud with Q&A in Power BI

These technologies are well known and are being leveraged by several of our large clients, but a couple of recent announcements from Microsoft introduce even more exciting capabilities.

Power View now supports forecasting. This is a great new addition, currently available in the HTML5 version of Power View in Power BI. It gives the user the ability to quickly forecast a trend line, account for seasonality, and even adjust the confidence intervals of the calculation. Below is a screenshot of some readmission forecasting being performed on the dataset from the earlier videos.

(Fig: Readmission forecasting in Power View)

It is important to note that you not only see the forecasted line (the light blue line which runs through the gray box in the top chart), but the second chart also shows the hindcasting feature, which lets a user start a forecast in the past in order to see how accurate it would have been against real data (the light blue line to the left of the gray box in the second chart).

While valuable and easy to use, this technology doesn’t give us the ability to predict who is at risk of readmitting. For that, we need a more powerful tool.

Azure Machine Learning Services is a recently announced cloud service for the budding data scientist. Through a drag-and-drop interface, you can now build experiments with predictive models, train and score the models, and even evaluate the accuracy of different algorithms within your model.

The screenshot below shows an experiment that was built against the same readmission data used in the forecasting example (Epic Cogito dataset). The dataset was modified to flatten multiple patient admissions onto one record and included the following attributes as well as some others:

(Fig: Dataset attributes)
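As a hedged illustration of that flattening step, one-admission-per-row data can be rolled up to a single record per patient along these lines. The column names, file paths, and logic below are hypothetical and are not the actual Epic Cogito schema.

```powershell
# A hedged sketch with hypothetical columns (PatientId, AdmitDate, Readmitted):
# roll multiple admission rows up to one record per patient.
$admissions = Import-Csv "C:\Data\admissions.csv"

$flattened = $admissions | Group-Object PatientId | ForEach-Object {
    [pscustomobject]@{
        PatientId      = $_.Name
        AdmissionCount = $_.Count
        LastAdmitDate  = ($_.Group | Sort-Object { [datetime]$_.AdmitDate } | Select-Object -Last 1).AdmitDate
        EverReadmitted = [int]($_.Group.Readmitted -contains "1")
    }
}

$flattened | Export-Csv "C:\Data\admissions-flattened.csv" -NoTypeInformation
```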

The experiment was then created to compare two different classification algorithms: a boosted decision tree vs. logistic regression. *Note that this blog is not intended to debate the accuracy or appropriate use of these particular algorithms; these were just the two I used.

(Fig: Azure ML experiment comparing the two algorithms)

Once the experiment is complete and evaluated, a simple visual inspection shows the accuracy gains one algorithm has over the other.

(Fig: Algorithm evaluation results)

After some tweaking (and this model still needs it), there is a simple process to create a web service with an associated API key, which you can use to integrate the model into a readmission prediction application, one that accepts single-record or batch inputs.

(Fig: Published web service and API key)
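As a hedged sketch of what calling such a published scoring service looks like from PowerShell: the endpoint URL, API key, and feature columns below are placeholders and will differ for any real model.

```powershell
# A hedged sketch of calling a published Azure ML scoring web service.
$apiKey   = "<api key from the Azure ML dashboard>"
$endpoint = "https://ussouthcentral.services.azureml.net/workspaces/<workspace>/services/<service>/execute?api-version=2.0&details=true"

$body = @{
    Inputs = @{
        input1 = @{
            ColumnNames = @("Age", "LengthOfStay", "PriorAdmissions")
            Values      = @(, @("67", "5", "2"))   # one record; add more rows for a batch
        }
    }
    GlobalParameters = @{}
} | ConvertTo-Json -Depth 6

$result = Invoke-RestMethod -Uri $endpoint -Method Post -Body $body `
    -ContentType "application/json" `
    -Headers @{ Authorization = "Bearer $apiKey" }

# The scored label and probability come back as additional output columns
$result.Results.output1.value.ColumnNames
$result.Results.output1.value.Values
```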

As you can see, there are a number of options for introducing advanced analytics into your healthcare environment. Feel free to contact me with questions on how these tools can be put to work in your new healthcare analytics platform.

Webinar: Content Strategy and a Personalized Digital Experience

As a marketer, the focus on engagement and the shift to a more connected, digital experience is incredibly interesting to me. Not long ago, the online experience was fairly simple – you had a website, and you pointed your customers to that site. Your static content was adequate at the time.

Fast forward to 2014… what worked back when we partied like it’s 1999 (oh wait, it was) isn’t going to cut it today. According to Internet World Stats, as 2013 came to an end, there were over 2.8 billion people in the world online. In 2000, there were less than 400 million online. For the sake of comparison, the world population is around 7 billion today.

Needless to say, that’s a lot of eyes on your .com site. And on that site, we’ve got articles, blog posts, comments, infographics, audio, video, and images, to name a few. It’s easy to be hyper focused on content creation to the point that we lose sight of whether or not that content is even relevant to our target demographic.

Enter a content strategy. This will help you figure out which content type to use where, allowing you to both personalize and enrich your users’ digital experience. Join us on Wednesday, August 27, 2014 at 1 p.m. CT for a webinar, Using the Right Content Strategy to Create a Personalized Digital Experience, to learn about some key content strategies, best practices, and ways to create great content that keeps your users coming back. We’ll also discuss how tools like Sitecore can help drive the personalized digital experience.

During the webinar, you’ll hear from Jason Maloney, Director of Perficient XD, Michael Porter, Principal of Portal, Web Content and Social Solutions at Perficient, and Mark Gehman, Perficient’s Sitecore Practice Director. Together, they’ll share a lot of actionable tips to get you started on creating or improving your content strategy.

To register for the webinar, click here.
Using the Right Content Strategy to Create a Personalized Digital Experience
Wednesday, August 27, 2014
1:00 p.m. CDT

 

Making sense of the recent Internet Explorer announcement

Last week, Perficient’s Zach Handing wrote a post over on our Spark blog explaining what to make of the recent Internet Explorer announcement published on Microsoft’s Internet Explorer blog. In the article, Microsoft discussed their plans for supporting older versions of IE. There was quite a bit of racket across the web as people interpreted the information in different ways, and facts quickly turned into exaggerations or straight fiction. As Zach wrote:

I have seen many eager Interneters making loud claims to the tune of, “IE8 is dead!  We no longer have to support older versions of IE!”  However, it’s very easy to get caught up in the pandemonium or start bandwagon-ing and miss the actual facts of what is and will be happening according to Microsoft.  I want to clarify some things and set the record straight before we all hang up our Windows XP virtual machines.

What did Microsoft write to cause this, you ask? From the article:

After January 12, 2016, only the most recent version of Internet Explorer available for a supported operating system will receive technical support and security updates.

Zach goes on to explain that there are two important things we can learn from this quote, one of which is the following:

The first is that Microsoft is only stating that they plan to stop providing technical support and security updates for all versions of IE except the most current available for each of their operating systems.  The table below shows exactly which versions they mean.

Windows Platform             Internet Explorer Version
Windows Vista SP2            Internet Explorer 9
Windows Server 2008 SP2      Internet Explorer 9
Windows 7 SP1                Internet Explorer 11
Windows Server 2008 R2 SP1   Internet Explorer 11
Windows 8.1                  Internet Explorer 11
Windows Server 2012          Internet Explorer 10
Windows Server 2012 R2       Internet Explorer 11

 

So where is Internet Explorer 8 in that table? What does the fact that it is missing mean?

…that doesn’t mean IE8 is going away.  All this means is that Microsoft is not going to provide updates or support for IE8 anymore; it does not mean that people are going to magically stop using it.  The article also mentions that “Microsoft recommends enabling automatic updates to ensure an up-to-date computing experience”, but recommending that it happens does not mean that everyone will do it.  Yes, this is a big leap towards a day when developers do not need to worry about IE8 specific styles, but that day is not here yet.

So what’s the second big part? Zach tells us to take a look at that date… January 12, 2016. That’s pretty far in the future, approximately a year and a half. So for the next eighteen months, Internet Explorer 8 will still be alive and kicking, as Microsoft will still be supporting and providing updates for it. And even after that, Internet Explorer 8 will still be around.

You can read Zach’s full post here on our Spark blog. The Spark blog is Perficient’s perspective on all things innovative, and the crew that blogs over there has been posting some really interesting stuff around UX, UI and design. Check them out!

Enterprise Social and its Three Most Dangerous Myths

Enterprise social. It’s not the greatest thing since sliced bread (The Beatles, frozen custard, and computer-generated animation all make stronger cases for that title), but lately it seems like it’s close. That said, for all of its growing popularity, our experience tells us that a good deal of what people believe about social networks, and how to get users engaged on them, is flat-out wrong.

After a summer so busy that I missed my July deadline, this morning I published a new article on CMSWire addressing three of enterprise social’s most dangerous myths head-on. Check it out and learn why…

  • A single network might not be your best bet for adoption…
  • #ESN rollouts aren’t like any other new application, and…
  • Some users will simply never adopt them, no matter what you do.

Interested in continuing the conversation?  I’ll be unofficially representing Perficient (which cannot be held responsible for any comparisons I make between Yammer and Game of Thrones) in CMSWire’s upcoming TweetJam (yes, that’s a thing) on this month’s editorial focus “What does working like a network look like in practice?” The Tweet Jam will take place on Wednesday, August 20 at 1pm ET/ 10am PT.  You can find me on Twitter at @richOthewood; follow @CMSWire for TweetJam details and the #official #hashtag to follow.

Insights on SQL Server 2014 Data Warehousing Edition

For anyone that is thinking about selecting the Data Warehouse edition of SQL Server 2014, I just want to highlight a few things required to install this product and get it up and running.

First off though, what is SQL Server 2014 Data Warehousing Edition? In short, it is a version of SQL Server that is available as an image on an Azure VM, and the product seems to be flying a little bit under the radar. In terms of licensing and features, it is closest to Enterprise Edition and is similar to BI Edition. It houses the full stack of BI products, and it also allows for database snapshots like Enterprise Edition. The biggest single difference I can find is that it is optimized to use Azure Storage in the cloud. Interesting, no? I see its primary purpose as replacing an existing on-premises data warehouse, or functioning as the starting point for a new data warehouse that will be fairly large.

I won’t go into provisioning a cloud VM in this blog, but if you want more info, here’s a link:

http://msdn.microsoft.com/library/dn387396.aspx

Ok, on to some tech points and what to expect:

First and foremost, this edition’s minimum recommended VM size is an A7. Whoa!

Pretty steep for a minimum spec: an A7 has 8 cores and 56 GB of RAM. We all know minimum specs are just that, the bare minimum, and usually we end up going larger.

If you are unfamiliar with Azure VM sizing take a look here:

http://msdn.microsoft.com/en-us/library/azure/dn197896.aspx

Second, even to do a basic install, it is going to require that you have several 1-terabyte storage locations available for it to harness in Azure Storage. Double whoa!

When you first log in to this VM, you will not be able to connect SSMS to the SQL instance. Instead, you are prompted to configure storage containers for SQL 2014 DW Edition. This can be done in the Portal, or it can be done via Azure PowerShell, and it is documented quite well here:

http://msdn.microsoft.com/library/dn387397.aspx

In a nutshell, it is quite easy to attach the disks through the Portal application on Azure: you just browse to your VM and click “Attach” at the bottom of the screen. The VM will reboot, and you can then confirm the process in the logs listed in the link above. But as I mentioned earlier, you will know whether it is up and running, because you will get a login error from SSMS if it is not properly set up. One thing to keep in mind is that LUNs are numbered 0-X, not 1-X; I made this mistake when I first read the log, thinking it was complete when I still needed to attach one more disk.
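For the scripted route, a hedged sketch with the classic Azure (Service Management) PowerShell cmdlets might look like the following. The service name, VM name, and disk labels are placeholders, and note that LUN numbering starts at 0.

```powershell
# A hedged sketch: attach two new ~1 TB data disks to the VM at LUN 0 and LUN 1.
Add-AzureAccount   # sign in to the subscription

Get-AzureVM -ServiceName "contoso-dw-svc" -Name "contoso-dw-vm" |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 1023 -DiskLabel "SQLData00" -LUN 0 |
    Add-AzureDataDisk -CreateNew -DiskSizeInGB 1023 -DiskLabel "SQLData01" -LUN 1 |
    Update-AzureVM
```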

Once you have configured the appropriate number of storage LUNs, you must then use Disk Manager in Windows to format and label them – E:, F:, G:, etc.
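If you prefer to script that step as well, here is a hedged sketch using the Windows Server 2012 R2 storage cmdlets inside the VM. It assumes the newly attached LUNs are the only RAW (uninitialized) disks present, and the volume labels are illustrative.

```powershell
# Initialize, partition, and format every RAW disk, assigning the next free drive letters.
Get-Disk | Where-Object PartitionStyle -eq 'RAW' | ForEach-Object {
    $_ | Initialize-Disk -PartitionStyle GPT -PassThru |
         New-Partition -AssignDriveLetter -UseMaximumSize |
         Format-Volume -FileSystem NTFS -NewFileSystemLabel "SQLData$($_.Number)" -Confirm:$false
}
```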

Once the SQL instance finds its required number of storage containers, it will then, and only then, allow you to log in via SSMS.

So what is going on here? Well, some good stuff in my opinion.

  1. It forces the end user to appropriate several disk locations instead of just using the default C:\ drive to store everything. This is a great optimization because it spreads the disk activity out over multiple LUNs. It also enforces separating the data files from the operating system disk and the page files. Think about how many database systems you have worked on that have this design flaw: a lot of them.
  2. It assumes you mean business, and it requires a massive amount of storage up front to even install it. Either you need this edition of SQL Server or you don’t. This is not SQL Express or a departmental application server; this is a full-size enterprise application that is capable of migrating an on-premises DW to Azure.

Even though one might be put off a bit that it requires 4+ terabytes of storage to install, I actually like the fact that it enforces good design and automatically gives some overhead for growth.

No hardware budget excuses this time. A very important point is that even though it requires you to appropriate 4+ TB of storage, YOU ARE NOT BILLED FOR THE STORAGE YOU APPROPRIATE; you are only billed for the storage that you actually fill with data.

Once you understand that, this product starts making more sense. You can design a large storage location, with plenty of room to grow, without having to buy a large storage location. In a traditional on-premises environment, this could mean forking over some major cash. If you have never noticed, SANs are not inexpensive, and they take a long time to arrive onsite!

In summary, I am glad that this product is designed the way it is. It enforces good design from the beginning. It is not the correct product for a lot of different applications due to its scale, but for the person or place that wants to migrate or build a true enterprise-size data warehouse in Azure, SQL 2014 DW Edition is perfect.

 

 

Multi-Forest Identity Solution – Azure AD Sync

 


Last week I wrote about strategic benefits with Microsoft Azure and included some market research on other big cloud competitors. Continuing on that, in part 2 of this series I will talk about one of the most anticipated multi-forest identity solutions: the Azure Active Directory Sync tool.

In April this year, Microsoft announced a set of great new identity synchronization features available in preview, including password write-back, Azure AD Sync (AAD Sync), and multi-forest support. For customers with multiple on-premises Active Directory forests and multiple on-premises Exchange organizations who want to migrate to Exchange Online using a hybrid deployment, implementing Forefront Identity Manager (FIM) has not been a trivial approach. FIM provides self-service identity management for users and a framework to enforce security policies, but a FIM implementation isn’t trivial or cost-effective for many Office 365 scenarios, and as a result I have seen customers with complex multi-forest environments turn their backs on Microsoft and go after other vendors.

Customers with a single forest typically relied on DirSync, which is really a downsized version of FIM. Although it is a clean and easy setup, DirSync suffers from a number of limitations, the most painful for large companies being that it only synchronizes identity data from one forest to Azure AD. Other drawbacks include the creation of an Office 365 account for all Active Directory users of a particular OU and minimal control over the user object.
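As a small hedged aside, you can check whether directory synchronization is enabled for a tenant, and when it last ran, with the Azure AD (MSOnline) PowerShell module; this assumes the module is installed and you have tenant administrator credentials.

```powershell
# A minimal sketch: report the tenant's directory synchronization status.
Import-Module MSOnline
Connect-MsolService   # prompts for tenant administrator credentials

Get-MsolCompanyInformation |
    Select-Object DirectorySynchronizationEnabled, LastDirSyncTime
```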

Hence, on the path to bridging these gaps, and prompted by the need for in-the-cloud password changes to replicate back to the on-premises AD (which their users log on to every day), Microsoft released “DirSync with password reset write-back”. It’s part of the Azure AD Premium offering, which allows users to reset their Azure AD user account password via the “MyApps” web portal. Next came the need to address multi-forest synchronization and greater control over configuration. This led to the next big announcement from Microsoft: Azure Active Directory Sync (AAD Sync).

AAD Sync has its underpinnings in components of Microsoft’s Forefront Identity Manager (FIM) metadirectory service, so its architecture is similar to both DirSync and FIM. You connect your Active Directory forests to AAD Sync via connectors. Like FIM and other metadirectory services, these connectors feed into an aggregated store that contains a consolidated view of all the inbound identities. It is this view that AAD Sync replicates to Azure AD. With Microsoft making progress on the AAD Sync preview versions, partners and customers are now anxiously waiting for a public release to help them address their multi-forest identity needs.

 

(Fig: AAD Sync account resource forest scenario. Image source: Microsoft)

Just today, Microsoft announced another version, AAD Sync Beta 3, with investments in hybrid Exchange and multi-forest configuration, adding multi-forest password write-back capabilities. Check out the installation guide for more details: http://social.technet.microsoft.com/wiki/contents/articles/24057.aadsync-installation-guide.aspx. AAD Sync will allow customers to onboard their multi-forest Active Directory deployment to AAD, with:

  1. Advanced provisioning, mapping, and filtering rules for objects and attributes, including support for syncing a very minimal set of user attributes
  2. The ability to configure multiple on-premises Exchange organizations to map to a single AAD tenant
  3. Selective synchronization, which enables you to sync only the attributes required for the services you want to enable
  4. AD password reset with multiple forests
  5. Exchange hybrid deployments in multi-forest environments, which enable you to have mailboxes in Office 365 as well as in your on-premises Exchange

 

An integrated on-premises/cloud identity directory is a key piece of Microsoft’s Cloud OS vision, and this goes to show their commitment to a cloud-first, mobile-first strategy.

How To Use OneNote Like A Pro!

I’ve been using OneNote for years and it’s great! OneNote has three primary segments that I want to define first: Notebooks, Sections, and Pages. MSDN has a great article to get you started:

A OneNote notebook is just like a regular spiral notebook: It’s where you pile all your, um, notes. But unlike a regular paper notebook, you can add, move, and delete anything you want. It’s very forgiving (no ripped pages, no scratched-out phone numbers of old girlfriends), and you can organize and separate your notes by sections and pages.

For me, I have 10 different Notebooks. I have one that is local on my PC, and the rest are in the cloud, including multiple SharePoint 2013/SharePoint Online shared Notebooks. In my primary Notebook, I have 100+ Sections, each representing a different client I’ve worked with over my time at Perficient. My cloud Notebooks are mostly project- or group-specific notebooks serving a specific function. Inside each Section, I have between 2 and 50+ Pages. For me, each Page represents a particular meeting or a different topic.

This level of separation is exactly what I need. Between Notebooks, Sections, and Pages, I can organize my notes in any way that I like. This structure is completely expandable, and its only limit is the storage capacity on the device where your notebook resides.

To get started with OneNote, here’s a good reference of basic tasks – Create a new notebook, Type or write notes, Add links, Add files, Add pictures, Draw table, Add sections, Add pages.

Also, OneNote works on any device! If you have an iPhone, Android phone, or Windows Phone, there is a free OneNote app. OneNote also has a Windows Store app and an iPad app. If you’re using a desktop (Windows or Mac) and want to read or write notes without installing anything, you can: sign in at www.office.com to see a list of all your Office documents, including your OneNote notebooks. Click a notebook and it will open in the OneNote web app. There are web apps for Word, PowerPoint, and Excel too!

Things You Didn’t Know About OneNote

1. OneNote can read handwritten text!

Microsoft Server 2003 to 2012R2 – More than just end of Life

With end of life for Microsoft Server 2003 fast approaching on July 14, 2015, it will be hard for many organizations to make the move to a new server operating system, not unlike the pain many organizations felt with the move from Windows XP.

There are many business-related reasons that companies need to start their migration to Server 2012 R2 now. For example, when customers made the move from Windows XP, many found they should have planned further in advance, because migrations can take 8 months or longer depending on the size and complexity of the environment. Security alone should be a big enough business reason to move to a supported platform: in 2013 Microsoft released 37 critical updates for Windows Server 2003, and once end of life happens there will not be any more patches released. By not patching the server environment, you run the risk of malicious attacks, system bugs, and PCI compliance issues.

The good news is that while the move might be painful, in the long run it will be worth the trouble. Microsoft Server 2012 R2 offers so many enhancements and new features that once you have completed the migration and become familiar with it, you will probably wonder why you waited so long.

Microsoft Server 2012 R2 offers many enhancements, including:

  • PowerShell 4.0 – PowerShell 3.0 alone has 2,300 more cmdlets than PowerShell 2.0
  • Hyper-V 3.0 – Supports 64 virtual processors and 1 TB of memory per VM; also supports the VHDX format for large disk capacity and live migrations
  • SMB 3.02 – Server 2003 only supports SMB 1.0
  • Work Folders – Brings Dropbox-like functionality to your corporate servers
  • Desired State Configuration – Lets you maintain server configuration across the board with baselines (see the sketch after this list)
  • Storage Tiering – Dynamically moves chunks of stored data between slower and faster drives
  • Data Deduplication – Data compression; with Server 2012 R2 you can now run Data Deduplication on virtual machines, which is also great for VDI environments
  • Workplace Join – Allows users to register personal devices with Active Directory to gain certificate-based authentication and single sign-on to the domain
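As a hedged sketch of Desired State Configuration, referenced in the list above, here is a minimal configuration that ensures the Web-Server role and a baseline folder exist on a node; the configuration name, node list, and paths are illustrative, not a recommended baseline.

```powershell
# A minimal DSC sketch, assuming PowerShell 4.0 on Server 2012 R2.
Configuration BaselineWebServer {
    param([string[]]$NodeName = "localhost")

    Node $NodeName {
        WindowsFeature IIS {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        File DeployRoot {
            Ensure          = "Present"
            Type            = "Directory"
            DestinationPath = "C:\Deploy"
        }
    }
}

# Compile the configuration to a MOF, then push it to the node
BaselineWebServer -OutputPath "C:\DSC\BaselineWebServer"
Start-DscConfiguration -Path "C:\DSC\BaselineWebServer" -Wait -Verbose
```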

You can see from just these features how far the Microsoft server OS has come over the last 10 years. Scalability, speed, virtualization, mobile device management, and cloud computing have all been vastly improved or were simply not possible with Microsoft Server 2003.

With current trends moving towards organizations embracing a user-centric environment and moving to cloud computing, Server 2012 R2 is a stepping stone in the right direction.

So while the migration to Microsoft Server 2012 R2 may be painful, all will be forgotten once the organization and its server administrators can utilize the new features and notice the new ease of daily management activities.