
Archive for the ‘Cloud’ Category

Bing Your Way To Success – Tips Every Programmer Should Live By

When I was in school, I remember studying learning styles: "a series of theories suggesting systematic differences in individuals' natural or habitual pattern of acquiring and processing information in learning situations." I was always the Converger: very hands-on, figuring things out for myself, testing theories. For me, this started at an early age. I can remember being one of the first students in middle school to harness the power of the internet around 1992-1994. I remember discovering LexisNexis, AltaVista, and later Yahoo to read academic papers and abstracts. Writings, facts, and opinions that just weren't available in my school library were now available on the computer. I learned how to pull information to my fingertips by using search engines. Even in the early days, this was way more informative than an old encyclopedia and way more fun!

Fast forward to my college years, my search engine skills continued to progress. As I learned C++, VB Script, and Java, I relied heavily on the internet for the most up to date information on techniques, theory, and examples. Books simply couldn’t keep up with the power of the internet and its ever growing database of information. It was a great way for me to learn and get through college; and it continues to be a very sharp tool in my tool belt today.

In this post, I’m going to show you a few search engine tricks so you can Bing your way to success!

1. Use Quotes to Find Exact Results
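For example, searching for "Object reference not set to an instance of an object" (quotes included) tells Bing to return only pages containing that exact phrase. This is perfect for hunting down a specific error message, rather than wading through pages that merely mention a few of the words. (The error text here is just an illustration; any exact phrase works.)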

Read the rest of this post »

Webinar Recap: Migrating to SharePoint Online with @jasonbell

There's a lot of interest around moving to the cloud, and specifically, SharePoint Online. Because of that, we've had several webinars over the summer that focus on SharePoint Online and SharePoint in a hybrid environment (you can view all our past Microsoft webinars here, beginning with the most recent).

Despite that interest, migrations can be a bit of a headache (or in some cases, a debilitating migraine). But, if you do your research and plan properly, the process can be a fairly smooth one – possibly even your last, since once in the cloud, you shouldn’t need to do intensive upgrades or migrations in the future.

Last week, we held another session around SharePoint Online, this time focusing on Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud. My colleague Jason Bell, a senior solution architect within our Microsoft practice, kicked off the webinar with the top reasons to move to SharePoint Online. Following this, he shared a migration methodology, which includes your migration assessment, migration development, and the actual migration plan.

Next, Jason talked about the different migration approaches – manual, scripted, or the use of a third-party tool like AvePoint, Metalogix, or Sharegate. He wrapped up with a discussion around secure cloud computing, including information rights management and the use of Office Web Apps.

For the full webinar replay, including an informative Q&A portion where Jason answered a ton of attendee questions, click here. You can also catch up with Jason on Twitter @jasonbell.

 

How to develop and deploy for SharePoint Online (Office 365)

With the rapidly evolving migration to the cloud, SharePoint teams face a new challenge: How do we develop and deploy for SharePoint Online?

If your feet have been firmly planted in on-premises development for SharePoint, it can be a little daunting trying to move your process to the cloud. Where and how should we conduct development? How can we manage releases through development, quality assurance, and production?

This article aims to help you get started and is based on hands-on experience working with SharePoint 2013 Online over the past 18 months.

[Diagram: Perficient model for development and deployment for SharePoint Online]

Develop for the Service

Above all other recommendations, it is highly advisable to build new features for the service using SharePoint Online itself. Whether you are writing CSOM code, customizing a master page, or building an app, you should do this against the service and not in a local (on-premises) development environment. SharePoint Online offers a rich, extensible API, but it can be extremely sobering to realize the feature you just spent the last few weeks building relies on a capability that is not available Online. If you are developing features for both Online and on-premises, you can always bring things back on-premises later.

With an MSDN subscription, developers can provision their own Office 365 tenant and begin development within a few minutes. How many hours would it have taken the developer to build their own VM for on-premises development? If the developer does not have an MSDN subscription, they could always use a trial tenant on a temporary basis or pay for a single-user tenant for indefinite use. When provisioning any new tenant for development, ensure that it is under the same license as QA and production (e.g. E3).

Once a developer is ready to deploy and review new features, they can do this on a separate Development (Integration) tenant accessible to the team. This development environment is typically used for demonstrations of new features, in SCRUM review meetings for example.

Tenant Isolation

Consistent with any mature software development practice, it is important to ensure that Development, QA, and Production are properly isolated, with permissions configured accordingly. Developers will most likely have full administrative access to Development but only read access (or less) to QA and Production. Keeping your developers out of Production is a key principle for stability and ensures good, consistent deployment techniques are employed. It also maintains the healthy disagreement between developers and administrators which is as old as time and keeps the project fun!

It helps to name tenants consistently. We usually use the convention:

  • https://<production tenant name>.sharepoint.com
  • https://<production tenant name>DEV.sharepoint.com
  • https://<production tenant name>QA.sharepoint.com

A key consideration with this isolation is how to maintain accounts across all three environments. Most likely the Production environment will have federated identities synchronized to the cloud with ADFS and DirSync or FIM, which allows us to work with corporate credentials in Production. However, a single domain can only be synchronized to one Office 365 tenant, so what should be configured for Development and QA? It is of course possible to build new domains (on-premises) and mirror the synchronization used for Production. This is the purest form of ensuring Development and QA are true representations of Production; however, it may be overkill for your development and testing needs.

It can be advantageous to use cloud accounts (onmicrosoft.com) in Development and QA; they are extremely lightweight and easy to manage as your team grows. Cloud accounts are particularly useful when working with professional services organizations, as they avoid what might otherwise be a lengthy setup process. However, if your solution relies heavily on synchronized identities, then it may be necessary to have Development and QA domains which mirror Production.

Another key driver for isolating tenants in this way is that it ensures no global configuration change made during development can impact the production system. Consider the configuration of:

  • Content Types
  • Term Store Configuration
  • Tenant-level Search Managed Properties
  • Tenant-level Search Result Sources

One could argue that developing in a single site collection isolates development appropriately. However, the misconfiguration of these items alone could easily break a production system and take some time to recover from; for example, search may need to re-crawl, or the Content Type Hub will need to wait for a scheduled push.

Scripted Deployment

This article will not fully elaborate on Scripted Deployment to SharePoint Online; I will write another article on this topic shortly. However, it is an important principle of this model. Automating any repeated task is a productivity benefit, provided the time invested in developing the automation is less than the time spent repeating the task itself. Automation also significantly reduces the chance of human error. It is less obvious how to automate deployments for SharePoint Online, but the benefits are clear and have paid huge dividends for our teams working with the service.

What is Scripted Deployment? For SharePoint Online this means writing PowerShell with XML configuration and using the SharePoint Server 2013 Client Components SDK (currently v16). The PowerShell is run locally on the developer or administrator's machine but connects to SharePoint Online using the Client Object Model. Through this script we can deploy most things required for SharePoint Online customization, such as the following (a minimal connection and upload sketch follows the list):

  • Master Pages
  • Page Layouts
  • Content Types
  • Display Templates
  • Term Store Terms
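To make this concrete, here is a minimal sketch of connecting to SharePoint Online with the Client Object Model from PowerShell and uploading a master page; the tenant URL, account, file names, and SDK install path are all hypothetical, so adjust them for your environment:

    # Load the SharePoint Online Client Components SDK assemblies
    # (the v16 install path shown here may differ on your machine)
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
    Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

    # Hypothetical development tenant and deployment account
    $siteUrl  = "https://contosodev.sharepoint.com"
    $userName = "deploy@contosodev.onmicrosoft.com"
    $password = Read-Host -Prompt "Password" -AsSecureString

    # Connect to SharePoint Online via the Client Object Model
    $ctx = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
    $ctx.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($userName, $password)

    # Upload a master page to the Master Page Gallery
    $gallery = $ctx.Web.Lists.GetByTitle("Master Page Gallery")
    $file = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $file.Url = "contoso.master"
    $file.Content = [System.IO.File]::ReadAllBytes("C:\Deploy\contoso.master")
    $file.Overwrite = $true
    $uploaded = $gallery.RootFolder.Files.Add($file)
    $ctx.Load($uploaded)
    $ctx.ExecuteQuery()   # nothing is sent to the service until this call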

It has taken some investment in the development of PowerShell modules but these become highly reusable across projects.

As developers work within their own tenant, they develop the deployment scripts required for their feature. Those familiar with SCRUM will relate to 'Done Criteria': ours includes development of a feature and its scripted deployment to the Development (Integration) tenant, where it can be reviewed. There are some exceptions which cannot be achieved with this technique, but the Client Object Model supports a very wide range of common deployment and configuration needs. Where exceptions exist, they are documented in a deployment document for manual execution by an administrator.

Replication of Production Data

It is desirable to have recent data available in QA to ensure good, valid testing. For this replication it is advisable to use a third-party migration tool like Metalogix Content Matrix. When selecting a tool for this purpose, ensure that it can migrate the data faithfully, but also that it can transform data as required: if Production data uses synchronized identities but QA uses cloud accounts, it will be necessary to perform some transformation. For example:

chris.hines@contoso.com could be mapped to chris.hines@contosoqa.onmicrosoft.com
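As a rough illustration of the idea (this is not a feature of any particular migration tool), a short PowerShell pass over an exported user list could produce such a mapping; the file paths, the 'Upn' column, and the domain suffixes are all hypothetical:

    # Map production UPNs to QA cloud-account UPNs (hypothetical suffixes)
    $prodSuffix = "@contoso.com"
    $qaSuffix   = "@contosoqa.onmicrosoft.com"

    Import-Csv "C:\Deploy\prod-users.csv" | ForEach-Object {
        # Assumes the exported CSV has a 'Upn' column
        [PSCustomObject]@{
            ProductionUpn = $_.Upn
            QaUpn         = $_.Upn.Replace($prodSuffix, $qaSuffix)
        }
    } | Export-Csv "C:\Deploy\user-mapping.csv" -NoTypeInformation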

Happy development and deployment!

Advanced analytics in healthcare with Epic, SQL Server and Azure

Over the past months we have released a lot of information on building analytics platforms in healthcare. Several members of my team have played key architectural roles not only in implementing the Cogito platform and performing readmission analysis with it, but also in expanding the platform to include customer satisfaction data from Press Ganey.

These functions were deemed critical to the initial phases of these projects, but they are largely 'back-end' architectural projects. They do not address the ad-hoc analysis needs of the business, the delivery technologies available, much less the predictive capabilities that can be added to the platforms.

Fortunately there are a lot of new technologies in the Microsoft stack to address these needs.

As part of our advisory services to help our clients understand the new capabilities they have with their new platforms, we regularly build concept visualizations. The following videos are examples of out-of-the-box capabilities we built for one of our clients utilizing:

  • Self-service analytics with Power Pivot and Power View
  • 3D visualizations with Power Map
  • Natural language query processing in the cloud with Q&A in Power BI

These technologies are well known and are being leveraged within several of our large clients, but a couple of recent announcements from Microsoft introduce even more exciting capabilities.

Power View now supports forecasting. This is a great new addition, currently available in the HTML5 version of Power View in Power BI. It gives the user the ability to quickly forecast a trend line, account for seasonality, and even adjust the confidence intervals of the calculation. Below is a screenshot of some readmission forecasting performed on the dataset from the earlier videos.

[Screenshot: Forecasting]

Important to note is that you not only see the forecasted line (the light blue line which runs through the gray box in the top chart), but the second chart also shows the hindcasting feature, which lets a user start a forecast in the past in order to see how accurate it would have been against real data (the light blue line to the left of the gray box in the second chart).

While valuable and easy to use, this technology doesn’t give us the ability to predict who is at risk of readmitting. For that, we need a more powerful tool.

Azure Machine Learning is a recently announced cloud service for the budding data scientist. Through a drag-and-drop interface you can now build experiments of predictive models, train and score the models, and even evaluate the accuracy of different algorithms within your model.

The screenshot below shows an experiment built against the same readmission data used in the forecasting example (the Epic Cogito dataset). The dataset was modified to flatten multiple patient admissions onto one record and included the following attributes, among others:

[Screenshot: Attributes]

The experiment was then created to compare two different classification algorithms: a boosted decision tree vs. logistic regression. (Note that this blog is not intended to debate the accuracy or appropriate use of these particular algorithms; these were just the two I used.)

[Screenshot: Model]

Once the experiment is complete and evaluated, a simple visual inspection shows the accuracy gains one algorithm has over the other.

[Screenshot: Results]

After some tweaking (and this model still needs it), there is a simple process to create a web service with an associated API key, which you can use to integrate the model into a readmission prediction application, one that accepts single-record or batch inputs (a request sketch follows the screenshot below).

[Screenshot: API]
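Once published, the web service is called with a plain REST POST. Below is a minimal sketch of a hypothetical single-record request from PowerShell, using the classic Azure ML request-response format; the endpoint URL, API key, and column names are placeholders, not the actual readmission schema:

    # Hypothetical Azure ML request-response (RRS) call
    $apiKey   = "<your-api-key>"
    $endpoint = "https://ussouthcentral.services.azureml.net/workspaces/<ws-id>/services/<svc-id>/execute?api-version=2.0"

    # Illustrative column names, not the real Cogito-derived schema
    $body = @{
        Inputs = @{
            input1 = @{
                ColumnNames = @("Age", "LengthOfStay", "PriorAdmissions")
                Values      = @(, @("67", "5", "2"))
            }
        }
        GlobalParameters = @{}
    } | ConvertTo-Json -Depth 6

    $result = Invoke-RestMethod -Uri $endpoint -Method Post -Body $body `
        -ContentType "application/json" `
        -Headers @{ Authorization = "Bearer $apiKey" }
    $result.Results.output1   # the scored output, including the predicted class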

As you can see, there are a number of options for introducing advanced analytics into your healthcare environment. Feel free to contact me with questions on how these tools can be put to work in your new healthcare analytics platform.

Enterprise Social and its Three Most Dangerous Myths

Enterprise social. It's not the greatest thing since sliced bread (The Beatles, frozen custard, and computer-generated animation all make stronger cases for that title), but lately it seems like it's close. That said, for all of its growing popularity, our experience tells us that a good deal of what people believe about social networks, and how to get users engaged on them, is flat-out wrong.

After a summer so busy that I missed my July deadline, this morning I published a new article on CMSWire addressing three of enterprise social's most dangerous myths head-on. Check it out and learn why…

  • A single network might not be your best bet for adoption…
  • #ESN rollouts aren’t like any other new application, and…
  • Some users will simply never adopt them, no matter what you do.

Interested in continuing the conversation?  I’ll be unofficially representing Perficient (which cannot be held responsible for any comparisons I make between Yammer and Game of Thrones) in CMSWire’s upcoming TweetJam (yes, that’s a thing) on this month’s editorial focus “What does working like a network look like in practice?” The Tweet Jam will take place on Wednesday, August 20 at 1pm ET/ 10am PT.  You can find me on Twitter at @richOthewood; follow @CMSWire for TweetJam details and the #official #hashtag to follow.

Insights on SQL Server 2014 Data Warehousing Edition

For anyone thinking about selecting the Data Warehouse edition of SQL Server 2014, I just want to highlight a few things required to install this product and get it up and running.

First off though, what is SQL 2014 Data Warehousing Edition? In short, it is a version of SQL Server that is available as an image on an Azure VM, and the product seems to be flying a little bit under the radar. In terms of licensing and features, it is closest to Enterprise Edition and is similar to BI Edition. It houses the full stack of BI products, and it also allows for database snapshots like Enterprise Edition. The biggest single difference I can find is that it is optimized to use Azure Storage in the cloud. Interesting, no? I see its primary purpose as replacing an existing on-premises data warehouse, or functioning as a starting point for a new data warehouse that will be fairly large.

I won't go into provisioning a cloud VM in this blog, but if you want more info, here's a link:

http://msdn.microsoft.com/library/dn387396.aspx

Ok, on to some tech points and what to expect:

First and foremost, this edition's minimum recommended VM size is an A7. Whoa!

Pretty steep for a minimum spec: an A7 is 8 cores with 56 GB of RAM. We all know minimum specs are just that, the bare minimum, and usually we end up going larger.

If you are unfamiliar with Azure VM sizing take a look here:

http://msdn.microsoft.com/en-us/library/azure/dn197896.aspx

Second, even a basic install requires that you have several 1 TB storage locations available for it to harness in Azure Storage. Double whoa!

When you first log in to this VM, you will not be able to connect SSMS to the SQL instance. Instead, you are prompted to configure storage containers for SQL 2014 DW Edition. This can be done in the Portal, or via Azure PowerShell, and it is documented quite well here:

http://msdn.microsoft.com/library/dn387397.aspx

In a nutshell, it is quite easy to attach the disks through the Portal application on Azure: you just browse to your VM and click "Attach" at the bottom of the screen. The VM will reboot, and you can then confirm the process in the logs listed in the link above. But as I mentioned earlier, you will know whether it is up and running, because you will get a login error from SSMS if it is not properly set up. One thing to keep in mind is that LUNs are numbered 0-X, not 1-X; I made this mistake when I first read the log, thinking the process was complete when I still needed to attach one more disk.
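If you prefer scripting the attach step, here is a sketch using the classic (service management) Azure PowerShell module; the subscription, cloud service, and VM names are hypothetical:

    # Attach four empty 1023 GB data disks at LUNs 0-3
    Add-AzureAccount
    Select-AzureSubscription -SubscriptionName "MySubscription"

    $vm = Get-AzureVM -ServiceName "sqldw-svc" -Name "sqldw-vm"
    foreach ($lun in 0..3) {
        $vm = $vm | Add-AzureDataDisk -CreateNew -DiskSizeInGB 1023 `
            -DiskLabel "SQLDW-Data$lun" -LUN $lun
    }
    $vm | Update-AzureVM   # pushes the new disk configuration to the VM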

Once you have configured the appropriate number of storage LUNs, you must then use Disk Manager in Windows to format and label them: E:, F:, G:, etc.
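If you would rather script the formatting than click through Disk Manager, the Storage module in Server 2012R2 can do the same thing in a few lines (a sketch; the volume label is arbitrary):

    # Initialize, partition, and format every raw (newly attached) disk
    Get-Disk | Where-Object PartitionStyle -eq 'RAW' | ForEach-Object {
        Initialize-Disk -Number $_.Number -PartitionStyle GPT
        New-Partition -DiskNumber $_.Number -UseMaximumSize -AssignDriveLetter |
            Format-Volume -FileSystem NTFS -NewFileSystemLabel "SQLDW" -Confirm:$false
    }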

Once the SQL instance finds its required number of storage containers, it will, then and only then, allow you to log in via SSMS.

So what is going on here? Well, some good stuff in my opinion.

  1. It forces the end user to appropriate several disk locations instead of just using the default C:\ drive to store everything. This is a great optimization because it spreads the disk activity out over multiple LUNs. It also enforces separating the data files from the operating system disk and the page files. Think about how many database systems you have worked on that have this design flaw: a lot of them.
  2. It assumes you mean business, and it requires a massive amount of storage up front to even install it. Either you need this edition of SQL Server or you don't. This is not SQL Express or a departmental application server; this is a full-size enterprise application that is capable of migrating an on-premises DW to Azure.

Even though one might be put off a bit that it requires 4+ terabytes of storage to install, I actually like the fact that it enforces good design and automatically gives some overhead for growth.

No hardware budget excuses this time. A very important point is that even though it requires you to appropriate 4+ TB of storage, YOU ARE NOT BILLED FOR THE STORAGE YOU APPROPRIATE; you are only billed for the storage that you actually fill with data.

Once you understand that, this product starts making more sense. You can design a large storage location, with plenty of room to grow, without having to buy a large storage location. In a traditional on-premises environment, this could mean forking over some major cash. In case you have never noticed, SANs are not inexpensive, and they take a long time to arrive onsite!

In summary, I am glad that this product is designed the way it is. It enforces good design from the beginning. It is not the correct product for a lot of different applications due to its scale, but for the person or place that wants to migrate or build a true enterprise-size data warehouse in Azure, SQL 2014 DW Edition is perfect.

 

 

Multi-Forest Identity Solution – Azure AD Sync

 


Last week I wrote about strategic benefits of Microsoft Azure and included some market research on other big cloud competitors. Continuing on that, in part 2 of this series I will talk about one of the most awaited multi-forest identity solutions: the Azure Active Directory Sync tool.

In April this year, Microsoft announced a set of great new identity synchronization features available in preview, including password write-back, Azure AD Sync (AAD Sync), and multi-forest support. For customers with multiple on-premises Active Directory forests and multiple on-premises Exchange organizations who want to migrate to Exchange Online using a hybrid deployment, the traditional approach of implementing Forefront Identity Manager (FIM) has not been trivial. FIM provides self-service identity management for users and a framework to enforce security policies, but a FIM implementation isn't trivial or cost-effective for many Office 365 scenarios, and as a result I have seen customers with complex multi-forest environments turn their backs on Microsoft and go after other vendors.

Customers with a single forest typically relied on DirSync, which is really a downsized version of FIM. Although a clean and easy setup, DirSync suffers from a number of limitations, the most painful for large companies being the fact that it only synchronizes identity data from one forest to Azure AD. The other drawbacks include creating an Office 365 account for all Active Directory users of a particular OU and minimal control over the user object.

Hence, on the path to bridging gaps, and prompted by the need for in-the-cloud password replication back to the on-premises AD (which users log on to every day), Microsoft released "DirSync with password reset write-back". It's part of the Azure AD Premium offering, which allows users to reset their Azure AD user account password via the "MyApps" web portal. Next came the need to address multi-forest synchronization and greater control over configuration. This led to the next big announcement from Microsoft: Azure Active Directory Sync (AAD Sync).

AAD Sync has its underpinnings in components of Microsoft's Forefront Identity Manager (FIM) metadirectory service, so its architecture is similar to both DirSync and FIM. You connect your Active Directory forests to AAD Sync via connectors. Like FIM and other metadirectory services, these connectors feed into an aggregated store that contains a consolidated view of all the inbound identities. It's this view that AAD Sync replicates to Azure AD. With Microsoft making progress on AAD Sync preview versions, partners and customers are now anxiously waiting for a public release to help them address their multi-forest identity needs.

 

[Fig: AAD Sync account/resource forest scenario. Image source: Microsoft]

Just today Microsoft announced another version, AAD Sync Beta 3, with investments in hybrid Exchange and multi-forest configuration, adding multi-forest password write-back capabilities. Check out the installation guide for more details: http://social.technet.microsoft.com/wiki/contents/articles/24057.aadsync-installation-guide.aspx. AAD Sync will allow customers to:

  1. Onboard their multi-forest Active Directory deployment to AAD
  2. Use advanced provisioning, mapping, and filtering rules for objects and attributes, including support for syncing a very minimal set of user attributes
  3. Map multiple on-premises Exchange organizations to a single AAD tenant
  4. Use selective synchronization, which enables you to sync only the attributes required for the services you want to enable
  5. Use AD password reset with multiple forests
  6. Run an Exchange hybrid deployment in multi-forest environments, which enables you to have mailboxes in Office 365 as well as in your on-premises Exchange

 

An integrated on-premises/cloud identity directory is a key piece of Microsoft's Cloud OS vision, and this goes to show their commitment to a cloud-first, mobile-first strategy.

Microsoft Server 2003 to 2012R2 – More than just end of Life

With end of life for Microsoft Server 2003 fast approaching on July 14, 2015, it will be hard for many organizations to make the move to a new server operating system, not unlike the pain many organizations felt with the move from Microsoft Windows XP.

There are many business-related reasons for companies to start their migration to Server 2012R2 now. For example, when customers made the move from Windows XP, many found they should have planned further in advance, because migrations can take 8 months or longer depending on the size and complexity of the environment. Security alone should be a big enough business reason to move to a supported platform: in 2013 Microsoft released 37 critical updates for Windows Server 2003, and once end of life happens there will not be any more patches released. By not patching the server environment, you run the risk of malicious attacks, unfixed system bugs, and PCI compliance failures.

The good news is that while the move might be painful, in the long run it will be worth the trouble. Microsoft Server 2012R2 offers so many enhancements and new features that once you have completed the migration and become familiar with it, you will probably wonder why you waited so long.

Microsoft Server 2012R2 offers many enhancements, including:

  • PowerShell 4.0 – PowerShell 3.0 alone has 2,300 more cmdlets than PowerShell 2.0
  • Hyper-V 3.0 – Supports 64 virtual processors and 1 TB of memory per VM. Also supports the VHDX format for large disk capacity and live migrations
  • SMB 3.02 – Server 2003 supports only SMB 1.0
  • Work Folders – Brings Dropbox-like functionality to your corporate servers
  • Desired State Configuration – Lets you maintain server configuration across the board with baselines (see the sketch after this list)
  • Storage Tiering – Dynamically moves chunks of stored data between slower and faster drives
  • Data Deduplication – Data compression; with Server 2012R2 you can now run Data Deduplication on virtual machines, which is also great for VDI environments
  • Workplace Join – Allows users to register personal devices with Active Directory to gain certificate-based authentication and single sign-on to the domain
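As a quick taste of Desired State Configuration, here is a minimal sketch that declares a baseline and pushes it to a server; the node and feature names are just examples:

    # Declare a simple configuration baseline (PowerShell 4.0 DSC)
    Configuration WebServerBaseline {
        Node "Server01" {
            WindowsFeature IIS {
                Ensure = "Present"   # keep the Web-Server role installed
                Name   = "Web-Server"
            }
        }
    }

    # Compile the configuration to a MOF file and apply it
    WebServerBaseline -OutputPath "C:\DSC\WebServerBaseline"
    Start-DscConfiguration -Path "C:\DSC\WebServerBaseline" -Wait -Verbose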

You can see from just these features how far Microsoft Server OS has come over the last 10 years. Scalability, Speed, Virtualization, Mobile Device Management and Cloud Computing have been vastly improved or were not possible with Microsoft Server 2003.

With current trends moving toward organizations embracing a user-centric environment and cloud computing, Server 2012R2 is a stepping stone in the right direction.

So while the migration to Microsoft Server 2012R2 may be painful, all will be forgotten once the organization and its server administrators can utilize the new features and notice the new ease of daily management activities.

 

 

 

OneDrive – 9 Tips & Tricks the Pros Use

OneDrive is awesome! I love being able to access my files from any device at any location: home, work, mobile, tablet. Over the last few months, Microsoft has continually upgraded the service. As more and more users are getting familiar with OneDrive, I wanted to compile a list of my favorite tips and tricks to get you up to speed using OneDrive like a pro!

1. Use Version Control

For those who don't know the history of OneDrive, it spawned from the MySites concept in SharePoint 2007 and 2010, which means that a lot of great SharePoint functionality is available in OneDrive, like version control. If you collaborate on documents with many people through OneDrive, it helps to know what changes were made and who made them. To view older versions, go to the OneDrive website, right-click the file, and select Version History from the drop-down menu. You'll be able to see previous versions of the file, who made changes, and when. OneDrive will store up to 25 versions of a file!

2. Move the OneDrive Folder

In some instances, you might not prefer the default location for the OneDrive folder, which appears as a subfolder in your user profile folder. You can move the folder at any time on any device, which is especially helpful on tablets with limited storage. Read the rest of this post »

Watch Out Amazon – Microsoft Azure is Here to Rule


With my job I end up doing some travel, and with that comes the fun of meeting new people, hearing new stories, and traveling to new places when possible. My last trip was to the East Coast, covering New York and Florida, and it got me connected with some interesting folks. Waiting at the airport for our flight, which eventually took off the next day (12 hours of mechanical trouble), I met some folks from the cloud business: one was a Google advocate, and the other was from Amazon. With me being in the Microsoft world, it was a perfect mix of expertise. By the time we were three drinks in, we had laid out all aspects of cloud IaaS, PaaS, and SaaS. There was no winner at the end (you certainly don't want one in a bar), but I like to think all of us came out wiser than before.

This made me think how little people know about Microsoft's cloud platform, Azure. I am always amazed when people refer to Azure as only Infrastructure-as-a-Service (IaaS). What many don't know is that Azure has a wide variety of features in its Platform-as-a-Service (PaaS) and Software-as-a-Service offerings. In fact, Microsoft was one of only two vendors described as leaders in Gartner's application PaaS (which it calls aPaaS) Magic Quadrant. Azure as PaaS allows the creation of scalable applications and services, and supports multi-tier scenarios and automated deployments. As for SaaS, what better example than Office 365?!

So here is my attempt to share my experience and knowledge about Microsoft Azure, in the hope that it will help you make an informed decision when selecting the cloud platform that best fits your business. To keep it engaging, I will break this blog post into two parts: the first more strategic, the second tactical, covering Azure services and features.

In this Part 1 of 2, let's start with the lowest common denominator among all cloud platforms: storage.

Storage


Lately you might have noticed an influx of news around increased cloud storage and reduced costs. The storage wars between Microsoft, Google, and Amazon are at their peak. Here is the latest cost analysis among these three leaders (note: up to date as of this post's publication).

Plan                 Free                 100 GB               Larger plans
Microsoft OneDrive   15 GB                $1.99/month          $3.99/month for 200 GB (1 TB is available with OneDrive for Business)
Google Drive         15 GB                $1.99/month          $9.99/month for 1 TB
Amazon Cloud         5 GB                 $4.17/month          $41.67/month for 1 TB
Winner               Microsoft & Google   Microsoft & Google   N/A (similar plans do not exist in OneDrive)

 

Now let's take a peek at the enterprise storage pricing comparison. This is not comprehensive, as there are too many factors in play and pricing will depend on egress, the redundancy selected, etc., and hence there is no clear winner.

Plan                   1 TB        50 TB           100 TB
Microsoft Azure        $40/month   $2,000/month    $4,000/month
Google Cloud Storage   $26/month   $1,300/month    $2,600/month
Amazon AWS             $30/month   $1,475/month    $2,926/month

 

IaaS Magic Quadrant


During our talks we all agreed on one point: Amazon is the gorilla in the cloud business based on sheer revenue, but looking at the statistics, Microsoft is closing the gap at a rapid pace. In the May 2014 Gartner Magic Quadrant, even though Amazon leads the space, an interesting point to note (and one which isn't easy for the tech industry to digest) is the inroads Microsoft has been making in strategic accounts. It is quite evident that Microsoft Azure IaaS is catching Amazon AWS in terms of functionality, automation, and innovation. Microsoft has been gaining traction with late adopters, and even though prospects consider their existing incumbent vendors first, this quadrant shows that Microsoft is a top contender in the market.

Business Continuity

With the recent acquisition of InMage (which provides continuous data protection), Microsoft took another step towards business continuity solutions in the cloud. In a recent email, Satya Nadella, Microsoft's CEO, outlined his mission statement to employees. He stated his goals and shared his vision for Microsoft in this 3,100-word email, where he put forth the company as mobile-first and cloud-first. Staying true to his roots, he invested in acquiring InMage, focusing on data protection and backup retention. Over time, InMage will be rolled into the Microsoft Azure Site Recovery service to add scale to the company's newly added disaster recovery and business continuity offering. Last month Microsoft also launched the StorSimple appliances, which provide storage as a tier by integrating cloud services with on-premises storage.

Predictive Analysis

Microsoft recently launched a preview of a new service called Azure Machine Learning. It is a public cloud-based service that lets developers embed predictive analytics into their applications. Think of the value this can provide when you combine it with customer CRM and marketing systems, generating scores that predict customer behavior. Another great upcoming feature in Office 365 is Delve (formerly Oslo), and if you put two and two together, you'll realize that the horsepower behind Delve is nothing else but Azure Machine Learning.

Security

Azure recently announced additional enhancements that honor Microsoft's commitments to security and increased transparency. This includes stronger cryptography, with enhancements to the default Transport Layer Security (TLS)/Secure Sockets Layer (SSL) cipher suites and the enabling of Perfect Forward Secrecy (PFS). PFS uses a different encryption key for every connection, making it more difficult for attackers to decrypt connections. This encryption work builds on the existing protections already in many Microsoft products and services, such as Microsoft Office 365, Skype, and OneDrive. Customer content moving between customers and Microsoft is encrypted by default, and all of the key platform, productivity, and communications services encrypt customer content as it moves between Microsoft data centers. Data traveling between services (e.g., from one email provider to another) is protected, and customer content stored in Azure is encrypted (there are also tools that let developers easily protect data).

I sincerely hope this provides you with valuable insights, enough to get you excited about Microsoft Azure. If it does not, then don't forget to check back in a few days for Part 2 of this series, which will contain information on more in-depth features of Azure like media services, mobile services, Azure Web Sites, Azure Files, and more.