Brian O'Donnell, Author at Perficient Blogs
https://blogs.perficient.com/author/bodonnell/

What's ahead for the Xbox One? (Mon, 31 Mar 2014)
https://blogs.perficient.com/2014/03/31/whats-ahead-for-the-xbox-one/

Microsoft has always marketed the Xbox One as a gaming machine that can enhance your living room.  This positioning served the Xbox One well, but it has recently faltered as the Sony PlayStation 4 (PS4) has proven to be the more powerful machine.  This has given customers the impression that the Xbox One is a less capable gaming console that costs $100 more than the PS4.  The last week and a half brought a couple of news items that left gamers and Xbox One owners scratching their heads.
DirectX 12 revealed
Last week at the Game Developers Conference (GDC), Microsoft announced a new version of DirectX (DirectX 12), their 3D graphics framework.  The focus of DirectX 12 is a more efficient API that lets developers get closer to the hardware than ever before.  This allows developers to target a variety of hardware with a single API, distributing the workload across multiple cores and squeezing every last drop of power out of the respective hardware.  This approach is very similar to console development.  The main difference is that a console has a single hardware configuration, whereas PCs can have nearly infinite configurations.
Introducing DirectX 12 at the very beginning of the new console generation is extremely beneficial for Microsoft and the Xbox.  Any benchmark differential the Xbox One has with the PS4 could potentially be narrowed through software if game developers use DirectX 12.  This is also great news for PC gamers.  The Xbox One essentially runs Windows 8.1; a game written for the Xbox using DirectX 12 can be ported to the PC version of Windows with few code changes.  When Microsoft completes the API consolidation across the Windows platform in 2015, deploying to the Xbox and any other Windows device should be as simple as selecting your target and pressing 'Build'.
Facebook buys Oculus for $2 billion
Oculus got its initial funding from a Kickstarter campaign and has become the leader in virtual reality technology.  At GDC, Oculus revealed the second iteration of its developer kit along with initial plans to begin selling its virtual reality headset, the Rift, to consumers later this year.  The consumer device works with PCs, and developers can target it specifically to create compelling virtual reality environments for their games.  Sony also revealed its own virtual reality technology to compete with Oculus.  Sony's headset, Project Morpheus, is positioned as an add-on for the PS4.  Strangely, Microsoft was absent from any discussion of virtual reality technology.  In a shocking move shortly after GDC, Facebook announced it would purchase Oculus for approximately $2 billion.
One has to ask: where was Microsoft during all of this?  The company was quick to release the Kinect after Nintendo popularized motion-based gaming but has remained strangely quiet regarding virtual reality.  Does it have a plan for virtual reality technology on the Xbox One, or does it see it as a passing fad?  Microsoft may indeed have some Xbox news at next week's Build conference, as the Xbox One OS is due for a large update.

Busy Pre-Build week for Microsoft and Azure! (Fri, 28 Mar 2014)
https://blogs.perficient.com/2014/03/28/busy-pre-build-week-for-microsoft-and-azure/

The Microsoft Build Conference is set to kick off next week but the company got off to an early start this week with several different announcements.
Windows Azure now generally available in China
This may not sound like an accomplishment worthy of being called out individually, but a little-known fact is that Windows Azure is the first major public cloud service made generally available in China.  This opens Azure up to an enormous user base that cloud competitors Google and Amazon don't yet have access to.
Windows Azure will soon be re-branded Microsoft Azure
In an effort to strengthen the Azure brand, Microsoft is removing "Windows" from the name.  This is to help emphasize that the Azure platform is completely open and that a variety of technologies can run on it, not just Microsoft and Windows-based technology.  The name "Windows Azure" has been a source of confusion since its introduction: people new to cloud computing often assumed that only Windows-based technologies were designed to work on the platform.  The name change should clear up any lingering confusion.
Office for iPad debuts along with Enterprise Mobility Suite 
On Thursday Microsoft announced a fully functional, touch-friendly edition of its Office suite tailored for the iPad.  This has been a long time coming, as millions of iPad users have had to find other ways to edit documents on their tablets.  The entire Office suite is free to download and can be used to view documents and presentations.  Editing documents requires an Office 365 subscription, priced at $99 a year.  That subscription also provides desktop versions of Office 2013 as well as an Exchange Online account.
The Enterprise Mobility Suite aims to bring single sign-on to users across a variety of devices and services.  An Android tablet, an iPad, or a Windows 8 machine could use Azure Active Directory to authenticate against Office 365, Dynamics CRM, and Windows Intune, as well as a variety of third-party products already available.  This puts Microsoft technology at the very core of the enterprise cloud while letting users bring their own devices.
Microsoft is sure to provide more insight into this strategy next week at the Build conference, in addition to its future roadmap for Windows!

Strengthen Company Culture with Yammer enhanced by HDInsight (Wed, 19 Mar 2014)
https://blogs.perficient.com/2014/03/19/strengthen-company-culture-with-yammer-enhanced-by-hdinsight/

In a world of broadband internet connections, online collaboration tools, and the ability to work from almost anywhere, office culture can be difficult to sustain.  This holds especially true for people who live in large cities (where the commute can be problematic) or in harsh climates (like the never-ending winter in Chicago this year).  Yammer can help by enabling remote social interaction.
Yammer is an enterprise social network that aims to connect people in the office.  Its features include instant messaging, user profiles, a primary news feed, interest groups, recommendations for people to follow and groups to join, and a recent activity feed.  The interface is clean and well designed, and once you start using Yammer it is easy to keep going.
There is one area where Yammer seems to fall short: there is no clear way to bring together people who have common interests.  The users and groups Yammer recommends to me are based on the groups I am part of and the people I follow.  It does not take into consideration any of the data in my user profile.
Perficient recently held a hack-a-thon where my team identified this shortcoming.  Social interaction via online collaboration tools wasn't cutting it.  In an online culture, how can we leverage all of our tools to facilitate more meaningful social gatherings?  The answer was to use the interest data that co-workers have already provided through Yammer to generate meaningful recommendations.  A Yammer profile consists of many different interest fields: it lists categories such as Expertise, Interests, Previous Company, and Schools Attended.  All of these can be treated as conversation topics and used as common social interests.
This is where HDInsight, powered by Hadoop and Mahout, can help.  Mahout can consume massive quantities of information and return the logical connections represented within the data.
Using an HDInsight Hadoop cluster in coordination with the Mahout recommendation engine, we could provide meaningful recommendations to users based on their individual interests.  This wouldn't just recommend topics a user might be interested in, but also groups they could create or join with other users who share those interests, similar to how Facebook suggests people you may know, groups to join, or pages you may like.
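Before reaching for a cluster, the core idea can be illustrated in a few lines.  This is a hypothetical sketch (the profiles and threshold are invented for illustration): it scores how similar two Yammer profiles are using the Jaccard index over their interest sets, the same kind of co-occurrence signal a Mahout recommender computes at scale.

```python
# Hypothetical illustration of interest-based matching: score profile
# similarity with the Jaccard index (|intersection| / |union|).
# Mahout computes comparable similarity measures across millions of users.

profiles = {
    "ana":   {"hadoop", "cycling", "chess"},
    "brian": {"hadoop", "azure", "chess"},
    "chen":  {"cooking", "travel"},
}

def jaccard(a: set, b: set) -> float:
    """Fraction of combined interests that two people share."""
    return len(a & b) / len(a | b)

def recommend(user: str, threshold: float = 0.3) -> list:
    """People whose interests overlap enough with `user` to suggest a group."""
    me = profiles[user]
    matches = [(other, jaccard(me, profiles[other]))
               for other in profiles if other != user]
    return sorted(other for other, score in matches if score >= threshold)

print(recommend("ana"))  # brian shares hadoop and chess; chen shares nothing
```

At scale the pairwise loop is replaced by a distributed similarity job on the Hadoop cluster, but the recommendation signal is the same.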
Creating these logical online groups would connect the dots, uncovering similarities between people that might otherwise remain hidden.  It could also help facilitate in-person group outings, social gatherings, or simply more friendships and camaraderie in the office.  The result is a more meaningful environment aided by technology.
A thriving office culture can stand out in a world where telecommuting tends to be more convenient.  A solution like this may not convince everyone to come in, but instead of viewing the commute as obligatory, more people may choose to come to the office for the social camaraderie.  All of this can be done for free through the Yammer API and a Windows Azure account.

Windows Azure: Retiring Windows Server 2008 and how to upgrade (Wed, 12 Mar 2014)
https://blogs.perficient.com/2014/03/12/windows-azure-retiring-windows-server-2008-and-how-to-upgrade/

Beginning on June 2, 2014, Windows Azure will retire Windows Server 2008.  This means you will no longer be able to deploy a new Cloud Service, or manage your existing services, on virtual machines running Windows Server 2008.
Windows Azure currently supports four different Guest OS families:

  • GuestOS 1.x – Windows Server 2008
  • GuestOS 2.x – Windows Server 2008 R2
  • GuestOS 3.x – Windows Server 2012
  • GuestOS 4.x – Windows Server 2012 R2

If your Cloud Service has not been upgraded and is still running on Windows Server 2008, you must upgrade the servers that power your service.  How do you do that?  Isn't the point of running a PaaS cloud service, instead of using IaaS, that the operating system and hardware are handled for me?  The short answer is yes, but...
PaaS will take care of much of the hardware, IIS patches, and OS patches for you, but Azure will not perform a full OS upgrade across your entire service unless you tell it to.  This is because incompatibilities between cloud services and operating systems are likely to arise, which would force developers to try to fix code on the fly.  That is not only bad for uptime but could also open some very serious security holes.
Thankfully, manually upgrading the server OS for your service is a thing of the past.  Azure makes it easy to upgrade the guest OS for your service.  You can even keep your production service on Windows Server 2008 while upgrading your staging environment and deploying your service there.  This allows developers to fix any bugs introduced by the operating system upgrade.
How do you upgrade your staging environment?  It is pretty straightforward.  From the cloud service dashboard, select your staging environment and choose Configure.  At the bottom of the page, find the operating system section.  You will see drop-down menus for OS Family and OS Version.  Select the proper OS Family (in this case, anything but 1.x) and OS Version.  To always have the most up-to-date OS version, select Automatic; this ensures your cloud service will always run on the latest Azure VM available.  If you do not want this, select a static OS version, which guarantees your cloud service will remain on that OS until you upgrade it in the future.
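The same settings live in the cloud service's configuration file, so they can also be changed as part of a deployment rather than through the portal.  A minimal sketch of the relevant attributes in a ServiceConfiguration (.cscfg) file might look like this (the service and role names are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- osFamily="4" selects the Windows Server 2012 R2 family;
     osVersion="*" opts in to automatic guest OS updates. -->
<ServiceConfiguration serviceName="MyCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    osFamily="4" osVersion="*">
  <Role name="MyWebRole">
    <Instances count="2" />
  </Role>
</ServiceConfiguration>
```

Pinning a specific `osVersion` string instead of `*` corresponds to choosing a static OS version in the portal.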
When the service is cleared for production, it is time to configure your production environment.  Upgrading the production environment directly can cause downtime for your service, but there is a way to avoid it.  Instead of configuring staging and production independently, you can swap the two environments using the Swap option in the portal.  This effectively promotes your staging environment to production.  The change happens within seconds, and any downtime will be minimal.
After the swap, you can rebuild and configure the former production environment (now your staging environment) to match your current production environment.

What is Net Neutrality and why is it important? (Mon, 10 Mar 2014)
https://blogs.perficient.com/2014/03/10/the-battle-for-net-neutrality-and-why-it-should-matter-to-you/

The idea behind Net Neutrality has come under increasing scrutiny recently.  Some groups are very outspoken about the importance of Net Neutrality and keeping the Internet unregulated, but for the majority of internet users the topic falls on deaf ears.  This is largely because Net Neutrality attempts to protect the Internet against regulation and monetization that has not happened yet, but could happen and, some argue, has started happening.  It is difficult to explain Net Neutrality and what could happen if the Internet were heavily regulated, because there is no historical evidence to point to.
The fight for Net Neutrality is the fight for an open, unregulated Internet.  Supporters want the Internet to remain completely open, belonging to no one.  Access should be hassle-free, available to everyone, uncensored, and extremely competitive where possible.  If this sounds familiar, it is because until 2014 this is (for the most part, in the United States) how the Internet has operated.  Heavy Internet regulation has never happened before, so it's hard to point at an example and state with conviction, "this will happen."  What I can tell you is what is happening now and why Net Neutrality is important.
This is how the Internet typically works: a customer subscribes to an Internet Service Provider (ISP) for a connection to the Internet and also subscribes to a variety of services available through that connection.  I may have a Netflix subscription, stream or download music, play games online, watch YouTube videos, and connect to an encrypted VPN.  These services are either free or paid subscriptions, delivered to me through the internet connection I also pay for.  You expect these services to be available regardless of which ISP you subscribe to, and you expect service quality to be limited only by how fast your connection is and how fast the service can be delivered.  For example, with a 25 Mbps downstream connection through Comcast, I would expect the quality of Netflix videos to be the same as on a 25 Mbps downstream connection through AT&T.  Assumptions like this are common sense: as a customer, you expect access to the entire Internet at the speed you pay for.  What if that wasn't the case?
Imagine a world where you must choose an internet provider based on the services you consume, a world where I choose an ISP because the services I subscribe to are delivered in better quality there.  Perhaps on Comcast I can stream Netflix in full high definition, but on AT&T I can only stream it in standard definition, even though my connection on either service is 25 Mbps downstream.  To take this a step further, imagine having to check which internet services are even included in a subscription.  I have to make sure services like Netflix, iTunes, and Spotify come with the 25 Mbps downstream I am paying for, and then notice that streaming HBO Go requires an extra monthly fee.  I previously wrote about how Windows Azure delivers the Winter Olympics online; the same extra-fee argument could be made for streaming special events like the Olympics, the World Cup, or the Super Bowl.  This sounds a lot like the cable subscription model we are all familiar with, except that I am paying for premium services instead of premium channels.  That could be very problematic for a world that is becoming comfortable with, and reliant on, cloud computing services.
This imaginary world is already starting to take shape.  Netflix, a strong Net Neutrality advocate, is the largest generator of traffic on the Internet.  Some ISPs have demanded Netflix pay extra fees because of the amount of traffic the service generates.  Netflix repeatedly refused, stating that doing so would violate Net Neutrality, and the service has been plagued with performance degradation ever since.  With the FCC losing the battle for Net Neutrality, Netflix changed its mind and entered into an agreement with Comcast, the nation's largest ISP, which sets a dangerous precedent.  Comcast will now deliver Netflix video faster than competing ISPs, which will undoubtedly become a selling point for customers.  Suddenly quality of service depends not on your connection speed but on your provider.  Essentially the first "premium service" has been created, even though the cost has not yet been passed on to consumers.
Prioritizing specific content is a direct violation of Net Neutrality, and that is precisely what has happened with Netflix.  Based on this outcome, it can be argued that any internet service generating large amounts of traffic will be treated the same way.  It does not matter whether the service runs on Amazon Web Services, Windows Azure, or Google App Engine; if it is popular and generates enough traffic, be prepared to pay for delivery.
Net Neutrality is important because breaking it fundamentally changes how the Internet works, and the more it is violated, the harder it may be to re-establish.  The Internet is arguably the greatest tool ever made.  It has worked the way it was designed since its inception.  For the first time, its fundamental design is being challenged, and that should be a very scary thing for everyone, not just power users.

Windows Azure: How to create a streaming media service (GUI) (Mon, 17 Feb 2014)
https://blogs.perficient.com/2014/02/17/windows-azure-how-to-create-a-streaming-media-service-gui/

In my previous post I discussed how Microsoft and NBC are streaming every event of the Sochi Olympic Games, live and on demand.  Azure makes publishing and streaming video easier than ever before.  This post walks through creating a Media Service, uploading content (video or audio), encoding it, and publishing it for consumption, all from the Azure management portal.

  • To start, log into Azure and go to the portal (if you don't have an Azure account you can get one for free at http://windows.azure.com).  Select 'Media Services' in the left-hand navigation bar, then select 'New'.
  • Fill in the requested information.  Note that in order to create your Media Service you must use an existing storage account or create a new one.  In this example we will create a new storage account, which will hold all of the media we would like to stream from Azure.
  • After your Media Service is created, your dashboard should look similar to the screenshot in the original post.  Our next step is to upload some content.  Click the 'Upload' button.
  • You can select content to stream in two ways: upload content stored locally on your computer, or "upload" content to Media Services that is already located in Azure blob storage.  The content can be located in any storage account you have access to in Azure.
  • Once the media is uploaded we can publish or encode it.
  • Select 'Encode'.  This opens a small modal listing the various options Azure provides for video encoding.  For this example we will stick with the Common presets.  In a later post I will show you how to enable IIS Smooth Streaming content.  Smooth Streaming is an adaptive bit rate technology that operates over the HTTP protocol; it is what is being used for the Olympic live and on-demand video streams.  Select 'Playback via HTML5' and click the check mark to begin the encoding process.
  • Depending on the size of the uploaded video file, encoding may take a few minutes.  If you upload and encode videos in bulk, Azure automatically creates an encoding queue for you.  The default Media Service setting enables just one encoding task at a time; you can increase this so that up to five encoders run at once.  This does not decrease the time each video takes to encode; it means you can encode up to five files simultaneously.  To do this, click on the 'Encoding' tab and move the slider up.  Note that this will incur an extra charge based on time used.
  • Once the encoding is complete, click on the 'Content' tab and select the newly encoded item.  Select 'Publish' to make the video file accessible on the web.
  • Copy the hyperlink in the 'Published URL' column, open a new browser window, and paste the URL.  Your HTML5 video should begin playing in the browser with a full-featured player; you can even go full screen.  If your internet speed can't keep up with the quality of the encoded video, you can create as many different encoded versions of the video as you like by repeating the encoding process above.

This was a very basic introduction to how easy it is to use Azure Media Services.  Next time we will automate this process by using the Azure Media Services SDK to upload and encode content, and stream and deliver the content programmatically from an existing cloud service.
How Windows Azure delivers the Olympics (Mon, 10 Feb 2014)
https://blogs.perficient.com/2014/02/10/how-windows-azure-delivers-the-olympics/

NBC and Microsoft recently announced they are streaming every event of the 2014 Winter Olympics to any iOS, Android, or Windows device using Windows Azure Media Services.  What is Windows Azure Media Services (WAMS), and how does it work?
WAMS is a cloud-optimized edition of the Microsoft Media Platform (MMP), which handles a variety of tasks such as format conversion, media encryption, and analytics, with on-demand and live streaming capabilities.  The Microsoft Media Platform is traditionally confined to a server farm, but by leveraging Windows Azure, WAMS has nearly limitless compute and streaming capacity.
When considering infrastructure, it is important to decide which configuration to use.  There are two options to consider: Infrastructure as a Service and Platform as a Service.

  • Infrastructure as a Service (IaaS).  Using this method we must set up and configure virtual machines (VMs) to connect to our WAMS setup.  To utilize IaaS auto-scaling we must create additional VMs to handle requests when demand is high.  This means we must forecast an approximate number of active streaming requests, create the right number of VMs to handle them, and turn on the auto-scale feature to utilize the dormant, yet pre-configured, VMs.
  • Platform as a Service (PaaS).  With PaaS there is no extensive VM configuration.  After deploying your cloud service and configuring IIS once, you can depend on Azure to auto-scale your cloud service automatically, without pre-configuring additional VMs for a "just in case" scenario.  There is no need to forecast the number of concurrent requests at any given time: as long as IIS is set up once to serve on-demand and live streaming media correctly, it is set up for your cloud application no matter how great the demand.  Essentially, by giving up some control over configuration we save a lot of work.  This is the method most likely being used to deliver the Olympics.

The setups for live streaming and on-demand streaming differ slightly in how footage is captured and consumed by the public.

  • The live streaming setup involves footage being captured, encoded, and then sent to web roles in Azure (typically referred to as "ingest" servers).  This can work with a single web role, but additional web roles can be used for redundancy; the additional web roles can consume the data as long as they are at different DNS addresses.  In this situation, multiple web roles are probably used for worldwide redundancy.  As the data is pushed to the cloud, content delivery web roles begin pulling it and pushing it to the requesting parties.
  • On-demand streaming does not require the high-speed capture and encoding of live footage, but it does require an enormous amount of storage capacity.  Every event during the Olympics will be available for on-demand streaming, which means every event must be captured and stored in Azure blob storage.  Every event is captured in full HD (1920 x 1080 resolution), which will amount to a substantial amount of data, probably several terabytes.  While the live streaming web roles pull the encoded live stream, the on-demand web roles stream the stored media files.  Sending a full HD stream to a device such as a cell phone with limited bandwidth is not an efficient distribution process, so Azure utilizes a technology called Smooth Streaming.

Smooth Streaming is a dynamic content delivery technology that adapts the stream sent to a requester based on their bandwidth.  It is used for both on-demand and live streaming events.  To deliver content at a consistent frame rate, free of lag or pixelation, the video is broken into small fragments.  As fragments are delivered and played, the time it took to play each fragment, along with the playback quality, is reported back to Azure.  If the quality or playback time does not meet the standards set on the server, the next fragment is sent at a lower quality, and the process repeats.  If bandwidth increases, a higher-quality version of the next fragment is sent.  As you can imagine, this means every Olympic event needs to be stored in full HD and in several lower-quality tiers of fragments in order to deliver content to every type of device over any kind of bandwidth.
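The adaptation loop described above can be sketched in a few lines.  This is a simplified, hypothetical model (the bitrate ladder and switching thresholds are invented for illustration), not Microsoft's actual Smooth Streaming heuristics:

```python
# Toy model of adaptive bit rate selection: pick the quality tier for the
# next fragment from the throughput measured on the previous fragment.
# The ladder and the 1.2x headroom rule are illustrative inventions.

TIERS_KBPS = [350, 700, 1500, 3000, 6000]  # hypothetical quality ladder

def next_tier(current_index: int, measured_kbps: float) -> int:
    """Return the index of the tier to request for the next fragment."""
    # Step down if throughput can't sustain the current tier...
    if measured_kbps < TIERS_KBPS[current_index]:
        return max(0, current_index - 1)
    # ...step up only if there is clear headroom for the next tier.
    if (current_index + 1 < len(TIERS_KBPS)
            and measured_kbps > TIERS_KBPS[current_index + 1] * 1.2):
        return current_index + 1
    return current_index

# Simulate a viewer whose bandwidth dips and then recovers.
bandwidth_trace = [4000, 4000, 900, 900, 5000, 8000, 8000]
tier = 2  # start at 1500 kbps
history = []
for kbps in bandwidth_trace:
    tier = next_tier(tier, kbps)
    history.append(TIERS_KBPS[tier])
print(history)  # -> [3000, 3000, 1500, 700, 1500, 3000, 6000]
```

The stream degrades gracefully during the dip and climbs back one tier per fragment as bandwidth returns, which is why playback stays smooth instead of stalling.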
The Olympics is no doubt one of the most watched events of the year.  By utilizing dozens of Azure data centers to capture, replicate, and deliver content all over the world, Microsoft is once again showing what can be accomplished with Windows Azure.  Microsoft began streaming the Olympics in 2008 and has since quietly become a media streaming powerhouse, able to deliver content to millions at a moment's notice.

New Microsoft CEO and the ripple effects (Tue, 04 Feb 2014)
https://blogs.perficient.com/2014/02/04/new-microsoft-ceo-and-the-ripple-effects/

This morning Microsoft announced that Satya Nadella will take over as its new CEO.  Nadella is the former head of the Cloud and Enterprise division at Microsoft.  For the last few years Microsoft has focused on providing a better cloud solution, rapidly iterating on Windows Azure, SharePoint Online, Office 365, Xbox Live, and, most recently, Visual Studio Online.  Many of these services did not exist 3-4 years ago.  This is in addition to former CEO Steve Ballmer stating that Microsoft was transitioning to a devices and services company.  Hiring a cloud computing expert as the next CEO boldly states where Microsoft is placing its bets.  Even more telling is that Nadella's tenure as CEO is effective immediately, meaning Ballmer is going into early retirement; he was not scheduled to step down until August 2014.
What about the consumer?  Building a first-class cloud service is great, but you need people to use it.  You can argue that the enterprise business will provide users, and that is true to an extent.  But the enterprise is becoming more "bring your own device" (BYOD) friendly, meaning Microsoft still needs consumers to purchase its devices; it cannot flourish on the enterprise alone.  Nadella does not possess a strong portfolio in the consumer space.  He is a cloud expert, which puts him on the back end of product design.  His cloud expertise empowers those devices but does not put them in consumers' hands.
Enter Bill Gates.  Nadella asked Gates to take a larger role at Microsoft as Technology Adviser.  Gates also stepped down as Chairman of the Microsoft Board, effective immediately, in order to immerse himself in the new role.  Some may argue that Gates is also not a product expert.  While that may be true in some respects, his tenure leading Microsoft was anything but a failure.  Gates has also had the unique vantage point of sitting on the sidelines during times when technology was influencing culture the most.  The rise of the iPod put gigabytes of music in everyone's pocket; Microsoft's attempt at a competitor, the Zune, was unsuccessful.  Windows Mobile was once a prominent figure in mobile computing; then the iPhone and Android transformed the industry, and Microsoft had to play catch-up (and still is) in the fastest growing technology market in history.  Watching the company he founded lose out to its direct competitors should provide ample motivation to recapture consumer excitement.
The partnership of Gates and Nadella is exciting for many reasons. It brings together the future aspirations of the company (cloud, Nadella) and the founder who built it into a global power (Gates). It will certainly provide a boost for Microsoft and reaffirms its goal of becoming a Devices and Services company. Now the people leading the company can be considered experts in each of those facets.

Windows Azure: PaaS changing the landscape of online gaming (Tue, 28 Jan 2014)
https://blogs.perficient.com/2014/01/28/windows-azure-paas-changing-the-landscape-of-online-gaming/

Titanfall is a new blockbuster game for the Xbox One. It is being published by Electronic Arts and is due to be released in March 2014. Titanfall is a first-person shooter that will have much of its AI, physics calculations, online matchmaking and multi-player dedicated servers hosted in Windows Azure. This means several things:

  1. Azure's IaaS provides dedicated servers for multi-player games, with near-infinite bandwidth, low server pings and anti-cheat enabled
  2. Azure’s PaaS is being utilized to provide physics calculations and specialized AI to learn your style of play
  3. PaaS and dedicated servers auto-scale to deliver fast dynamic content to players around the world at a consistent level of service

Multi-player infrastructure background

Traditionally, multi-player games have used a client/server paradigm. This paradigm generally involves a computer acting as a dedicated server for the game. The dedicated server accepts connections from a specific number of players and handles communication between the clients. The server normally does not perform any game-relevant calculations; it acts as a central repository to which players send update information, which is then distributed to and consumed by every client.
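The relay pattern described above can be sketched in a few lines. The class and method names here are illustrative, not taken from any real game engine or networking library:

```python
# A dedicated server in the client/server paradigm: it accepts a fixed
# number of players and relays each player's state update to every other
# connected player, doing no game logic itself.

class DedicatedServer:
    def __init__(self, max_players):
        self.max_players = max_players
        self.players = {}    # player_id -> latest reported state
        self.outboxes = {}   # player_id -> updates waiting for delivery

    def connect(self, player_id):
        """Accept a new player unless the server is full."""
        if len(self.players) >= self.max_players:
            return False
        self.players[player_id] = None
        self.outboxes[player_id] = []
        return True

    def receive_update(self, player_id, state):
        """Store the sender's state and fan it out to every other client."""
        self.players[player_id] = state
        for other in self.outboxes:
            if other != player_id:
                self.outboxes[other].append((player_id, state))
```

In the player-host model discussed next, one player's machine plays the role of `DedicatedServer`, which is exactly why that player enjoys faster updates than everyone else.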
Recently the game development community has moved away from the dedicated server model due to operational cost and replaced it with a player-host model. In the player-host model, one player hosts the game and every other player connects to the host. This paradigm has several disadvantages for networked multi-player gaming but was adopted to save the cost of running dedicated servers as game hosts. A few of the obvious disadvantages of the player-host model are:

  1. Inconsistent bandwidth and server lag of the player chosen to be the host
  2. No anti-cheat enabled on host
  3. Slower updates / increased lag due to server not being dedicated
  4. Local player receives faster updates than other players

How Azure fixes this

Depending on a cloud infrastructure for a fast-paced, reaction-driven game is a significant leap of faith. Video games generally run in a continuous loop created by the game engine, repeatedly updating all of the game data (AI, particles, physics, player movement, event handling, etc.) and then drawing that data to the screen. It takes a substantial amount of CPU and GPU power to calculate and render all of the in-game objects fast enough to achieve the target of 60 frames per second.
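That update-then-draw cycle can be sketched as a minimal fixed-timestep loop. All names are illustrative; a real engine interleaves input, AI, physics and rendering inside the update phase rather than just counting iterations:

```python
# Minimal sketch of a game loop targeting 60 updates per second.
# Each pass performs an update phase (game data) and a draw phase
# (rendering), which is the work the article says must finish fast
# enough to hit 60 frames per second.

TARGET_FPS = 60

def run_game_loop(total_seconds):
    """Simulate `total_seconds` of game time; return (updates, frames)."""
    steps = int(total_seconds * TARGET_FPS)  # fixed timestep: 1/60 s each
    updates = frames = 0
    for _ in range(steps):
        # update phase: AI, particles, physics, player movement, events...
        updates += 1
        # draw phase: render the freshly updated game state
        frames += 1
    return updates, frames
```

Offloading part of the update phase to the cloud, as Titanfall does, shrinks the work each 1/60-second slot must complete locally.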
The developer of Titanfall, Respawn Entertainment, is utilizing Azure PaaS to handle several expensive calculations normally performed by the local host (console or PC). These calculations are typically done locally so the player experiences minimal lag. With these calculations offloaded to the cloud without affecting gameplay, developers can dedicate the Xbox One hardware to more graphically intense environments. This strategy could also extend the life of the Xbox One further into the future.
Cloud computing services such as Azure have made dedicated servers economical once again. With automatic server scaling and inexpensive virtual machines, server costs and the hours of manpower required have been significantly reduced. The more calculations performed in the cloud, the more you can do with the hardware available. Put another way, the more calculations you can move to the cloud, the lower the hardware entry point for other platforms. If a developer can process 90% of the intense calculations on an Azure compute cluster, then the hardware needed to play the game can be anything from a tablet to a workstation. This has the opportunity to increase the install base substantially.
Games are real-time applications that depend on milliseconds and timing. Azure is effectively performing calculations for a real-time application and delivering results to multiple parties simultaneously. If the Titanfall launch performs well, expect hundreds of future Xbox One games to utilize Windows Azure, making the cloud (and Azure) a dominant force in multi-player gaming for years to come.

Coin — One card to rule them all? (Mon, 20 Jan 2014)
https://blogs.perficient.com/2014/01/20/coin-one-card-to-rule-them-all/

What is Coin?  Coin is a brilliant new technology that allows users to consolidate all of their cards into a single Coin card.  A Coin card is not your traditional credit card.  It is an electronic device the size of a credit card with a programmable magnetic strip.  Any card with a magnetic strip whether that be a debit/credit card, gift card or preferred customer card, can be put on your Coin card.
The Coin card works over Bluetooth and is paired with your phone. Using your phone and an adapter supplied by Coin, a user swipes their cards, which are then loaded into their Coin account. When a debit card is needed instead of a credit card, make the selection on your phone. The Coin app will send the information to your card and it will be ready for use with that specific card's information. Lose your phone or your card? Have your wallet stolen? That is OK. Coin has security configurations that will deactivate the card automatically if it loses communication with your phone for too long. It sounds as if Coin has thought a lot about security, at least from the physical point of view. What about digitally?
We live in a world where data breaches are common. A story about a large company being hacked and customer information stolen seems to surface semi-regularly. Many times the stolen data is not encrypted, and this unencrypted data contains anything from credit card information to email addresses. Is it safe to put all of my banking, credit and preferred customer information in a single location? Digitally putting all your eggs in one basket is a risky move. If Coin were hacked and your data stolen, what would happen? It is essentially the same as having your entire wallet stolen.
Coin appears to be prepared for this. Coin does not state what user data is stored with them, but they do state that all user data in the cloud, on the mobile app or on the card itself is encrypted using at least 128-bit encryption. In addition, any information transferred via Bluetooth is also encrypted, so personal data could not be used if it were captured during transmission. This means that if the data is stolen from the cloud, phone or card, it is virtually worthless without the decryption key.
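To see why 128-bit encryption makes stolen ciphertext "virtually worthless" without the key, a back-of-the-envelope estimate helps. The attacker speed below is an assumption chosen for the sake of the math, not a property of Coin's actual system:

```python
# Size of a 128-bit keyspace versus a brute-force attacker.
# Assumption (generous to the attacker): one trillion key guesses
# per second, sustained forever.

KEY_BITS = 128
keyspace = 2 ** KEY_BITS                 # number of possible keys

guesses_per_second = 10 ** 12            # assumed attacker speed
seconds_per_year = 60 * 60 * 24 * 365

# Years needed to try every key at that rate.
years_to_exhaust = keyspace // (guesses_per_second * seconds_per_year)
```

Even at a trillion guesses per second, exhausting the keyspace takes on the order of 10^19 years, vastly longer than the age of the universe, which is the whole point of the "at least 128-bit" claim.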
Coin has put the right foot forward in their vision of plastic card consolidation. The strong encryption shows they are serious about data security. With the configurable lockout and deactivation features, they are making every effort to physically secure the device from theft or loss. The technology being used is not new, but the way it is being used is both new and unique. If Coin is as secure as they claim and the concept takes off, expect its popularity to grow exponentially, along with the copycats. The card itself is still in pre-order and is set to be released this summer. You can find out more about Coin here.

Windows Azure: What is Platform as a Service? (Thu, 05 Dec 2013)
https://blogs.perficient.com/2013/12/05/windows-azure-what-is-platform-as-a-service/

What is Platform as a Service (PaaS)? How does it differ from Infrastructure as a Service (IaaS)?
Let's start with IaaS. When "The Cloud" first became popular, IaaS was the target. The point of IaaS is to migrate a company data center into Windows Azure. This involves converting whatever physical servers you have into Hyper-V virtual machines and uploading the contents to Azure (or sending them via FedEx for Microsoft to upload). While this process is cumbersome and time consuming, it does work and has some great benefits. The next step is to create a secured link between your former data center and the Azure data center. Doing this will give your users connectivity to all of their pre-existing applications. In fact, your users should not even notice that the data center has moved. Everything should operate exactly as it did prior to the migration. Depending on the size of your infrastructure, IaaS can save you time, space and money, and it will allow IT admins to breathe easy knowing they don't have to be overly concerned with server hardware failure. From that perspective, IaaS saves you money because you have zero server maintenance and replacement costs. Another benefit of IaaS is that all of your existing third-party software is guaranteed to work. If a business has an old Citrix application that is critical for the accounting team, migrating to IaaS guarantees that application still works as expected. The problem with IaaS is that you are still running an entire infrastructure. IT still has to manage the servers, Active Directory, patches and updates. In short, you aren't realizing many of the major benefits of the cloud.
PaaS aims to solve that.
PaaS in Azure is synonymous with "Cloud Service". The target of PaaS is custom business applications that will replace your existing third-party software infrastructure. These applications can be written in either .NET or Java. In PaaS you essentially rent the hardware the application runs on. Renting the hardware means you do not have administrative access to the virtual machines powering the application. The permissions are set so that Remote Desktop to the virtual machine is enabled and users can administer IIS.
What is the advantage of renting hardware the application runs on?   Why wouldn’t I simply use IaaS to retain tighter control over the environment?
Azure manages the infrastructure powering the application, so IT administrators no longer have to worry about it. The infrastructure used to power a global application with millions of users generating massive amounts of traffic (like Snapchat) is enormous, and the work and time it takes to manage that infrastructure is several times larger. With PaaS you are able to eliminate the management of the environment the app runs on. It is also cheaper to run an application with PaaS than to create dozens upon dozens of virtual machines. Scaling is much easier with PaaS: it is possible to scale to a massive size very quickly (and automatically). To do the same with IaaS, virtual machines must be created in geographically relevant locations.
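As a rough illustration of the kind of scale-out decision PaaS automates for you, here is a toy threshold-based policy. The thresholds, bounds and function name are illustrative, not the actual Azure autoscale API:

```python
# Toy autoscale policy: compare average CPU load against thresholds
# and adjust the instance count within configured bounds. PaaS platforms
# evaluate rules like this on a timer so operators never have to.

def autoscale(current_instances, avg_cpu,
              min_instances=2, max_instances=20,
              scale_out_above=0.70, scale_in_below=0.30):
    """Return the instance count for the next interval."""
    if avg_cpu > scale_out_above:
        # Under pressure: add an instance, capped at the maximum.
        return min(current_instances + 1, max_instances)
    if avg_cpu < scale_in_below:
        # Underutilized: remove an instance, never below the minimum.
        return max(current_instances - 1, min_instances)
    return current_instances
```

With IaaS you would implement and operate this logic (and the machine provisioning behind it) yourself; with PaaS you only declare the thresholds.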
PaaS provides most of the flexibility that IaaS provides. By sacrificing a little control, administrators and developers are able to automate most of the infrastructure maintenance that comes with growing web applications. This not only saves money but allows developers to create new features without having to worry whether the infrastructure can handle them. Whether the application is of massive global scale like Snapchat or simply a small application that applies business logic for users, PaaS on Azure maximizes productivity by removing laborious infrastructure maintenance.

Windows Azure: Web Roles vs. Websites (Wed, 04 Dec 2013)
https://blogs.perficient.com/2013/12/04/windows-azure-web-roles-vs-websites/

Windows Azure has many different vehicles for publishing and consuming content. Two that are often confused are web roles and web sites. The two are very similar and share some common strengths, but under specific conditions one may fit your needs better than the other. For example, web roles and web sites both support auto-scaling, databases, blob storage access, ASP.NET, NodeJS, Python and PHP. Here is a brief overview of specific uses for each.

Web sites:

Web sites on Windows Azure operate much like those run by other web hosting companies, but with added Azure benefits. You can access the site you created via FTP or Git, which means changes you make to the code are updated instantly upon submission. If you need to use any of the templates provided by Windows Azure, then web sites are the way to go. Azure has a slew of templates that you can install, configure and update from the gallery. Everything that you are familiar with in traditional web sites applies to Windows Azure web sites.
Web roles are where the power of the Windows Azure platform begins to shine. Here are some of my favorite advantages of the web role cloud service.

Web role:

A web role is a cloud service. Cloud services are where the Platform as a Service ideology really begins to work for you. Web roles allow developers much more control over the environment. They are created within virtual machines and give developers direct access to IIS (or the web server of their choosing). Because web roles run in virtual machines, they also give you the flexibility of virtual machines without having to manage the virtual environment. What does that mean? With a virtual environment you have to stay up to date on patches for the operating system and web server, along with any other maintenance the machine needs. With a web role that is not necessary. While you still have remote access to the virtual machine your web role is running on, you do not need to manage it. The virtual environments and patches are managed by Azure.
Web roles can also be attached to one or many worker roles. Think of a worker role as a console application that runs a computationally heavy process that may take a considerable amount of time. These types of processes are not for immediate user consumption; they are calculated and stored for future access. For example, generating recommendations for users the way Netflix or Amazon does is computationally expensive. Doing that within a GUI front end like a web site or web role is not practical. Unlike a web site, a web role can call a worker role to run a background process. The worker role can run on a different virtual machine so it does not ruin the performance of your web role. In addition, you can set up a Windows Azure Virtual Network so the web and worker roles are on the same subnet. This allows the two to talk directly to one another inside the firewall instead of being routed out through the internet.
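The web role / worker role hand-off described above is commonly implemented with a queue between the two roles. The sketch below simulates that pattern in-process; the queue and storage stand-ins, function names and the placeholder computation are all illustrative, not Azure APIs:

```python
# Web role / worker role pattern: the web role enqueues an expensive job
# (e.g. recommendation generation) and returns immediately; the worker
# role drains the queue in the background and stores results for later
# reads. An in-process deque stands in for an Azure storage queue, and a
# dict stands in for blob/table storage.

from collections import deque

job_queue = deque()   # stand-in for a storage queue
results = {}          # stand-in for blob/table storage

def web_role_request(user_id):
    """Handle a request quickly: enqueue the heavy work, don't do it."""
    job_queue.append(user_id)
    return f"recommendations for {user_id} are being generated"

def worker_role_tick():
    """One pass of the worker role: process a queued job if one exists."""
    if not job_queue:
        return False
    user_id = job_queue.popleft()
    # The expensive computation would happen here (placeholder result).
    results[user_id] = [f"item-{i}" for i in range(3)]
    return True
```

Because the two roles only share the queue, each side can scale independently, which is exactly the configuration discussed below where one web role is paired with multiple worker roles.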
There are many more advantages to using web roles over websites, but the last few I want to touch on briefly are multiple staging environments, Content Delivery Network (CDN) connectivity, support for unsupported platforms, and running scripts with elevated privileges. Websites do not support CDNs and can't run scripts that require administrative permissions. So if your website requires a CDN, or if you are migrating to Azure and have a few legacy CGI scripts that require admin rights, websites will not work for you.
From a development standpoint, multiple staging environments are a big win for web roles. When a cloud service is set up, a staging and a production environment are created for you. There is no longer the fear of code working in your test environment and crashing your production environment. The environments are identical in Azure: if it works in staging, it will work in production.
Web roles and websites also scale out differently. Websites only need to worry about scaling up your site to keep up with demand. With a web role the scaling is similar, but what if your web application needs two worker roles for each web role to handle all of the computation? Scaling with web roles can facilitate this type of configuration.
I have pointed out just a few of the practical differences between websites and web roles. Websites are perfectly suitable for publishing content fast, accessing databases and processing records. They are great for content management systems, and you can even use them for e-commerce. If the cloud service you build will need multiple virtual machines, a virtual network where information is passed inside the firewall, or any significant back end, then a web role / worker role setup is most likely for you. It provides the flexibility you need without having to manage the infrastructure you are creating.
