Archive for the ‘Windows’ Category

XP end of life, migrate in a few simple steps

Now that Windows XP end of life is here, if you are one of those companies still hanging on, there's likely a bit of panic about what exactly to do. Well, there is good news, bad news, and then some more good news. If you are in an industry with heavy regulatory compliance, like healthcare, you need to be a little more concerned, because you are now in violation of regulations.

If you are not under the microscope of government compliance, then you need not fear. There isn't going to be any major concern if you don't jump immediately, but you should begin planning and make the move within the next year.

If you are one of those heavily regulated companies with big brother looking over your shoulder, then guess what? Time to get the show on the road. Since you don't have a lot of time, here is some good advice to get the job done smoothly and quickly without a lot of headache:

  1. System Center Configuration Manager 2012 – With this Microsoft tool, you will be able to perform Zero Touch installations for your whole organization fairly quickly. The key to leveraging this tool to its fullest is getting your SCCM infrastructure scaled properly and your applications packaged quickly. This product can also manage devices if BYOD (Bring Your Own Device) ends up being the path taken.
  2. Go with Windows 7 – With Windows 7, you’ll still have a similar look and feel to XP, which end users are used to. Going to an entirely new platform like Windows 8 requires more time and will likely also require a lot more training and transitioning with end users. Sticking with something familiar will reduce the shock to the end user base.
  3. Out with the old & in with the new – If it has been a while since you introduced new desktops and laptops, this would be a good time to bite the bullet and have it done. Most manufacturers offer programs to preload your company images and apps, leaving only the task of migrating the user data. This might also be a good time to go with a BYOD solution, where you virtualize the apps and stream them to the device the users choose. SCCM can manage this out of the box.
  4. KIS (keep it simple) – Companies nowadays have allowed complexity to run riot. Unfortunately, I have seen an exorbitant amount of time and money spent on bureaucracy rather than on actually doing the work. If you are one of those companies that lost the balance between security and flexibility due to an absence of checks and balances, well… I feel your pain. This disease has infected the IT world and has become the cause of so much complexity and profit loss for very little benefit; it's hard to fathom (and I will save this for another blog). Get the right project team, with individuals high enough up the corporate ladder to make decisions across multiple departments. In other words, your CIO might need to be a little more involved in this one. Also, go with the new methods, approaches, and technology platforms. The need for massive testing labs and a bare-metal image for every department is over. All your testing and image development can be done through a few simple virtual instances, secured and managed by SCCM 2012.
  5. The right team – One of the biggest mistakes I've recently experienced was simply having the wrong people managing the project. Windows desktops are best managed by Microsoft Windows professionals, not by the guy who used to manage the development department and can only think Agile. Agile and Microsoft infrastructure don't mix well, and you will only add complexity and prolong a fairly straightforward task that needs to be completed.

If you are looking for a consulting team, find one with System Center 2012 experience. This will make the job so much more pleasant and easy to carry out.

That said, I know I'm leaving out a lot of information, but I think I have touched on the most important things to consider if you need to get your company migrated quickly. The most important thing to remember is to go after the Goliath first; once that is out of the way, everything else will likely run smoothly. If you ignore the Goliath, well… good luck.

End of Windows XP Support: What Now, Windows 7 or 8?

After a twelve-year run, the end of life for Windows XP is finally here. So what does this mean for those still on XP? In a nutshell: support and updates will no longer be available, and many machines will be left unprotected and out of compliance, opening the door to vulnerabilities.

The big question going forward is: do I make the big jump to Windows 8 and get the latest operating system, or do I take the smaller leap to Windows 7? Typically this comes down to company culture, strict business needs, and whether the architecture and deployment tools are in place to make it all happen.

Jumping to a new operating system is never easy or painless; there are many things to take into consideration: hardware, application compatibility, deployment methods, training, etc. Thankfully, Microsoft has tools available to aid in the process: ACT (Application Compatibility Toolkit), MAP (Microsoft Assessment and Planning Toolkit), and SCCM (System Center Configuration Manager), to name a few.

This is all great, but here I am, stuck on Windows XP, an operating system that is no longer supported. What should I do: move to Windows 7 or go to Windows 8?


Declarative data caching in .NET business layer

One of the most effective ways to improve application performance is to implement data caching. Most applications repeatedly retrieve the same data from external sources like a database or web service, and if that source data never or seldom changes, the application is just wasting CPU time and I/O querying the source again and again. There are a great many ways to cache data in an application: different software frameworks, standalone applications, and even caching appliances. But probably the most widespread and easiest way to cache data is the built-in .NET MemoryCache class, which has provided simple in-process caching since .NET 4.0.
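
As a rough illustration of the MemoryCache approach described above (the repository, key name, and ten-minute expiration are my own examples, not from the post), a simple read-through cache might look like this:

```csharp
using System;
using System.Runtime.Caching;

public static class CustomerRepository
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the customer list from cache when possible, falling back to
    // the external source only when the cached copy is missing or expired.
    public static string[] GetCustomerNames()
    {
        var cached = Cache.Get("CustomerNames") as string[];
        if (cached != null)
        {
            return cached;
        }

        string[] fresh = LoadCustomerNamesFromSource();

        // Keep the result in memory for ten minutes before re-querying.
        Cache.Set("CustomerNames", fresh,
            new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10) });

        return fresh;
    }

    private static string[] LoadCustomerNamesFromSource()
    {
        // Stand-in for the expensive database or web service call.
        return new[] { "Contoso", "Fabrikam", "Northwind" };
    }
}
```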


Busy Pre-Build week for Microsoft and Azure!

The Microsoft Build Conference is set to kick off next week but the company got off to an early start this week with several different announcements.

Windows Azure now generally available in China
This may not sound like a huge accomplishment worthy of being called out individually, but a little-known fact is that Windows Azure is the first major public cloud service that China has made available. This opens Azure up to an enormous user base that cloud competitors Google and Amazon don't yet have access to.

Windows Azure will soon be re-branded Microsoft Azure
In an effort to strengthen the Azure brand, Microsoft is removing “Windows” from the name. This is to help emphasize that the Azure platform is completely open and a variety of technologies can utilize it, not just Microsoft and Windows-based technology. The name “Windows Azure” has been a source of confusion since its introduction: people who are new to cloud computing often did not know whether only technologies supported by Windows were designed to work on the Azure platform. This name change should clear up any lingering confusion.

Office for iPad debuts along with Enterprise Mobility Suite 
On Thursday Microsoft announced a fully functional, touch-friendly edition of their Office suite tailored for the iPad. This has been a long time coming, as millions of iPad users have had to find other methods of editing documents on their tablets. The entire Office suite is free to download and use for viewing documents and presentations. In order to edit documents, an Office 365 subscription is needed, priced at $99 a year. This subscription also provides you with desktop versions of Office 2013 as well as an Exchange Online account.

The Enterprise Mobility Suite aims to bring Single Sign-On to all users across a variety of devices and services. This would allow an Android tablet, iPad, or Windows 8 machine using Azure Active Directory to authenticate against Office 365, Dynamics CRM, and Windows Intune, as well as a variety of already available third-party products. This puts Microsoft technologies at the very core of the enterprise cloud while allowing users to “Bring Your Own Device”.

Microsoft is sure to provide more insight into this strategy next week at the Build Conference, in addition to their future road map for Windows!

#Lync and the Impacts of Windows XP

It's no secret Microsoft is doing the same to XP as the bad-boy trio from Office Space did to that poor printer.
The deprecation of XP will have an impact on organizations for various reasons, some of which I'm not qualified to speak in depth about, but a key topic on the wire as of late is security. Without a steady stream of updates and patches, you leave your environment largely susceptible to attack.

I can speak more intelligently about XP in the workplace and how it works with Lync… or how it DOESN'T really, kind of a little, maybe… work with Lync. Huh?
Let me explain. The Windows XP OS has been dropped from backward supportability with Lync Server 2013. Microsoft knew long ago that XP was going to be killed off during the reign of Lync Server 2013, so they are essentially forcing your hand to upgrade. It's a fair hand to play, in defense of Microsoft: at some point we have to move on and put aging systems to rest to focus on improving existing and future releases. So don't look at this as a strong-arm play by Microsoft; it's simply evolution.

If you are considering the move to Lync Server 2013, understand that any pockets of existing XP machines need to be upgraded to at least Windows 7 for the Lync 2013 client to install. If you do not upgrade, your users will be left with the Lync 2010 or OCS 2007 R2 (MOC) client, and that's not cool.
Start reviewing Client Interoperability and Support here.

Keeping the Lync 2010 client in your environment because of XP is not ideal. It works and it's supported, but it's just not perfect. Expect to find feature and functionality caveats and shortcomings, plus multiple support streams and image packages. Yuck!

If you are upgrading from the OCS 2007 R2 platform to Lync Server 2013, another knock against replacing the MOC client with the Lync 2010 client just to justify retaining the XP OS is user adoption. If you introduce Lync 2010, then plan to introduce Lync 2013 (or maybe even the next rev of the Lync client) over an accelerated timeline to get your OSes upgraded, you essentially press change upon your users more times than needed. Change would happen like this for your users:
1.) Introduce new Lync 2010 Client
2.) Introduce new OS
3.) Introduce new Lync 2013 Client
Simply put, this is not ideal.

If you hit the OS upgrade button now, change would look like this:
1.) Introduce new OS and Lync 2013 Client at the same time during the same roll out of a single package.
This strategy has much less of an impact on your sensitive user base.

The MOC client, however, is much, much different.
First and foremost, you can't join a Lync conference using MOC. All you get with MOC is IM and presence, so that is an incredibly big disadvantage of using the MOC client as a stopgap.

Second, the MOC client does not support DNS load balancing as the Lync clients do. This could have an impact as well if you feel your users need HA. If you keep the MOC client on the desktops and move to a Lync Server 2013 back end, you will need to configure or purchase a hardware load balancer (HLB) to maintain SIP communication HA, no exceptions. All of this JUST for IM&P?
If you move to the Lync client immediately, you can take advantage of the DNSLB mechanism built into the Lync client to maintain SIP communication HA. Keep in mind, however, that an HLB is still required for load balancing the web communications required by Lync, but the sizing of the HLB can be dramatically reduced.
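
To make the DNSLB mechanism concrete, here is a minimal sketch of the idea (the pool FQDN is hypothetical, and the real Lync client logic is far more involved): the pool name resolves to one A record per Front End server, and the client simply tries each address until one answers, so no HLB is needed for SIP traffic:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

class DnsLoadBalancingSketch
{
    static void Main()
    {
        const string poolFqdn = "lyncpool.contoso.com"; // hypothetical pool name
        const int sipTlsPort = 5061;                    // standard SIP/TLS port

        // The pool FQDN resolves to one A record per Front End server.
        foreach (IPAddress address in Dns.GetHostAddresses(poolFqdn))
        {
            try
            {
                using (var client = new TcpClient())
                {
                    client.Connect(address, sipTlsPort);
                    Console.WriteLine("Connected to Front End {0}", address);
                    return; // first reachable server wins
                }
            }
            catch (SocketException)
            {
                // This Front End is down; fall through to the next A record.
            }
        }

        Console.WriteLine("No Front End servers reachable.");
    }
}
```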

So, the moral of the story: please look to upgrade as soon as possible. Your organization is only limiting itself by trying to squeeze every last breath from XP. The OS is dead; time to move on and allow the grieving process to run its course.

How Windows Azure delivers the Olympics

NBC and Microsoft recently publicized that they are streaming every event of the 2014 Winter Olympics to any iOS, Android, or Windows device using Windows Azure Media Services. What is Windows Azure Media Services (WAMS), and how does it work?

WAMS is a cloud-optimized edition of the Microsoft Media Platform (MMP), which handles a variety of tasks such as format conversion, media encryption, and analytics, along with on-demand and live streaming capabilities. The Microsoft Media Platform is traditionally confined to a server farm, but by leveraging Windows Azure, WAMS has nearly limitless compute and streaming capabilities.

When considering infrastructure, it is important to consider which configuration we will be using. There are two options: Infrastructure as a Service and Platform as a Service.

  • Infrastructure as a Service (IaaS). Using this method we must set up and configure virtual machines (VMs) to connect to our WAMS setup. To utilize IaaS auto-scaling we must create additional VMs to handle requests when demand is high. This means we must forecast an approximate number of active streaming requests, create the right number of VMs to handle them, and turn on the auto-scale feature to utilize the dormant, yet pre-configured, VMs.
  • Platform as a Service (PaaS). Using PaaS there is no extensive VM configuration. After deploying your cloud service and configuring IIS once, you can depend on Azure to auto-scale your cloud service automatically without having to configure additional VMs for a “just in case” scenario. There is no need to forecast the number of concurrent requests at any given time. As long as IIS is set up once to serve on-demand and live streaming media correctly, it is set up for your cloud application no matter how great the demand. Essentially, by giving up some control over configuration we can save a lot of work. This is the method most likely being used to deliver the Olympics.

The setup for live streaming and on-demand will slightly differ in how they are captured and consumed by the public.

  • The live streaming setup involves the footage being captured, encoded, and then sent to web roles in Azure (typically referred to as “ingest” servers). This can work with a single web role, but for redundancy additional web roles can be used; the additional web roles can consume the data as long as they are at a different DNS address. In this situation multiple web roles are probably used for worldwide redundancy. As the data is pushed to the cloud, content delivery web roles begin to pull the data and push it to the requesting parties.
  • On-demand streaming does not require the high-speed capture and encoding of live footage, but it does require an enormous amount of storage capacity. Every event during the Olympics will be available for on-demand streaming, which means every event must be captured and stored in Azure blob storage. Every event is being captured in full HD (1920 x 1080 resolution); you can imagine this amounts to a substantial amount of data, probably several terabytes. While the live streaming web roles need to pull the streaming encoded content, the on-demand web roles need to stream the stored media files. Sending a full HD stream to a device such as a cell phone with limited bandwidth is not the most efficient distribution process, so Azure utilizes a technology called Smooth Streaming.

Smooth Streaming is an adaptive content delivery technology that adjusts the stream sent to the requester based on their bandwidth. It is being utilized for both on-demand and live streaming events. In order to deliver content at a consistent frame rate, free of lag or pixelation, the video is broken up into small fragments. As the fragments are delivered and played, the time it took to play each fragment, as well as the playback quality, is sent back to Azure. If the quality or playback time does not meet standards set on the server, the next fragment is sent at a lower quality and the process repeats. If bandwidth increases, a higher-quality version of the next fragment is sent. As you can imagine, this means every Olympic event needs to be stored in full HD and in several tiers of lower-quality fragments to deliver content to every type of device over any kind of bandwidth.
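
As a rough sketch of the adaptive logic just described (the quality tiers and thresholds here are invented for illustration; Smooth Streaming's actual heuristics differ), the client-side feedback loop looks something like this:

```csharp
using System;

class AdaptiveStreamingSketch
{
    // Available encodings of each fragment, highest quality first (kbps).
    static readonly int[] QualityTiers = { 6000, 3000, 1500, 800, 350 };

    static int currentTier = 2; // start mid-quality until bandwidth is known

    // Called after each fragment plays: step quality down when the fragment
    // took longer to download than to play, step up when there is headroom.
    static void OnFragmentPlayed(TimeSpan downloadTime, TimeSpan playbackTime)
    {
        if (downloadTime > playbackTime && currentTier < QualityTiers.Length - 1)
        {
            currentTier++; // falling behind: request a lower bitrate next
        }
        else if (downloadTime.Ticks < playbackTime.Ticks / 2 && currentTier > 0)
        {
            currentTier--; // plenty of headroom: request a higher bitrate next
        }

        Console.WriteLine("Next fragment at {0} kbps", QualityTiers[currentTier]);
    }

    static void Main()
    {
        // Simulate a session where bandwidth drops, then recovers.
        OnFragmentPlayed(TimeSpan.FromSeconds(3.1), TimeSpan.FromSeconds(2.0));
        OnFragmentPlayed(TimeSpan.FromSeconds(0.6), TimeSpan.FromSeconds(2.0));
    }
}
```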

The Olympics is no doubt one of the most watched events of the year. By utilizing dozens of Azure data centers capturing, replicating, and delivering content all over the world, Microsoft is once again showing the power of what can be accomplished using Windows Azure. Microsoft began streaming the Olympics in 2008 and has since quietly become a media streaming powerhouse with the ability to deliver content to millions at a moment's notice.

Windows Azure: PaaS changing the landscape of online gaming

Titanfall is a new blockbuster game for the Xbox One. It is being published by Electronic Arts and is due to be released in March 2014. Titanfall is a first-person shooter that will have much of its AI hosting, physics calculations, online matchmaking, and multi-player dedicated servers hosted in Windows Azure. This means several things:

  1. Azure's IaaS provides dedicated servers for multi-player games, providing near-infinite bandwidth with low server pings and anti-cheat enabled
  2. Azure’s PaaS is being utilized to provide physics calculations and specialized AI to learn your style of play
  3. PaaS and dedicated servers auto scale to provide fast dynamic content to players around the world on a consistent scale

Multi-player infrastructure background

Traditionally, multi-player games have been played using a client/server paradigm. This paradigm generally involves a computer acting as a dedicated server for the game. This dedicated server accepts connections from a specific number of players and handles communication between the clients/players. The server normally does not perform any game-relevant calculations but acts as a central repository where players send update information, which is then distributed to and consumed by every client.
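
As a minimal sketch of that central-repository role (the types and details are illustrative, not from any actual game server), a dedicated relay server boils down to this:

```csharp
using System.Collections.Generic;
using System.Net.Sockets;

// The server runs no game logic; it only rebroadcasts each player's state
// update (position, orientation, and so on) to every other connected player.
class DedicatedRelayServer
{
    private readonly List<TcpClient> players = new List<TcpClient>();

    public void Accept(TcpClient player)
    {
        players.Add(player);
    }

    // Forward one player's update to every other connected client.
    public void Relay(TcpClient sender, byte[] update)
    {
        foreach (TcpClient player in players)
        {
            if (player != sender && player.Connected)
            {
                player.GetStream().Write(update, 0, update.Length);
            }
        }
    }
}
```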

Recently the game development community has moved away from the dedicated server model due to operational cost and replaced it with a player-host model. The player-host model essentially means that one player hosts the game and every other player connects to the host. This paradigm has several disadvantages for networked multi-player gaming but was implemented to save the cost of running dedicated servers as game hosts. A few of the obvious disadvantages of the player-host model are:

  1. Inconsistent bandwidth and server lag of the player chosen to be the host
  2. No anti-cheat enabled on host
  3. Slower updates / increased lag due to server not being dedicated
  4. Local player receives faster updates than other players

How Azure fixes this

Depending on a cloud infrastructure for a fast-paced reactionary game is a significant leap of faith. Video games generally run in a continuous loop created by the game engine to repeatedly update all of the game data (AI, particles, physics, player movement, event handling, etc.) and then draw that data to the screen. It takes a substantial amount of CPU and GPU power to calculate and render all of the in-game objects at speeds necessary to achieve the target of 60 frames per second.

The developer of Titanfall, Respawn Entertainment, is utilizing Azure PaaS to handle several expensive calculations normally performed by the local host (console or PC). These calculations are typically done on the local host so the player experiences minimal lag. With these calculations offloaded to the cloud without affecting gameplay, developers can dedicate the Xbox One hardware to handling more graphically intense environments. This strategy could also extend the life of the Xbox One further into the future.

Cloud computing services such as Azure have allowed dedicated servers to once again be economical. With automatic server scaling and incredibly cheap virtual machine costs, the server cost and total hours of manpower have been significantly reduced. The more calculations that are performed in the cloud, the more you can do with the hardware available. Another way to look at it: the more calculations you can do in the cloud, the lower the hardware entry point for other platforms. If a developer is able to process 90% of intense calculations on an Azure compute cluster, then the hardware needed to play the game can be anything from a tablet to a workstation. This has the opportunity to increase the install base substantially.

Games are real-time applications that depend on milliseconds and timing. Azure is effectively performing calculations for a real-time application and delivering results to multiple parties simultaneously. If the Titanfall launch performs well, expect hundreds of future Xbox One games to utilize Windows Azure, making the cloud (and Azure) a dominant force in multi-player gaming for years to come.

Windows Server 2012 R2 Hyper-V – Overview of Generation 2 VMs

With the release of Windows Server 2012 R2 come many great new features, including an improved virtual machine type named Generation 2.

Generation 2 virtual machines provide quite a few enhancements across the spectrum of Hyper-V VM technology. Perhaps most notable is the removal of legacy emulated hardware. Removal of the legacy network adapter, IDE controller, floppy controller, serial controller (COM ports), and PCI bus results in a more efficient VM. You should see faster boot times and quicker installations from .iso. How does a VM boot without these integral components? Where necessary, they have been replaced with software-based versions.

Other enhancements include:

  • Replaced BIOS with UEFI (Unified Extensible Firmware Interface)
    • Faster boot times
    • Support for boot volumes up to 64 TB (uses GPT instead of MBR)
  • Enhanced Security
    • Smaller attack surface
    • Secure Boot – Prevents unauthorized firmware, drivers and OS from running during boot.
  • Expansion of data and boot disks while VM is running. Nice!
  • Complete reliance on the VHDX file format, resulting in much better performance (VHDs are no longer supported).
  • Enhanced Session Mode
    • This allows device redirection and the ability to control display configuration when connected via the Virtual Machine Connection tool.


Windows Azure: What is Platform as a Service?

What is Platform as a Service (PaaS)? How does it differ from Infrastructure as a Service (IaaS)?

Let's start with IaaS. When “The Cloud” first became popular, IaaS was the target. The point of IaaS is to migrate a company data center into Windows Azure. This involves converting whatever physical servers you have to Hyper-V virtual machines and uploading the contents to Azure (or sending them via FedEx for Microsoft to upload). While this process is cumbersome and time-consuming, it does work and has some great benefits. The next step in the process is to create a secured link between your former data center and the Azure data center. Doing this will allow your users connectivity to all of their pre-existing applications. In fact, your users should not even notice that the data center has been moved; everything should operate exactly as it did prior to the migration. Depending on the size of your infrastructure, IaaS has the ability to save you time, space, and money, and will allow IT admins to breathe easy knowing they don't have to be overly concerned with server hardware failure. From that perspective, IaaS saves you money because you have zero server maintenance and replacement costs. Another benefit of IaaS is that all of your existing third-party software is guaranteed to work. If a business has an old Citrix application that is critical for the accounting team, migrating to IaaS guarantees that application still works as expected. The problem with IaaS is that you are still running an entire infrastructure: IT still has to manage the servers, Active Directory, patches, and updates. In short, you aren't getting many of the major benefits of the cloud.

PaaS aims to solve that.

PaaS in Azure is synonymous with “Cloud Service”. The target of PaaS is custom business applications that will replace your existing third-party software infrastructure. These applications can be written in either .NET or Java. In PaaS you essentially rent the hardware the application runs on. Renting the hardware means you do not have administrative access to the virtual machines powering the application. The permissions are set so that Remote Desktop to the virtual machine is enabled and users can administer IIS.

What is the advantage of renting the hardware the application runs on? Why wouldn't I simply use IaaS to retain tighter control over the environment?

Azure manages the infrastructure powering the application, so IT administrators no longer have to worry about it. The infrastructure used to power a global application with millions of users generating massive amounts of traffic (like SnapChat) is enormous, and the work and time it takes to manage that infrastructure is several times greater. With PaaS you are able to eliminate management of the environment the app runs on. It is also cheaper to run an application with PaaS than by creating dozens upon dozens of virtual machines. Scaling is much easier with PaaS: it is possible to scale to a massive size very quickly (and automatically). To do the same with IaaS, virtual machines must be created in geographically relevant locations.

PaaS provides most of the flexibility that IaaS provides. By sacrificing a little control, administrators and developers are able to automate most of the infrastructure maintenance that comes with growing web applications. This not only saves money but allows developers to create new features without having to worry whether the infrastructure can handle them. Whether the application is of massive global scale like SnapChat or simply a small application that applies business logic for users, PaaS on Azure maximizes productivity by removing laborious infrastructure maintenance.

Create cross platform apps in C# with Xamarin

Xamarin and Microsoft have teamed up to make all other development platforms irrelevant. Xamarin is the creator of popular cross-platform development tools that allow developers to create iOS, Android, and Windows applications all in C#. With the launch of Visual Studio 2013, Xamarin and Microsoft announced a partnership that will significantly improve the experience of developing, maintaining, and updating apps written for any of the major platforms (iOS, Android, Windows).

Some of the highlights of this partnership include Portable Class Libraries, Visual Studio integration, Azure Mobile Services integration, and licensing discounts with free training for all MSDN subscribers.

Portable Class Libraries (PCLs) are libraries of code that can be used in any of your projects. PCLs have made cross-platform development easier than ever before. By using PCLs you can keep platform-specific code within the respective platform projects and keep the bulk of your logic within the PCL. Using this method will speed up development, code maintenance, and bug fixing considerably.
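
As a small illustration of the PCL pattern (the class and namespace are hypothetical), the shared logic lives once in the PCL while each platform project only supplies its UI glue:

```csharp
// Portable Class Library project: pure logic, no platform APIs, so the
// same assembly is referenced by the iOS, Android, and Windows projects.
namespace MyApp.Core
{
    public static class TipCalculator
    {
        public static decimal Total(decimal bill, decimal tipPercent)
        {
            return bill + bill * tipPercent / 100m;
        }
    }
}

// Each platform project then contains only UI glue, for example:
//   var total = MyApp.Core.TipCalculator.Total(42.50m, 15m);
```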

Prior to the Visual Studio 2013 partnership, Xamarin came with its own cross-platform development environment, Xamarin Studio. While very functional, it was no Visual Studio, and developers unfamiliar with Xamarin Studio had to take the time to learn the tools available to them. Now, with full Visual Studio integration, developers can continue to use the tools they are already comfortable with, as well as the powerful Azure utilities, when developing apps that require mobile services.

Windows Azure has become one of Microsoft's fastest growing platforms. It has been experiencing 100% year-over-year growth, and Microsoft just announced it has been gaining 1,000 new customers per day! Microsoft has built project templates specifically for Xamarin iOS and Xamarin Android apps, so developers can simply download templates with sample code prepopulated and already making API calls to Azure! Creating mobile services has never been easier. For more information on this process, please visit this link.
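
As a rough sketch of the kind of call those templates wire up (the service URL, application key, and TodoItem type are placeholders, not from the post), inserting a record through Azure Mobile Services from shared C# looks roughly like this:

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.MobileServices;

public class TodoItem
{
    public string Id { get; set; }
    public string Text { get; set; }
}

public static class TodoService
{
    // Placeholder service URL and application key.
    private static readonly MobileServiceClient Client =
        new MobileServiceClient("https://your-service.azure-mobile.net/", "YOUR-APP-KEY");

    // Insert a record; this same shared C# runs in the iOS, Android,
    // and Windows projects alike.
    public static Task SaveAsync(string text)
    {
        return Client.GetTable<TodoItem>().InsertAsync(new TodoItem { Text = text });
    }
}
```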

The final point is one I'm considerably excited about. Along with the Microsoft partnership, Xamarin also introduced Xamarin University. For .NET developers who would like to learn more about mobile development, Xamarin University is a great place to look. It provides live online classes, tutorials, labs, and a certification exam. If you are an MSDN subscriber, you have access to Xamarin University for free, a value of over $1400! So sign up while there is still space. Class starts January 20th!