Archive for the ‘.NET’ Category

Declarative data caching in .NET business layer

One of the most effective ways to improve application performance is to implement data caching. Most applications repeatedly retrieve the same data from external sources such as a database or a web service, and if that source data never or seldom changes, the application is just wasting CPU time and I/O querying the source again and again. There are a great many ways to cache data: different software frameworks, standalone applications, and even caching appliances. But probably the most widespread and easiest option is the built-in .NET MemoryCache class, which has provided simple in-process caching since .NET 4.0.
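Before getting to the declarative part, it helps to see the class itself in action. A minimal imperative sketch of MemoryCache (the repository, key scheme, and five-minute expiration are illustrative assumptions, not from the post):

```csharp
using System;
using System.Runtime.Caching;

public class CustomerRepository
{
    public string GetCustomerName(int id)
    {
        string key = "customer-name-" + id;
        ObjectCache cache = MemoryCache.Default;

        // Return the cached copy if the source was queried recently.
        string cached = cache.Get(key) as string;
        if (cached != null)
            return cached;

        // Otherwise hit the slow external source and cache the result
        // for five minutes (an illustrative expiration policy).
        string name = LoadCustomerNameFromDatabase(id);
        cache.Set(key, name, DateTimeOffset.Now.AddMinutes(5));
        return name;
    }

    // Hypothetical data-access method standing in for a real database
    // or web service call.
    private string LoadCustomerNameFromDatabase(int id)
    {
        return "Customer #" + id;
    }
}
```

The declarative approach the post describes wraps this same check-fetch-store pattern behind an attribute so the business-layer method body stays free of caching code.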

Read the rest of this post »

Strengthen Company Culture with Yammer enhanced by HDInsight

In a world of broadband internet connections, online collaboration tools and the ability to work from almost anywhere – office culture can be difficult to sustain.  This especially holds true for people who live in large cities (where the commute can be problematic) or in harsh climates (like the never-ending winter in Chicago this year).  Yammer can help by creating remote social interactions.

Yammer is an enterprise social network that aims to connect people in the office.  A few of its features are instant messaging, user profiles, a primary news feed, interest groups, recommendations for people to follow and groups to join, and a recent activity feed.  The interface is clean and well designed.  One of the great things is that once you start using Yammer, it is really easy to keep using it.

There is one area where Yammer seems to fall short: there is no clear way to bring together people who have common interests.  The user and group recommendations Yammer makes are based only on the groups I am a part of and the people I follow.  It does not take into consideration any of the data in my user profile.

Perficient recently held a hack-a-thon where my team identified this shortcoming.  Social interaction via online collaboration tools wasn’t cutting it.  In an online culture, how can we leverage all of our tools to help facilitate more meaningful social gatherings?  The answer was to use interest data that co-workers have provided through Yammer to generate meaningful recommendations.  A Yammer profile consists of many different “interest groups”: it lists categories such as Expertise, Interests, Previous Company and Schools Attended.  All of these can be classified as conversation topics and can be used as common social interests.

This is where HDInsight, powered by Hadoop and Mahout, can help.  Mahout can consume massive quantities of information and return the logical connections represented within the data.  For additional reading about Hadoop and Mahout click here.

Using an HDInsight Hadoop cluster in coordination with the Mahout recommendation engine, we could provide meaningful recommendations to users based on their individual interests.  This wouldn’t just recommend topics a user might be interested in, but also groups they could create or join with other users based on mutual interests – similar to Facebook’s suggestions for people you may know, groups to join or pages you may like.

Creating these logical, online groups would “connect the dots” to uncover similarities between people that might otherwise remain hidden.  It could also help facilitate in-person group outings, social gatherings or simply more friends and camaraderie in the office.  Through this you are creating a more meaningful environment aided by technology.

A thriving office culture can stand out in a world where telecommuting tends to be more convenient.  This may not convince everyone to come to the office.  However, instead of viewing the commute as obligatory, implementing a solution like this can encourage more people to choose to come in for the social camaraderie.  All of this can be done for free through the Yammer API and a Windows Azure account.

Windows Azure: Retiring Windows Server 2008 and how to upgrade

Beginning on June 2, 2014, Windows Azure will retire Windows Server 2008.  This means that you will no longer be able to deploy a new Cloud Service or manage your existing services on virtual machines running Windows Server 2008.

Windows Azure currently supports four different Guest OS families:

  • GuestOS 1.x – Windows Server 2008
  • GuestOS 2.x – Windows Server 2008 R2
  • GuestOS 3.x – Windows Server 2012
  • GuestOS 4.x – Windows Server 2012 R2

If your Cloud Service has not been upgraded and is still running on Windows Server 2008, you must upgrade the servers that power your service.  How do you do that?  Isn’t the point of running a PaaS cloud service, instead of using IaaS, that the platform handles the operating system and hardware for me?  The short answer is yes, but…

PaaS will take care of much of the hardware, IIS patches and OS patches for you, but Azure will not perform an entire OS upgrade for your service unless you tell it to.  This is because incompatibilities between cloud services and operating systems are likely to arise, which would force developers to try to fix code on the fly.  That is not only bad for uptime but could also open some very serious security holes.

Thankfully, manually upgrading the server OS for your service is a thing of the past: Azure makes it easy to upgrade the guest OS for your service.  You can even keep your production service on Windows Server 2008 while upgrading your staging environment and deploying your service there.  This allows developers to fix any outstanding bugs that are introduced by the operating system upgrade.

How do you upgrade your staging environment?  It is pretty straightforward.  From the cloud service dashboard, select your staging environment and choose Configure.  At the bottom of the page, find the operating system section.  You will see drop-down menus for OS Family and OS Version.  Select the proper OS Family (in this case anything but 1.x) and OS Version.  To always have the most up-to-date OS version, select Automatic.  This ensures your cloud service will always be running on the latest Azure VM image available.  If you do not want this, select a static OS version.  This guarantees that your cloud service will keep running that OS until you upgrade it in the future.
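The same settings live in the cloud service’s service configuration file, so a deployment can pin or float the guest OS there as well (the service and role names here are illustrative):

```xml
<!-- ServiceConfiguration.cscfg: osFamily selects the Windows Server
     generation (2 = 2008 R2, 3 = 2012, 4 = 2012 R2); osVersion="*"
     keeps the guest OS on the latest available release, while a
     specific version string pins it. -->
<ServiceConfiguration serviceName="MyCloudService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration"
    osFamily="4" osVersion="*">
  <Role name="WebRole1">
    <Instances count="2" />
  </Role>
</ServiceConfiguration>
```

Changing these attributes and redeploying (or updating the configuration in the portal) is equivalent to the drop-down selections described above.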

When the service is cleared for production, it is time to configure your production environment.  Upgrading your production environment can lead to some downtime for your service, but there is a way to avoid it.  Normally you would need to configure your staging and production environments independently, but you can instead swap them using the Swap option in the portal.  This will effectively swap your staging environment into production.  The change happens within seconds, and any downtime experienced will be minimal.

After the swap you can rebuild and configure the former production environment, which is now your staging environment, to match your current production environment.

Guide to Integrating Your Product Catalog with Sitecore

The Perficient Sitecore team has been writing feverishly over the past few months to publish several new guides. Most recently, Mark Servais, a Sr. Technical Consultant within Perficient’s Sitecore practice and Sitecore MVP, authored a new paper, Four Ways to Successfully Integrate Your Product Catalog with Sitecore.

Sitecore is a content management platform that is very flexible in its ability to integrate data and extend functionality. This guide explains several different approaches and best practices for integrating products into Sitecore, depending on your product catalog scenario. Of course, every integration is going to be unique in that each company has distinct practices, diverse systems housing the product data, and various criteria surrounding interaction with that data. Mark points out that you’ll want to take sales, pricing, attributes, and regional offerings into account as applicable.

Common scenarios include:

  1. The product catalog exists in Sitecore exclusively for the enterprise
  2. The product catalog exists outside of Sitecore and is managed outside of Sitecore
  3. The product catalog exists outside of Sitecore and is managed in multiple systems including Sitecore
  4. The product catalog exists outside of Sitecore and is managed in Sitecore

Mark goes into detail on each of the four ways, providing potential advantages and disadvantages depending on the scenario. He closes by recommending the use of a custom data provider to gain the most control and functionality, yet provides options if you are unable to do so. He also stresses the importance of data concurrency and consistency when thinking about how your content editors will interact with the product data.

You can download the full guide here.


ASP.NET Cascading Dropdown Boxes and jQuery

Background: A few days back, as part of my .NET learning effort, I had to deliver a task based on the screen shown below.

When the end user types the username, the application should validate it against the database and display his or her training details in two dropdown boxes named TopicNames and LevelName. The application should then display the details in four text boxes based on the selections made in the two dropdowns. The data in the four text boxes should refresh, without reloading the entire page, whenever the user changes a selection. I ran into several difficulties accomplishing this task, as I was new to the technology.  I wanted to share my lessons learned in the hope that they help beginners, or anyone not yet familiar with jQuery, avoid redundant effort.

Project Technology

  • Visual Studio 2012
  • Database (I used MySQL)
  • ADO.NET Entity Framework
  • MVC
  • jQuery (used to call the server without refreshing the page)
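The server side of such a cascading dropdown can be sketched as an MVC action returning JSON, which the page calls with jQuery’s `$.getJSON` on the TopicNames change event to repopulate LevelName without a full page reload (the controller name, route and in-memory data below are illustrative assumptions, not the post’s actual code):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;

public class TrainingController : Controller
{
    // Illustrative in-memory data; the post uses Entity Framework over MySQL.
    private static readonly Dictionary<string, string[]> Levels =
        new Dictionary<string, string[]>
        {
            { "ASP.NET", new[] { "Beginner", "Intermediate" } },
            { "jQuery",  new[] { "Beginner" } }
        };

    // Called from the client as /Training/GetLevels?topicName=ASP.NET.
    // JsonRequestBehavior.AllowGet is required to serve JSON on a GET.
    public JsonResult GetLevels(string topicName)
    {
        string[] levels = Levels.ContainsKey(topicName)
            ? Levels[topicName]
            : new string[0];

        return Json(levels.Select(l => new { LevelName = l }),
                    JsonRequestBehavior.AllowGet);
    }
}
```

On the client, a `change` handler on the TopicNames dropdown would call this action and rebuild the LevelName options from the returned array; the same pattern serves the four text boxes.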

Read the rest of this post »

Windows Azure: How to create a streaming media service (GUI)

In my previous post I discussed how Microsoft and NBC were streaming every single event live and on demand at the Sochi Olympic Games.  Azure makes publishing and streaming videos easier than ever before.  This post will walk you through creating a media service, uploading content (video or audio), encoding it and publishing it for consumption.  We will do this all using the Azure management portal.

  • To start, log into Azure and go to the portal (if you don’t have an Azure account you can get one for free at http://windows.azure.com).  Select ‘Media Services’ in the left-hand navigation bar, then select ‘New’.
  • Fill in the requested information.  Note that in order to create your Media Service you must use an existing storage account or create a new one.  In this example we will create a new storage account.  This storage account will hold all of the media that we would like to stream from Azure.
  • After your Media Service is created, your dashboard should look similar to this.  Our next step is to upload some content.  Click the ‘Upload’ button.
  • You can select content to stream in two ways.  You can upload content you have stored locally on your computer, or you can “upload” content to Media Services that is already located in Azure blob storage.  The content can be located in any storage account you have access to in Azure.

Read the rest of this post »

MVC controller actions vs Web API vs SignalR – what to use?

Over the course of the last few years Microsoft has unleashed two new web development frameworks: Web API and SignalR, both suitable for asynchronous communication between a web client and a web server.  And, of course, we still have MVC controller actions, which can also be used for asynchronous communication and can accept and return JSON objects. So, what’s the difference between these three frameworks, and what are the best patterns and practices for using them?

1. First, MVC controller actions. The ASP.NET MVC framework is a layer on top of good old ASP.NET, and it was originally built to support a traditional synchronous web development architecture, where a controller action generates HTML in response to an HTTP request and accepts HTTP form posts when the whole page is reloaded.  However, it’s also possible to call a controller action asynchronously from JavaScript, passing a JSON object as a parameter and getting JSON in response.
As MVC is built on top of ASP.NET, it inherits ASP.NET paradigms like session support. The HTTP protocol is stateless by definition, but ASP.NET supports user session state.  Being stateful also means thread affinity.

2. Web API looks very similar to MVC: there are controllers, routes and filters. However, Web API traces its roots to a different source: WCF. Because of that, Web API does not depend on ASP.NET and could potentially be hosted on a web server other than IIS, or self-hosted in an application. Web API is stateless and asynchronous (Task&lt;T&gt; can be used as a return type for actions), and there is no thread affinity. Web API is very aware of HTTP verbs (GET, PUT, DELETE, etc.), so it is thoroughly RESTful. In fact, the default routing setup for Web API doesn’t include the action in the route.
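The contrast can be sketched in a few lines. Both fragments below return the same JSON, but the Web API version is verb-driven and naturally asynchronous (the class names, route and payload are illustrative):

```csharp
using System.Threading.Tasks;
using System.Web.Http;   // Web API
using System.Web.Mvc;    // MVC

// MVC: an explicitly named action, reached via /Orders/GetOrder?id=5,
// returning JSON from an otherwise HTML-oriented controller.
public class OrdersController : System.Web.Mvc.Controller
{
    public JsonResult GetOrder(int id)
    {
        return Json(new { Id = id, Status = "Shipped" },
                    JsonRequestBehavior.AllowGet);
    }
}

// Web API: the HTTP verb selects the action, so GET /api/ordersapi/5
// maps to Get() under the default route; Task<T> makes the action
// asynchronous with no extra ceremony.
public class OrdersApiController : ApiController
{
    public async Task<object> Get(int id)
    {
        await Task.Yield(); // stand-in for real asynchronous I/O
        return new { Id = id, Status = "Shipped" };
    }
}
```

Note how the Web API action has no name-based routing at all: the verb and the route template do the work.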

Read the rest of this post »

Integrating ASP.NET MVC authentication with SiteMinder SSO

SiteMinder is an enterprise-class secure single sign-on solution by CA (Computer Associates) which is employed by many large companies to secure their intranet access and provide single sign-on functionality to various intranet applications.  SiteMinder has broad support for different application frameworks, which makes it possible to use in a heterogeneous enterprise environment.
For example, when SiteMinder is used to secure an ASP.NET/IIS application, it’s normally configured as an IIS handler (in web.config):

<add name="handler-wa-32" path="*" verb="*" modules="IsapiModule" scriptProcessor="C:\Program Files\CA\webagent\win32\bin\ISAPI6WebAgent.dll" resourceType="Unspecified" requireAccess="None" preCondition="classicMode,bitness32" />
The SiteMinder module intercepts every request to an ASP.NET application resource, authenticating and authorizing the user. If the user is authenticated and authorized successfully, SiteMinder passes the request further down the pipeline to ASP.NET.
So, how do you integrate SiteMinder authentication with ASP.NET MVC authentication? SiteMinder does a great job of handling authentication on its own, but quite often an MVC application will need to do its own custom authorization in order to grant or deny users access to different resources depending on user role.
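One common pattern, sketched here under the assumption that the SiteMinder web agent forwards the authenticated login in the SM_USER request header (its conventional default), is a custom authorize attribute that trusts the SiteMinder identity and layers application-level role checks on top:

```csharp
using System.Web;
using System.Web.Mvc;

// Illustrative attribute: SiteMinder has already authenticated the
// request by the time it reaches ASP.NET, so we only map its user
// onto our application roles.
public class SiteMinderAuthorizeAttribute : AuthorizeAttribute
{
    protected override bool AuthorizeCore(HttpContextBase httpContext)
    {
        // SiteMinder web agents typically inject the authenticated
        // login name into the SM_USER request header.
        string user = httpContext.Request.Headers["SM_USER"];
        if (string.IsNullOrEmpty(user))
            return false;

        // No roles demanded: any authenticated user passes.
        if (string.IsNullOrEmpty(Roles))
            return true;

        return IsInRole(user, Roles);
    }

    private static bool IsInRole(string user, string roles)
    {
        // Placeholder: a real implementation would consult the
        // application's role store (database, AD groups, etc.).
        return true;
    }
}
```

Applying `[SiteMinderAuthorize(Roles = "Admin")]` to a controller or action then combines SiteMinder SSO with the application’s own authorization rules.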

Read the rest of this post »

Gracefully handle MVC login session expiration in javascript

If your web application is built on the ASP.NET MVC stack and requires user authentication and authorization to access certain parts of the application (or the application as a whole), then chances are you are using the [Authorize] controller attribute. This attribute can be applied to a controller as a whole or to any of its actions, and it acts as a request pre-filter, checking whether the user is authorized and, if not, directing the user to the login page.

The [Authorize] attribute works great for a traditional MVC application where web page content is refreshed by reloading the whole page. But let’s say you have a single-page application, or a hybrid page where parts of the page are served by JavaScript code that talks to your application controller asynchronously using AJAX. Will the [Authorize] attribute work well for such controller methods? Not so much. Let’s see what’s happening inside the [Authorize] attribute and what it returns to the client/browser…
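The usual remedy, sketched here as one possible shape of a fix (the post’s full solution is behind the link below), is to override HandleUnauthorizedRequest so AJAX callers receive a detectable 401 status instead of a silently redirected login page:

```csharp
using System.Net;
using System.Web.Mvc;

// Illustrative subclass: full-page requests still get the normal
// login redirect, while AJAX requests get a bare 401 Unauthorized
// that client-side script can watch for (e.g. to show a
// "session expired" dialog and send the user to the login page).
public class AjaxAwareAuthorizeAttribute : AuthorizeAttribute
{
    protected override void HandleUnauthorizedRequest(
        AuthorizationContext filterContext)
    {
        if (filterContext.HttpContext.Request.IsAjaxRequest())
        {
            filterContext.Result =
                new HttpStatusCodeResult(HttpStatusCode.Unauthorized);
        }
        else
        {
            base.HandleUnauthorizedRequest(filterContext);
        }
    }
}
```

A global AJAX error handler in JavaScript can then treat status 401 as “session expired” and redirect or prompt gracefully.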

Read the rest of this post »

How Windows Azure delivers the Olympics

NBC and Microsoft recently announced that they are streaming every event of the 2014 Winter Olympics to any iOS, Android, or Windows device using Windows Azure Media Services.  What is Windows Azure Media Services (WAMS), and how does it work?

WAMS is a cloud-optimized edition of the Microsoft Media Platform (MMP) which handles a variety of tasks such as format conversion, media encryption, and analytics, with on-demand and live streaming capabilities.  The Microsoft Media Platform is traditionally confined to a server farm, but by leveraging Windows Azure, WAMS has nearly limitless compute and streaming capacity.

When considering infrastructure, it is important to consider which configuration we will be using.  There are two options to consider: Infrastructure as a Service and Platform as a Service.

  • Infrastructure as a Service (IaaS).  Using this method we must set up and configure virtual machines (VMs) to connect to our WAMS setup.  To utilize IaaS auto-scaling we must create additional VMs to handle requests when demand is high.  This means we must forecast an approximate number of active streaming requests, create the right number of VMs to handle them, and turn on the auto-scale feature to utilize the dormant, yet pre-configured, VMs.
  • Platform as a Service (PaaS).  Using PaaS there is no extensive VM configuration.  After deploying your cloud service and configuring IIS once, you can depend on Azure to auto-scale your cloud service automatically without having to configure additional VMs for a “just in case” scenario.  There is no need to forecast the number of concurrent requests at any given time.  As long as IIS is set up once to serve on-demand and live streaming media correctly, it is set up for your cloud application no matter how great the demand.  Essentially, by giving up some control over configuration we save a lot of work.  This is the method most likely being used to deliver the Olympics.

The setups for live streaming and on-demand differ slightly in how footage is captured and consumed by the public.

  • The live streaming setup involves the footage being captured, encoded and then sent to web roles in Azure (typically referred to as “ingest” servers).  This can work with a single web role, but for redundancy additional web roles can be used.  The additional web roles can consume the data as long as they are at different DNS addresses.  In this situation multiple web roles are probably used for worldwide redundancy.  As the data is pushed to the cloud, content delivery web roles begin to pull the data and push it to the requesting parties.
  • On-demand streaming does not require the high-speed capture and encoding of live footage, but it does require an enormous amount of storage capacity.  Every event during the Olympics will be available for on-demand streaming, which means every event must be captured and stored in Azure blob storage.  Every event is being captured in full HD (1920 x 1080 resolution).  You can imagine this amounts to a substantial amount of data, probably several terabytes.  While the live streaming web roles pull the encoded live stream, the on-demand web roles need to stream the stored media files.  Sending a full HD stream to a device such as a cell phone with limited bandwidth is not the most efficient distribution process, so Azure utilizes a technology called Smooth Streaming.

Smooth Streaming is a dynamic content delivery technology that adapts the stream sent to the requester based on their bandwidth.  It is used for both on-demand and live streaming events.  In order to deliver content at a consistent frame rate, free of lag or pixelation, the video is broken up into small fragments.  As the fragments are delivered and played, the time it took to download each fragment, as well as the playback quality, is reported back to Azure.  If the quality or playback time does not meet the standards set on the server, the next fragment is sent at a lower quality and the process repeats.  If bandwidth improves, a higher-quality version of the next fragment is sent.  As you can imagine, this means every Olympic event needs to be stored in full HD and in several tiers of lower-quality fragments to deliver content to every type of device over any kind of bandwidth.
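The downshift/upshift loop described above can be sketched as a toy heuristic (the quality tiers and thresholds are illustrative assumptions, not Smooth Streaming internals):

```csharp
using System;

// Toy model of the adaptive heuristic: pick the next fragment's
// quality tier from how long the previous fragment took to download
// relative to its playback duration.
public class AdaptiveQualityPicker
{
    private static readonly string[] Tiers =
        { "240p", "480p", "720p", "1080p" };

    private int current = Tiers.Length - 1; // start optimistic, at full HD

    public string NextTier(double downloadSeconds, double fragmentSeconds)
    {
        if (downloadSeconds > fragmentSeconds && current > 0)
            current--;   // falling behind playback: drop one tier
        else if (downloadSeconds < fragmentSeconds / 2
                 && current < Tiers.Length - 1)
            current++;   // plenty of headroom: try one tier higher

        return Tiers[current];
    }
}
```

The real protocol makes this choice per fragment on the client from the published bitrate tiers, which is why each event must be encoded and stored at every tier in advance.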

The Olympics is no doubt one of the most watched events of the year.  By utilizing dozens of Azure data centers capturing, replicating and delivering content all over the world, Microsoft is once again showing what can be accomplished using Windows Azure.  Microsoft began streaming the Olympics in 2008 and has since quietly become a media streaming powerhouse with the ability to deliver content to millions at a moment’s notice.