by February 26th, 2014
The Perficient Sitecore team has been writing feverishly over the past few months to publish several new guides. Most recently, Mark Servais, a Sr. Technical Consultant within Perficient’s Sitecore practice and Sitecore MVP, authored a new paper, Four Ways to Successfully Integrate Your Product Catalog with Sitecore.
Sitecore is a content management platform that is very flexible in its ability to integrate data and extend functionality. This guide explains several different approaches and best practices for integrating products into Sitecore, depending on your product catalog scenario. Of course, every integration is going to be unique: each company has distinct practices, diverse systems housing the product data, and various criteria surrounding the interaction with that data. Mark points out that you'll want to take sales, pricing, attributes, and regional offerings into account as applicable.
Common scenarios include:
- The product catalog exists in Sitecore exclusively for the enterprise
- The product catalog exists outside of Sitecore and is managed outside of Sitecore
- The product catalog exists outside of Sitecore and is managed in multiple systems including Sitecore
- The product catalog exists outside of Sitecore and is managed in Sitecore
Mark goes into detail on each of the four ways, providing potential advantages and disadvantages depending on the scenario. He closes by recommending the use of a custom data provider to gain the most control and functionality, yet provides options if you are unable to do so. He also stresses the importance of data concurrency and consistency when thinking about how your content editors will interact with the product data.
You can download the full guide here.
And if you missed the other Sitecore guides, they are available for download as well.
by February 23rd, 2014
Background: A few days ago, as part of my .NET learning effort, I had to deliver a task matching the screen below.
When the end user types the username, the application should validate it against the database and display his or her training details in the two dropdown boxes named TopicNames and LevelName. The application should then populate the four text boxes based on the selections made in the two combo boxes, and the data in the text boxes should refresh without refreshing the entire page whenever the user changes a selection. I had several difficulties accomplishing this task, as I was new to the technology, so I wanted to share my lessons learned in the hope that they help beginners, or anyone not yet familiar with jQuery, avoid redundant effort.
- Visual Studio 2012
- Database (I used MySQL)
- ADO.NET Entity Framework
- jQuery, used to call the server without refreshing the page
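The jQuery wiring for this kind of cascading refresh can be sketched as below. The element IDs, the text box names, and the /Training/GetDetails endpoint are my own assumptions for illustration, not taken from the original post:

```javascript
// Pure helper: build the query string for the selected topic and level.
function buildDetailsQuery(topic, level) {
  return '/Training/GetDetails?topic=' + encodeURIComponent(topic) +
         '&level=' + encodeURIComponent(level);
}

// Called whenever either combo box changes; refreshes the four text
// boxes from the server's JSON response without reloading the page.
function refreshDetails() {
  $.getJSON(buildDetailsQuery($('#TopicNames').val(), $('#LevelName').val()),
    function (d) {
      $('#StartDate').val(d.startDate);
      $('#EndDate').val(d.endDate);
      $('#Trainer').val(d.trainer);
      $('#Status').val(d.status);
    });
}

// Wire up both dropdowns (runs only in the browser, where jQuery exists).
if (typeof $ !== 'undefined') {
  $('#TopicNames, #LevelName').on('change', refreshDetails);
}
```

The server side is just an MVC action returning JSON; the key point is that only the four text boxes repaint, not the page.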
Read the rest of this post »
by February 17th, 2014
In my previous post I discussed how Microsoft and NBC were streaming every single event live and on demand at the Sochi Olympic Games. Azure makes publishing and streaming videos easier than ever before. This post will walk you through creating a media service, uploading content (video or audio), encoding it and publishing it for consumption. We will do this all using the Azure management portal.
- To start, log into Azure and go to the portal (if you don't have an Azure account, you can get one for free at http://windows.azure.com). Select 'Media Services' on the left-hand navigation bar, then select 'New'.
- Fill in the requested information. Note that in order to create your Media Service, you must use an existing storage account or create a new one. In this example we will create a new storage account, which will hold all of the media that we would like to stream from Azure.
- After your Media Service is created, your dashboard should look similar to this. Our next step is to upload some content. Click the 'Upload' button.
- You can select content to stream in two ways: you can upload content stored locally on your computer, or you can "upload" content to Media Services that is already located in Azure blob storage. The content can be located in any storage account you have access to in Azure.
Read the rest of this post »
by February 17th, 2014
Over the last few years Microsoft unleashed two new web development frameworks: Web API and SignalR, both suitable for asynchronous communication between web client and web server. And, of course, we still have MVC controller actions, which can also be used for asynchronous communication and can accept and return JSON objects. So, what's the difference between these three frameworks, and what are the best patterns and practices for using them?
1. As MVC is built on top of ASP.NET, it inherits ASP.NET paradigms like session support. The HTTP protocol is stateless by definition, yet ASP.NET supports user session state. Being stateful also means thread affinity.
2. Web API looks very similar to MVC: there are controllers, routes and filters. However, Web API traces its roots to a different source: WCF. Because of that, Web API doesn't have a dependency on ASP.NET and could potentially be hosted on a web server other than IIS, or self-hosted in an application. Web API is stateless and asynchronous (Task<T> can be used as a return type for actions), and there is no thread affinity. Web API is very aware of HTTP verbs (GET, PUT, DELETE, etc.) and so it's completely RESTful. In fact, the default routing setup for Web API doesn't include the action in the route.
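A minimal controller sketch illustrates the verb-based dispatch. The class, the Product type, and the _repository field are illustrative assumptions; the point is that no action name appears in the route:

```csharp
// Under the default route template api/{controller}/{id}, the HTTP verb
// of the request (not the URL) selects which method runs.
public class ProductsController : ApiController
{
    private readonly IProductRepository _repository; // hypothetical repository

    // GET api/products/5
    public async Task<Product> Get(int id)
    {
        return await _repository.FindAsync(id); // stateless, no thread affinity
    }

    // DELETE api/products/5
    public async Task<IHttpActionResult> Delete(int id)
    {
        await _repository.RemoveAsync(id);
        return Ok();
    }
}
```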
by February 12th, 2014
SiteMinder is an enterprise-class secure single sign-on solution from CA (Computer Associates) employed by many large companies to secure intranet access and provide single sign-on functionality across various intranet applications. SiteMinder has broad support for different application frameworks, which makes it possible to use in a heterogeneous enterprise environment.
For example, when SiteMinder is used to secure an ASP.NET/IIS application, it's normally configured as an IIS handler (in web.config):
<add name="handler-wa-32" path="*" verb="*" modules="IsapiModule" scriptProcessor="C:\Program Files\CA\webagent\win32\bin\ISAPI6WebAgent.dll" resourceType="Unspecified" requireAccess="None" preCondition="classicMode,bitness32" />
The SiteMinder module intercepts every request to an ASP.NET application resource, authenticating and authorizing the user. If the user is authenticated and authorized successfully, SiteMinder passes the request further down the pipeline to ASP.NET.
So, how do you integrate SiteMinder authentication with ASP.NET MVC authentication? SiteMinder does a great job of handling it on its own, but quite often an MVC application will need to do its own custom authorization in order to grant or deny user access to different resources depending on the user's role.
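One common pattern is a custom authorization filter that trusts the identity SiteMinder injects into the request (SiteMinder passes the authenticated login in the SM_USER request header) and then applies the application's own role check. The UserRoles helper below is hypothetical; this is a sketch, not the original post's implementation:

```csharp
// Custom MVC authorization attribute layered on top of SiteMinder.
public class SiteMinderAuthorizeAttribute : AuthorizeAttribute
{
    protected override bool AuthorizeCore(HttpContextBase httpContext)
    {
        // SiteMinder has already authenticated the user by the time the
        // request reaches ASP.NET; the login arrives as a header.
        var user = httpContext.Request.Headers["SM_USER"];
        if (string.IsNullOrEmpty(user))
            return false; // request did not come through SiteMinder

        // Application-specific authorization against your own user store
        // (hypothetical helper).
        return UserRoles.IsInRole(user, "ContentEditor");
    }
}
```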
by February 11th, 2014
If your web application is built on the ASP.NET MVC stack and requires user authentication and authorization to access certain parts of the application (or the application as a whole), then chances are you're using the [Authorize] controller attribute. This attribute can be applied to a controller as a whole or to any of the controller's actions, and it acts as a request pre-filter, checking whether the user is authorized and, if not, directing the user to the login page.
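Typical usage looks like this (the controller and role names are examples): applied at the class level the attribute protects every action, and an action-level attribute can tighten the requirement further.

```csharp
[Authorize] // every action requires an authenticated user
public class ReportsController : Controller
{
    public ActionResult Index()
    {
        return View();
    }

    [Authorize(Roles = "Admin")] // this action additionally requires the Admin role
    public ActionResult Delete(int id)
    {
        _reports.Remove(id); // hypothetical repository
        return RedirectToAction("Index");
    }
}
```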
Read the rest of this post »
by February 10th, 2014
NBC and Microsoft recently announced they are streaming every event of the 2014 Winter Olympics to any iOS, Android, or Windows device using Windows Azure Media Services. What is Windows Azure Media Services (WAMS) and how does it work?
WAMS is a cloud-optimized edition of the Microsoft Media Platform (MMP), which handles a variety of tasks such as format conversion, media encryption, and analytics, along with on-demand and live streaming capabilities. The Microsoft Media Platform is traditionally confined to a server farm, but by leveraging Windows Azure, WAMS has nearly limitless compute and streaming capacity.
When planning the infrastructure it is important to consider which configuration to use. There are two options: Infrastructure as a Service and Platform as a Service.
- Infrastructure as a Service (IaaS). Using this method we must set up and configure virtual machines (VMs) to connect to our WAMS setup. To utilize IaaS auto-scaling we must create additional VMs to handle requests when demand is high. This means we must forecast an approximate number of active streaming requests, create the right number of VMs to handle them, and turn on the auto-scale feature to utilize the dormant, yet pre-configured, VMs.
- Platform as a Service (PaaS). With PaaS there is no extensive VM configuration. After deploying your cloud service and configuring IIS once, you can depend on Azure to auto-scale your cloud service automatically, without having to configure additional VMs for a "just in case" scenario. There is no need to forecast the number of concurrent requests at any given time: as long as IIS is set up correctly once to provide on-demand and live streaming media, it is set up for your cloud application no matter how great the demand. Essentially, by giving up some control over configuration we save a lot of work. This is the method most likely being used to deliver the Olympics.
The setups for live and on-demand streaming differ slightly in how content is captured and consumed by the public.
- The live streaming setup involves the footage being captured, encoded, and then sent to web roles in Azure (typically referred to as "ingest" servers). This can work with a single web role, but for redundancy additional web roles can be used; they can consume the data as long as each is at a different DNS address. In this situation multiple web roles are probably used for worldwide redundancy. As the data is pushed to the cloud, content delivery web roles begin to pull the data and push it to the requesting parties.
- On-demand streaming does not require the high-speed capture and encoding of live footage, but it does require an enormous amount of storage capacity. Every event during the Olympics will be available for on-demand streaming, which means every event must be captured and stored in Azure blob storage. Every event is being captured in full HD (1920 x 1080 resolution); you can imagine this amounts to a substantial amount of data, probably several terabytes. While the live streaming web roles need to pull the streamed encoded content, the on-demand web roles need to stream the stored media files. Sending a full HD stream to a device such as a cell phone with limited bandwidth is not the most efficient distribution process, so Azure utilizes a technology called Smooth Streaming.
Smooth Streaming is a dynamic content delivery technology that adapts the stream sent to the requester based on their bandwidth. It is used for both on-demand and live streaming events. In order to deliver content at a consistent frame rate, free of lag or pixelation, the video is broken up into small fragments. As the fragments are delivered and played, the time it took to play each fragment, as well as the playback quality, is sent back to Azure. If the quality or playback time does not meet the standards set on the server, the next fragment is sent at a lower quality and the process repeats. If bandwidth increases, a higher-quality version of the next fragment is sent. As you can imagine, this means every Olympic event needs to be stored in full HD and in several tiers of lower-quality fragments to deliver content to every type of device over any kind of bandwidth.
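The per-fragment feedback loop described above boils down to a simple rule: step down a quality tier when the last fragment struggled, step up when there was comfortable headroom. A sketch, with made-up tier values and thresholds purely for illustration:

```javascript
// Available quality tiers in kbps, highest first (illustrative values).
const TIERS = [6000, 3000, 1500, 800, 400];

// Pick the tier for the next fragment based on how the last one played:
// degrade when frames were dropped or the fragment took longer to
// download than to play; upgrade when it downloaded in under half its
// playback time; otherwise hold steady.
function nextTier(currentTier, lastFragment) {
  const i = TIERS.indexOf(currentTier);
  if (lastFragment.droppedFrames > 0 ||
      lastFragment.downloadMs > lastFragment.durationMs) {
    return TIERS[Math.min(i + 1, TIERS.length - 1)]; // degrade quality
  }
  if (lastFragment.downloadMs < lastFragment.durationMs / 2) {
    return TIERS[Math.max(i - 1, 0)];                // room to improve
  }
  return currentTier;                                // hold steady
}
```

The real heuristics are more sophisticated, but this captures why each event must be stored pre-encoded at every tier: the server can only switch quality between fragments that already exist.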
The Olympics is no doubt one of the most-watched events of the year. By utilizing dozens of Azure data centers to capture, replicate, and deliver content all over the world, Microsoft is once again showing the power of what can be accomplished using Windows Azure. Microsoft began streaming the Olympics in 2008 and has since quietly become a media streaming powerhouse with the ability to deliver content to millions at a moment's notice.
by February 7th, 2014
This blog post is the third and final in a series about the MVC anti-forgery (CSRF) token.
As we discussed earlier, MVC has great built-in functionality for securing form posts with anti-forgery tokens, and it's even possible to make it work across multiple web applications.
However, modern web applications these days tend to have more asynchronous (AJAX) communication between the browser and the web server than traditional HTML form posts, where the whole page is reloaded. The question is: can the built-in MVC components be used for CSRF validation when browser code uses AJAX to post to the server?
Obviously, they can't be used directly, because @Html.AntiForgeryToken only works when it's placed inside an HTML form and that form is submitted to the server. In the case of an AJAX post there is no form, so the AJAX controller method will not receive a form CSRF token (the cookie token, though, will flow with the AJAX post normally). However, we can make it work with a little extra coding…
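The client-side half of that extra coding is straightforward: let @Html.AntiForgeryToken() render its hidden input somewhere on the page, read the token out of it, and send it with every AJAX post. The __RequestVerificationToken field name is the one MVC renders; the container ID and the custom header name below are my own conventions for this sketch:

```javascript
// Pure helper: pull the token value out of the rendered hidden input,
// e.g. <input name="__RequestVerificationToken" type="hidden" value="..." />
function extractToken(html) {
  var m = /name="__RequestVerificationToken"[^>]*value="([^"]+)"/.exec(html);
  return m ? m[1] : null;
}

// Post JSON to the server, carrying the form token in a custom header;
// the cookie token travels with the request automatically.
function ajaxPost(url, data) {
  return $.ajax({
    url: url,
    type: 'POST',
    data: JSON.stringify(data),
    contentType: 'application/json',
    headers: {
      'X-RequestVerificationToken': extractToken($('#antiForgery').html())
    }
  });
}
```

On the server, a custom filter then validates the header token against the cookie token instead of looking for a form field.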
Read the rest of this post »
by February 6th, 2014
In the previous installment of this series I talked about the CSRF attack and how to prevent it using ASP.NET MVC's built-in components. Today I want to dive deeper into the framework code and show you what's under the hood of the anti-forgery token implementation in MVC.
Some time ago Microsoft took a huge step forward and open-sourced the complete ASP.NET MVC and Web API stack. Now developers can see what actually happens inside the framework and don't have to rely solely on Microsoft documentation. The source code for the MVC stack is located at http://aspnetwebstack.codeplex.com/.
As you recall, there are two components that provide CSRF protection when used together: the AntiForgeryToken method of the Html helper (@Html.AntiForgeryToken()), which should be called from inside the HTML form in a Razor view, and ValidateAntiForgeryTokenAttribute ([ValidateAntiForgeryToken]), which should be applied to the controller to validate tokens. Both of these classes are actually thin wrappers on top of the AntiForgery class, a static class that encapsulates all the functionality for generating and validating tokens; its source code can be found in the same repository. This is a public class and can be used directly if someone decides to implement custom generation and validation of anti-forgery tokens. In turn, AntiForgery uses other helper classes like AntiForgeryWorker and TokenValidator; unlike AntiForgery, these classes are internal and can't be used directly by application code.
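Because AntiForgery is public, you can call it directly, for example from a custom filter that validates AJAX posts where the form token arrives in a header rather than a form field. The filter class and header name below are my own sketch; AntiForgery.Validate and AntiForgeryConfig come from System.Web.Helpers:

```csharp
// Custom filter validating the cookie/header token pair directly via
// the public AntiForgery class.
public class ValidateAjaxAntiForgeryAttribute : FilterAttribute, IAuthorizationFilter
{
    public void OnAuthorization(AuthorizationContext filterContext)
    {
        var request = filterContext.HttpContext.Request;
        var cookie = request.Cookies[AntiForgeryConfig.CookieName];
        var formToken = request.Headers["X-RequestVerificationToken"];

        // Throws HttpAntiForgeryException when the pair doesn't match.
        AntiForgery.Validate(cookie != null ? cookie.Value : null, formToken);
    }
}
```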
So, why is it important to look into the internal implementation of anti-forgery token generation and validation?
Read the rest of this post »
by February 5th, 2014
Securing your web application is now more important than ever, because security attacks are growing in number and becoming more sophisticated and frequent. One of the most common types is the Cross-Site Request Forgery (CSRF) attack, in which a malicious web site hijacks a previously authenticated user session to exploit your web site.
Consider the following example: your web site uses ASP.NET Forms Authentication. The user is authenticated on the login page, and the authenticated session is maintained using the standard .ASPXAUTH cookie. Without closing the browser window or logging off your site, the user visits a malicious site, which (using social engineering, like displaying some sort of false message to the user) can now cross-post to your site using a standard HTTP form post, and that post will bear a valid .ASPXAUTH cookie issued by your site. So, unless your web site employs some special measures, your server code will not be able to distinguish a valid post from your own web site from a post from the malicious site. Note that implementing HTTPS on every page of your site will not solve this issue, as the malicious site can post over HTTPS too. HTTPS only prevents the traffic between your web site and the web client from being hijacked and analyzed, but in a CSRF attack the attacker doesn't need to analyze your traffic; it just reuses the authenticated user session.
Read the rest of this post »