Perficient Enterprise Information Solutions Blog

Archive for the ‘Microsoft’ Category

The Industrialization of Advanced Analytics

Gartner recently released its predictions on this topic in a report entitled, “Predicts 2015: A Step Change in the Industrialization of Advanced Analytics”. This has very interesting and important implications for all companies aspiring to become more of a digital business. The report states that failure to do so impacts mission-critical activities such as acquiring new customers, doing more cross-selling and predicting failures or demand.

Specifically, business, technology and BI leaders must consider:

  • Developing new use cases using data as a hypothesis generator, data-driven innovation and new approaches to governance.
  • Emergence of analytics marketplaces, which Gartner predicts will be offered in a Platform as a Service (PaaS) model by 25% of solution vendors by 2016
  • Solutions based on the following parameters: optimum scalability, ease of deployment, micro-collaboration and macro-collaboration and mechanisms for data optimization
  • Convergence of data discovery and predictive analytics tools
  • Expanding technologies advancing analytics solutions: cloud computing, parallel processing and in-memory computing
  • “Ensemble-learning” and “deep learning”. The former is defined as synergistically combining predictive models through machine-learning algorithms to derive a more valuable single output from the ensemble. In comparison, deep learning achieves higher levels of classification and prediction accuracy through the development of additional processing layers in neural networks.
  • Data lakes (raw, largely unfiltered data) vs data warehouses and solutions for enabling exploration of the former and improving business optimization for the latter
  • Tools that bring data science and analytics to “citizen data scientists”, who’ll soon outnumber skilled data scientists 5-to-1
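
To make the “ensemble-learning” bullet above concrete, here is a minimal, purely illustrative Python sketch (the three toy “models” and their thresholds are invented for the example, not drawn from the Gartner report): several weak predictors are combined by averaging into a single, often more robust, output.

```python
# Hypothetical weak "models": each maps an input score to a probability.
def model_a(x):
    return 0.9 if x > 0.5 else 0.2

def model_b(x):
    return 0.7 if x > 0.3 else 0.1

def model_c(x):
    return 0.8 if x > 0.6 else 0.3

def ensemble_predict(x, models=(model_a, model_b, model_c)):
    """Combine the member predictions by averaging into one output."""
    scores = [m(x) for m in models]
    return sum(scores) / len(scores)

# For x = 0.7 the members return 0.9, 0.7 and 0.8; the ensemble averages them.
print(round(ensemble_predict(0.7), 2))
```

Real ensembles (bagging, boosting, stacking) train the members and the combination step; the averaging here just shows the core idea of deriving one output from many models.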

Leaders in the emerging analytics marketplace include:

  • Microsoft with its Azure Machine Learning offering
    • For further info, check out: https://blogs.perficient.com/microsoft/2014/12/azure-ml-on-the-forefront-of-advanced-analytics/
  • IBM with its Bluemix offering

Finally, strategy and process improvement, while fundamental and foundational, aren’t enough. The volume and complexity of big data, along with the convergence between data science and analytics, require technology-enabled business solutions to transform companies into effective digital businesses. Perficient’s broad portfolio of services, intellectual capital and strategic vendor partnerships with emerging and leading big data, analytics and BI solution providers can help.

SSRS – Have you used it yet?

While there are several BI technologies, with more coming into the fray every day, SSRS has remained a key player in this area for quite some time now.  One of the biggest advantages of SSRS reporting is that it involves the participation of the end user and that it is very intuitive to use.

Let’s go back a few years, when Excel was the go-to tool for dashboarding.  Every time a director or VP wanted a report, he would go to his developers to extract information from the database to help him make dashboards for his meetings.  The end user had to rely on the developers to extract information and had to spend several minutes, if not hours, to make a dashboard.  This all works fine when the meeting is scheduled for a specific day of the week or month.  We all know that is a myth, and most meetings happen impromptu.  In such cases, there is not enough time to extract data and to turn that information into graphs.

This is where SSRS came in as a key player.  Built on a strong Microsoft foundation, SSRS brought some of the best and most needed features:

  • Easy connection to databases
  • User friendly interface allowing users to design reports and make changes on the fly.
  • Report generation on a button click.
  • Subscription based delivery to deliver reports on a specific day and time of the month.

While these features may not look groundbreaking at first glance, they bring a lot of value.  They save a lot of time, and in business that time translates directly into revenue.  Developers can design dashboards once and deploy them to a server.  The VP or director can press a button to get these reports on his machine.  Furthermore, the reports can be exported in several formats.  What I really like about the reports, though, is the look and feel.  Microsoft retained the aesthetics of MS Excel reports; by that I mean a pie chart in Excel and in SSRS look exactly the same.  This is a great feature for the audience, since most people do not like to see the look of their reports change over time.  Another great feature is that SSRS has fantastic security options, and one can implement role-based reporting.

In summary, SSRS is a power-packed tool, and you should reap the benefits of the great features that come with it.

For information on Microsoft’s future BI roadmap and self-service BI options, check out this post over on our Microsoft blog.

Displaying Custom Messages in SSRS.

SSRS is a powerful tool not just because it presents queries in good-looking charts but because it enhances the user experience.  The reports are so intuitive that users can navigate and export data without much training.  However, as business analysts, data analysts, or report designers, it is our responsibility to extend these usability features to our users at every step.

We know that if a SQL query returns no rows, SSRS will display an empty table to our customers. As a report designer and a user myself, an empty table would worry me and make me think the report is not pulling data correctly. Instead, I would like to see, or show, a clear message indicating why there is no data, for example: “No records found.”

Let’s see how to add custom messages in SSRS.

For illustration, I have created a dummy table that contains 3 columns:

1.) Product
2.) Product_Detail
3.) Count

cm1

In order to add a custom message:

1.) Select your table.
2.) Go to Properties.
3.) Scroll down to the No Rows Message property and type your message in the box.

cm2

You can also change the color and font in the No Rows section of the Table Properties.

This is how my report looks after I add my No Rows message:

cm3

Stay Tuned for more :)

Creating Table of Contents for SSRS reports

A table of contents has always helped readers navigate through thick volumes. This feature can be extended to our users in SSRS to navigate through several pages of reports. The table of contents in SSRS is called the document map. A document map is a clickable table of contents that takes users directly to the part of the report they want to see. For example, consider a library with hundreds of thousands of books. Books are categorized into paperback and hardcover, and further categorized into genres such as fiction, murder mystery, biographies, etc. The document map would be particularly helpful for a librarian who wants to see a list of all hardcover fiction books.

Let’s see how a document map is created and how this usability feature can be extended to our users.

For illustration, I have created a tabular report using a wizard. For those interested, this is how my table looks.

dm1

Product types here are Candles, Hand Sanitizers, and Soaps.

Product detail here is the type of fragrance, and In Store is a date field that indicates when the product arrived in the store.

When I run my report, I see that there are 20 pages of data. Let’s say I want to find the In Store data for fragrance type = “Mint”. I would have to find which product the fragrance belongs to, and in doing so I may have to go through the entire result set. Let’s create a document map and see how it can help us.

One thing we know before creating the document map is that “Mint” is a product detail; therefore, we need the document map on this field.

Go to your canvas and, under Row Groups, click on Product_Detail. Go to the Advanced tab and, under Document map, select Product_Detail from the drop-down.

dm2

dm3

Click OK and run the report.

Your report should look like the screenshot below. Clicking on any of the product types will take you to that data point.

dm4

Stay Tuned for more :)

Evaluating In-Memory DBs

This month Oracle is releasing its new in-memory database.   Essentially, it is an option that leverages and extends the existing RDBMS code base.   Now, with Microsoft’s recent entry, all four of the mega-vendors (IBM, SAP, Microsoft, and Oracle) have in-memory database products.

Which one is the best fit for a company will depend on a number of factors. If a company is happy with its present RDBMS vendor, then that standard should be evaluated first. However, if a company has more than one RDBMS vendor, or if it is looking to make a switch, a more comparative evaluation is needed. In this case companies should evaluate:

  1. Maturity of the Offering. The vendors’ products differ in their support for “traditional” RDBMS functionality like referential integrity, stored procedures, and online backups – to name a few.   Make sure you understand the vendor’s current and near-term support for the features you require.
  2. Performance. All in-memory database vendors promise, and by all accounts deliver, significantly increased performance.   However, evaluate the vendor’s ability to provide the desired level of performance on the company’s proposed query profile and the ability of the vendor’s technology to scale out. Compression and columnar storage will also affect performance, so understanding how these features support a company’s requirements is necessary.
  3. Sourcing of On-Disk Data. Probably the biggest difference in architecture and maturity between the vendors is their ability to source data from on-disk storage systems, whether files, traditional RDBMSs, or Hadoop systems.
  4. Licensing &amp; Cost Model. The costs associated with licensing and implementing a technology need to be closely evaluated. How much training is required to develop a competency with a new technology? Is the licensing model favorable to how an enterprise uses and purchases licenses?
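
The compression and columnar-storage point in the performance criterion above can be illustrated with a small, hypothetical Python sketch (not a benchmark, and not tied to any vendor’s engine): an aggregate over one column reads only that column in a columnar layout, instead of visiting every row.

```python
# Invented sample data: 1,000 sales records with three fields each.
rows = [{"id": i, "region": "US", "amount": float(i)} for i in range(1000)]

# Row-oriented layout: every record must be visited to read one field.
total_from_rows = sum(r["amount"] for r in rows)

# Column-oriented layout: the 'amount' column is one contiguous list.
# Keeping similar values together is also what makes columns compress well.
columns = {
    "id": [r["id"] for r in rows],
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}
total_from_columns = sum(columns["amount"])

# Same answer, different layout; the columnar scan touched one list of
# floats instead of 1,000 three-field records.
assert total_from_rows == total_from_columns
```

In a real in-memory columnar engine the win is much larger, because the untouched columns never enter cache at all.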

There are other evaluation areas as well. For instance, SAP’s HANA offering has a robust BI metadata layer (think Business Objects Universe) that may be of value to a number of companies.

In-memory databases are changing and evolving quickly, so make sure the appropriate due diligence is completed before investing in a selected technology.

More on the MDM platform…

Picking up from my earlier blog post, there are two kinds of MDM tool types: one targets a specific domain (Customer and Product are the most common) and the other follows a multi-domain (Customer, Product, Location, Supplier, etc., all in one) strategy. Most of the analyses I found are for either the Customer domain or the Product domain, and they include the multi-domain types as well.

So, to round up the top list equitably, I looked at Gartner research as well; thanks to the vendors, most of the reports are in the public domain. There is a report from Gartner which you can buy if you need a complete analysis. I am not sure how one gets on the list in this research, but I assume that if a tool’s market share is big enough or its technology is far superior, it made the list. Just a disclaimer: my intention is not to write a research paper but to offer commentary and some observations.

I looked at the 2009, 2011/12 and 2013 magic quadrants for Product and Customer MDM. We see a few new companies and some missing ones. Going back to my Forrester slide from 2007 (see my earlier blog) gives us an idea of the types of companies approaching MDM and then retreating.

Reading the market news, and from my client experience, most medium to large enterprises do fall within the list of vendors we see here, but there are other vendors very much in the market. My feeling is also that the traditional data management software vendors are gaining market share through consolidation and improved product lines. I am sure the market will continue to surprise us with new products and services. Microsoft is still keeping a low profile in the MDM space; a robust MDM offering from Microsoft would be a game changer.

What is your observation? What is your experience?

customer_mdm

product_mdm

MDM Tool Vendor Landscape

My exposure to Master Data Management as a tool, and to all the surrounding processes, organization and platforms, dates back to 2005 in one form or another. MDM as a tool and its expected functionality are evolving constantly. I was curious to see what the MDM tool and vendor landscape looked like in 2006 compared to where it stands in 2014. The MDM market has typically been fragmented, with the majority of market share (over 50%) spread among small vendors.

As with any new technology, start-ups go for market share until consolidation happens. So let’s look at the charts and see how the marketplace has changed. My quick observation is that the big companies with no core data management expertise vanished along with their MDM products. Some of the data-rich companies stayed within that domain (D&B still has an MDM product).  So the large software vendors have secured their dominance in terms of product offerings and market share, though a lot of small vendors are still in the market. My experience is that MDM is gravitating towards a tool with bells & whistles, but two major themes remain strong: domain-specific MDM and multi-domain MDM. I also find that big vendors have multiple MDM products, which they may consolidate. I got a kick out of seeing some of the familiar but now-defunct companies. Enjoy!

MDM_tool_1  mdm_tool_3

Yarn – The Big Data Accelerator

Yarn… Yes, Hadoop may be changing everything, but when Yarn was released, the change pedal was pushed aggressively to the floor. Putting the technical details aside, the bottom line is that multiple concurrent workloads can now be executed and managed on Hadoop clusters. This “pluggable” service layer has separated the data-processing and cluster-resource-management layers. The result is that we are no longer dependent on MapReduce to access and process HDFS data.

Most companies with products accessing HDFS data are doing it without MapReduce. Oracle, SAS, IBM and many niche providers run their own software components on the data nodes. This will change the dynamics of how we construct clusters: more memory and more CPU will be required to support these additional processing requirements. It is too early to tell whether we should beef up our nodes or add more nodes. Short of running your own POC and tests, keep an eye on the “all-in-one” appliance vendors as they bring out their new appliances this year. How they move will be a good indicator.

Does any vendor have a “silver bullet”? Until these solutions get into production and mature, there will be challenges. However, they will still provide exceptional value creation – even with the associated headaches. Do not shy away. Do your due diligence and choose tools that leverage your current capabilities. Big Data is here to stay, and you need to move forward or be left behind. The accelerator has been pushed. Are you stuck in neutral, or are you in the race to develop a competitive advantage from Big Data?

If you want to learn how to quickly gain value from your Big Data, contact Perficient!

“Accelerate your Insights” – Indeed!

I have to say, I was very excited today as I listened to Satya Nadella describe the capabilities of the new SQL 2014 Data Platform during the Accelerate your Insights event. My excitement wasn’t piqued by the mechanical wizardry of working with a new DB platform, nor was it driven by a need to be the first to add another version label to my resume. Because I manage a national Business Intelligence practice, my excitement was fueled by seeing Microsoft’s dedication to providing a truly ubiquitous analytics platform that addresses the rapidly changing needs of the clients I interact with on a daily basis.

If you’ve followed the BI/DW space for any length of time you’re surely familiar with the explosion of data, the need for self-service analytics and perhaps even the power of in-memory computing models. You probably also know that the Microsoft BI platform has several new tools (e.g. PowerPivot, Power View, etc.) that run inside of Excel while leveraging the latest in in-memory technology.

But… to be able to expand your analysis into the Internet of Things (IoT) with a new Azure Intelligent Systems Service and apply new advanced algorithms, all while empowering your ‘data culture’ through new hybrid architectures… that was news to me!

OK, to be fair, part of that last paragraph wasn’t announced during the keynote; it came from meetings I attended earlier this week that I’m not at liberty to discuss. But suffice it to say, I see the vision!

What is the vision? The vision is that every company should consider what their Data Dividend is.


DataDividend
Diagram: Microsoft Data Dividend Formula

Why am I so happy to see this vision stated the way it is? Because for years I’ve evangelized to my clients to think of their data as a ‘strategic asset’. And like any asset, if given the proper care and feeding, you should expect a return on it! Holy cow and hallelujah, someone is singing my song!! :-)

What does this vision mean for our clients? From a technical standpoint it means the traditional DW, although still useful, is an antiquated model. It means hybrid architectures are our future. It means the modern DW may not be recognizable to those slow to adopt.

From a business standpoint it means that we are one step closer to being constrained only by our imaginations on what we can analyze and how we’ll do it. It means we are one step closer to incorporating ambient intelligence into our analytical platforms.

So, in future posts and an upcoming webinar on the modern DW, let’s imagine…

Even Row Distribution on SSRS report

An SSRS report holds the maximum number of rows it can fit on a page.  The number of rows varies according to the page size, row width, location of the table, etc.  My report contains 99 records, and these records are unevenly distributed throughout the report: the report I created holds 36 rows on page 1, 38 on page 2 and the remainder on page 3.  Today, we will learn to distribute rows evenly across the pages of the report.

I have created a simple table for illustration.  This table contains 3 fields:

1.) ID
2.) Product_type
3.) Product_detail

I am querying all the records from my table and displaying the result in my SSRS table report.  For those interested, here is the SQL query:

SELECT *
FROM [candle_soap_shop$]

My requirement is to display 10 rows per page.  This means I will have 10 rows on each of pages 1 through 9, and the remaining 9 rows on page 10.

To do that, I will have to make groups of 10 rows and display these groups on different pages.  It seems complicated but it really isn’t.
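
Conceptually, the grouping is just ceiling division over the row number. As a rough illustration of the arithmetic only (the SSRS side is configured through a group expression, and the function below is a stand-in, not SSRS code), here is a Python sketch of bucketing our 99 records into groups of 10:

```python
import math

def page_group(row_number, rows_per_page=10):
    """Bucket 1-based row numbers into groups:
    rows 1-10 -> group 1, rows 11-20 -> group 2, and so on."""
    return math.ceil(row_number / rows_per_page)

# 99 records, as in the report above.
groups = [page_group(n) for n in range(1, 100)]

# Count how many rows land in each group: groups 1-9 hold 10 rows each,
# and group 10 holds the last 9.
counts = {g: groups.count(g) for g in sorted(set(groups))}
print(counts)
```

Each group then becomes one page of the report once a page break is set between group instances.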

The first thing to do is right-click on Details, click Add Group, and select Parent Group.

pic1

You will see a pop-up window.  Instead of choosing an option from the drop-down, click on fx, the button for entering a formula (also found in Microsoft Excel).

Read the rest of this post »