Perficient Business Intelligence Solutions Blog

Duane Schafer

“Accelerate your Insights” – Indeed!

I have to say, I was very excited today as I listened to Satya Nadella describe the capabilities of the new SQL 2014 Data Platform during the Accelerate your Insights event. My excitement wasn’t piqued by the mechanical wizardry of working with a new DB platform, nor was it driven by a need to be the first to add another version label to my resume. Considering that I manage a national Business Intelligence practice, my excitement was fueled by seeing Microsoft’s dedication to providing a truly ubiquitous analytics platform that addresses the rapidly changing needs of the clients I interact with on a daily basis.

If you’ve followed the BI/DW space for any length of time you’re surely familiar with the explosion of data, the need for self-service analytics and perhaps even the power of in-memory computing models. You probably also know that the Microsoft BI platform has several new tools (e.g. PowerPivot, Power View, etc.) that run inside of Excel while leveraging the latest in in-memory technology.

But to be able to expand your analysis into the Internet of Things (IoT) with a new Azure Intelligent Systems Service and apply new advanced algorithms, all while empowering your ‘data culture’ through new hybrid architectures… that was news to me!

OK, to be fair, part of that last paragraph wasn’t announced during the keynote. It came from meetings I attended earlier this week that I’m not at liberty to discuss, but suffice it to say, I see the vision!

What is the vision? The vision is that every company should consider what their Data Dividend is.

Diagram: Microsoft Data Dividend Formula

Why am I so happy to see this vision stated the way it is? Because for years I’ve evangelized to my clients that they should think of their data as a ‘strategic asset’. And like any asset, if given the proper care and feeding, you should expect a return on it! Holy cow and hallelujah, someone is singing my song!! :-)

What does this vision mean for our clients? From a technical standpoint it means the traditional DW, although still useful, is an antiquated model. It means hybrid architectures are our future. It means the modern DW may not be recognizable to those slow to adopt.

From a business standpoint it means that we are one step closer to being constrained only by our imaginations on what we can analyze and how we’ll do it. It means we are one step closer to incorporating ambient intelligence into our analytical platforms.

So, in future posts and an upcoming webinar on the modern DW, let’s imagine…

Technology Confusion

While returning from a client presentation and reflecting on the meeting conversations I was struck by a similarity that seems to be creeping into the minds of our clients.

While discussing our approach to performing a strategy assessment for this new client, we were reviewing an example architectural diagram when a question was raised. One of the business sponsors commented that the ‘Operational Data Store’ referenced on the diagram seemed like an ‘archaic’ term from the past that might not be appropriate for their new platform. I explained that they may need a hybrid environment and that each technology had its place.

However, on the plane ride home I realized that I had heard a similar question just a few weeks earlier. I was at a different client (manufacturing as opposed to software), in a different part of the country, speaking about a different type of proposal, although both would have resulted in architectural enhancements. There, a stakeholder asked about ‘new data warehouse’ technology such as Hadoop replacing the ‘older’ data warehouse paradigm we were discussing.

On both occasions I knew that the client wasn’t challenging my ideas as much as wanting to understand my recommendations better. What I knew in my head, but had failed to initially describe to both clients, was the concept of ‘replacement’ technologies versus ‘complementary’ technologies. Honestly, it had never occurred to me that I needed to make such a distinction, since I wasn’t recommending both technologies. The client introduced the newer technology into the discussion, at which point I fell victim to the assumption that both clients had a base understanding of what the different technologies were used for.

To be clear, we’re talking about the Big Data technology Hadoop and the well-known process of building a data warehouse with the Kimball or Inmon approach. The former is relatively new and getting a lot of airplay as the latest thing. The latter is well known but has had its share of underwhelming successes.

So is Hadoop the new replacement for ‘traditional’ data warehousing? For that matter, is self-service BI a replacement for traditional dashboarding and reporting? How about Twitter, is it the replacement for traditional email or text messaging?

The answer is No. All of the technologies described are complementary technologies, not replacement technologies. These technologies offer additional capabilities with which to build more complete systems, but in some cases, certainly that of data warehousing, our clients are confusing them as carte blanche replacement options.
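To make the ‘complementary’ point concrete, here is a minimal Python sketch. Everything in it is hypothetical (the event data, the dimension table, the function names are mine, not from any client engagement): a Hadoop-style batch step boils down raw, schema-light events, and the traditional warehouse still supplies the conformed dimensions the business already trusts. Each side does what the other can’t.

```python
# Toy illustration of 'complementary, not replacement'. All names and
# figures are hypothetical.

raw_clickstream = [  # what you'd park in HDFS: huge, schema-light events
    {"user": "u1", "page": "/pricing"},
    {"user": "u2", "page": "/pricing"},
    {"user": "u1", "page": "/docs"},
]

warehouse_dim_page = {  # what the traditional DW still owns: a conformed dimension
    "/pricing": "Sales Funnel",
    "/docs": "Support",
}

def batch_aggregate(events):
    """The 'big data' side: condense raw events into page-level counts,
    the sort of job you'd hand to MapReduce or Hive."""
    counts = {}
    for e in events:
        counts[e["page"]] = counts.get(e["page"], 0) + 1
    return counts

def load_to_mart(counts, dim):
    """The warehouse side: join the aggregates to a conformed dimension so
    the business sees the familiar, trusted labels."""
    return {dim[page]: n for page, n in counts.items()}

print(load_to_mart(batch_aggregate(raw_clickstream), warehouse_dim_page))
# {'Sales Funnel': 2, 'Support': 1}
```

Neither half replaces the other: drop the batch step and the warehouse drowns in raw events; drop the warehouse and the business loses its conformed, trusted view.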

Considering that clear and concise messaging is fundamental to successful client engagements, I encourage all of our consultants to consider which category their respective technologies fall into and make sure your clients understand that positioning within their organization.

Why are BI Projects so difficult to implement? part 3

You can find the previous posts on this topic here: Part 1, Part 2, Part 2.5

So how do we reduce the amount of anxiety related to a BI project (thus making them less painful)? To start, the project team needs to be keenly aware of the following:

They must consistently over communicate that some process change will happen and what it is.

Why must they over communicate? It takes time for people to assimilate new ideas, and this is only after they actually start listening. A standard metric in television marketing states that a consumer must make 3 ‘mental connections’ to a TV ad before deciding if the product is relevant to them. Is your process change more interesting than that TV ad?

Ensure that everyone feels they are in this learning process together. In other words, don’t let some users put themselves on an island.

We’ve all been in a class at some point where someone started to fall behind and was too embarrassed to raise their hand, again. Don’t let your users fall behind.

Realize that the mechanical capabilities of moving through a new report or dashboard are not inherent in all users and may demand ‘more obvious’ training.

I just finished reading an interesting article about technology anxiety in an older workforce; the study cited interface design as a leading cause of anxiety, specifically the practice of designing an interface with a ‘layered menu’ system in which the user must remember both that there are ‘invisible options’ and the sequence of actions needed to find them. Dashboards typically employ this functionality through the ‘right-click’ context menu.

I immediately thought of the introduction of the Microsoft ‘ribbon’ and wondered if the ‘invisible option’ problem wasn’t a leading factor in that design change.

Finally, be prepared for the data anomaly effect.

I’ve written about this before, but it remains relevant. Users of a new analytical platform need to be prepared for the fact that they will encounter data anomalies. Some of these anomalies will turn out to be actual bugs, but some will not. Those that are not obvious bugs will require research. Research means delayed responses back to the business, and delayed responses mean frustrated business users if the project team has not embraced the recommendation of over-communicating.

Over time, the number of bugs will decrease, but the number of research requests is likely to increase (as shown in the example below), especially as new users are rolled on.

Additionally, there are a series of events that can cause these numbers to fluctuate. Version releases are an obvious source for bugs, but what about a key team member leaving?

In the example below, during October of 2012, we see a spike in research requests but no spike in bugs and no additional users added. The only major event that occurred was a new member joining the team. We can infer either that the previous support person had direct lines of communication open to the users (it’s very likely that hallway discussions answered some users’ questions) or that the new team member is recording ‘discussions’ differently. Regardless, this event may be a source of frustration to the business that has no direct tie to the functionality or stability of the system.

Production Support Analysis (Example)
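The kind of monthly support-ticket pattern described above can be sketched in a few lines of Python. The counts below are made up for illustration (they are not the actual client numbers behind the chart); the point is the rule itself: flag any month where research requests jump while bugs stay flat, because that pattern points at a people or process change rather than the platform.

```python
# Hypothetical monthly production-support counts (illustrative only):
# for each month we track confirmed bugs vs. open-ended research requests.
support_log = {
    "2012-07": {"bugs": 14, "research": 5},
    "2012-08": {"bugs": 11, "research": 6},
    "2012-09": {"bugs": 9,  "research": 7},
    "2012-10": {"bugs": 8,  "research": 19},  # research spike, no bug spike
    "2012-11": {"bugs": 6,  "research": 9},
}

def flag_research_spikes(log, factor=1.5):
    """Flag months where research requests jump by `factor` over the prior
    month while bugs stay flat or fall -- the pattern worth investigating
    (e.g. a staffing change rather than a platform problem)."""
    months = sorted(log)  # ISO-style keys sort chronologically
    flagged = []
    for prev, cur in zip(months, months[1:]):
        research_jump = log[cur]["research"] >= factor * log[prev]["research"]
        bugs_flat = log[cur]["bugs"] <= log[prev]["bugs"]
        if research_jump and bugs_flat:
            flagged.append(cur)
    return flagged

print(flag_research_spikes(support_log))  # ['2012-10']
```

A simple check like this won’t tell you *why* October spiked, but it separates “the system is breaking” from “something changed around the system”, which is exactly the conversation to have with a frustrated business sponsor.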

In conclusion, there are a number of reasons why a BI project can be difficult to implement, but not all of them are related to ‘BI technology’ per se. Challenging fundamental truths, brain pain and good old fashioned learning anxiety play a big role in the perceived success of an implementation.

Why are BI projects so difficult to implement? part 2.5

I hadn’t planned on exploring the next piece of the puzzle so soon, but this article jumped out at me. I just finished reading “Achieving Greater Agility with Business Intelligence” from TDWI and found several interesting comments.

The article was centered on ‘faster decision cycles and competitive pressures’, but it had a few points that were relevant to our topic. For instance:

Page 7, paragraph 2 states: “Amid this instability and increased—sometimes unexpected—competition, executives and managers doubt whether their forecasts will hold true. Operations managers have difficulty allocating resources and personnel because they lack confidence in their organizations’ planning and budget assumptions.” – Really? This sounds suspiciously like ‘challenging fundamental truths’ from my part-2 article.

Page 10, paragraph 2: “For better agility, data has to be put in a format that makes it relevant to the decision process.” – Agreed, and introducing a change to a business process (paragraph 2 from my part-2 article) can be a big hit to user adoption if not introduced properly.

Additionally (from Page 10, paragraph 3): “We asked research respondents how satisfied different types of users in their organizations are with their ability to access and analyze information to achieve objectives for which they are held accountable.” – And the highest level of satisfaction came from… Finance.

Really? Finance? Well, those people are just a bunch of number crun… oh wait, I get it: number crunchers are most comfortable with BI applications because they don’t get brain pain as easily as others. That sounds familiar as well. :-)

OK, so we have shown that we’re on the right track with our ‘pain and difficulties’ theory, but how do we mitigate it? We’ll talk about options and responsibilities next time.

Why are BI projects so difficult to implement? part 2

In a previous post (here) I posed the question regarding the difficulty we experience when implementing a BI project, and promised to address mitigating the pain that our project teams and clients typically experience. But before we can do that, we need to explore where the pain and difficulty comes from.

- Unfamiliar technologies, black box processes and governance concepts are difficult to explain and when the business tries to visualize how these concepts interact with one another, the picture is cloudy at best.
- New business processes (e.g. drilling down in a dashboard vs. looking up a spreadsheet cell) have to be learned, and the average age of our users doesn’t always bode well for this exercise.
- Fundamental ‘truths’ are often challenged with BI projects.

What do I mean by ‘fundamental truths’? The business is used to looking at their data through a very familiar lens. It could be Excel, it could be a report, but it’s familiar to them. They trust it. Or at least they know which pieces to trust. When a new way of viewing the data is introduced, it not only shines the light from a different direction (i.e. slice and dice); in some cases it exposes that their old trusted views were actually wrong. (And this is a hard conversation to have.)

Now, we’re not forgetting that the actual development of the BI platform is difficult, but the people doing that work actually enjoy it, so it no longer factors into our discussion.

Microsoft is raising the bar on Self-service BI!

I’m attending the SharePoint conference in Las Vegas, and man, are there some sights to see. Sleek and inviting, everything you would want in a BI platform, of course!

I work extensively with all of the technologies in the Microsoft BI stack, but I have already realized this week that the changes coming in the SharePoint and Office 2013 wave will affect how we address our customers’ analytic needs.

More on this soon…

SQL Server 2012 ad-hoc reporting in action

As the date of the SQL Server 2012 virtual release draws near, I’m getting more and more requests for client demos. And this should be no surprise as the new platform does show very well! So in an effort to spread the mindshare, here’s a link to a nice overview of PowerPivot and Power View from SQL Server 2012.

PowerPivot and Power View in action.

Remember, these demos can be tailored for specific scenarios so feel free to request a private showing!

EIM awareness with SQL Server 2012

I just finished reading the results of an interview I gave earlier this week to a technology webzine regarding the pending release of SQL Server. I was interested to know which parts of my 30-minute discussion with the technical writer would make it into the article. I was happy to see that the writer picked up on a key point that I made: “that the platform is maturing across the EIM landscape”. Read the entire article here: CRN Article

So what does this mean? It means the SQL Server platform is now driving EIM awareness into organizations of all sizes. Is your organization ready?

Magic Quadrant Madness, Denali and complete transparency

So I just read the recently published BI magic quadrant analysis from Gartner and noticed a slight adjustment in the bubbles. The Microsoft BI platform continues to increase along the ‘Completeness of Vision’ axis (X), but the ‘Ability to Execute’ measure (Y-axis) slipped a notch and I wanted to know why.

After reading the ‘Strengths’ section of the report, a lengthy task in itself, I blinked once and completely missed the ‘Cautions’ section, and had to go back and search for them again. :-) However, being transparent, I did find them, and this is what they had to say: “…because Microsoft’s BI platform capabilities exist across three different tools (Office, SQL Server and SharePoint) that also perform non-BI functions, integrating the necessary components can be complex…” The report then goes on to say: “…Microsoft’s do-it-yourself approach puts more of the BI solutions development and integration onus for the platform components on customers…”.

And I would say, “Yes, this is true”, however, knowing the platform pretty well, I would also add;
– Office, SQL Server and SharePoint? so… that means I already have the licenses right? …hhmmm, that sounds about right..
– Integration complexity? well… it is an enterprise platform after all right? …why.. yes it is..

OK then, problem solved!

“Well hold on”, you might say. “We’re understaffed now and don’t really have time to learn about the latest BI rocket ship you know!”

So… would a primer on the new SQL Server 2012 “Denali” platform help you? “Why, yes it would!”

OK then…

Problem solved.

Microsoft BI in the enterprise

I was demoing the Microsoft BI stack to a client the other day and although PerformancePoint, PowerPivot and Power View showed well as always, I was amazed that the Q&A session gravitated back to the same old questions that I always hear.

“Well what about Oracle?”
“Have the BI tools made it out of the department yet?”
“Is it really an enterprise platform?”

Really? You mean to tell me you don’t get your news from Channel 9 MSDN like the rest of us?! Uh… election? What election?

OK, so maybe I’m a little more in tune with this than some of you but that’s the reason we’re here, so let’s take a look at how this platform stacks up against these questions.

To start with, let’s go straight to Gartner’s Magic Quadrant for BI platforms. I assume most people have seen this, but maybe not. The top BI vendors get graded on several facets and ranked accordingly. Over the last few years Microsoft has consistently been in the upper-right quadrant, and there’s no reason to believe they’ll fall out of it.

Next, let’s look at Gartner’s rankings of the top ECM platforms. Once again, we see Microsoft right where we want them. But wait, we’re talking about BI platforms, not ECM, right?