Perficient Business Intelligence Solutions Blog


Posts Tagged ‘business intelligence’

KScope14 Session: The Reverse Star Schema

This week, Perficient is exhibiting and presenting at KScope14 in Seattle, WA. On Monday, June 23, my colleague Patrick Abram gave a great presentation on empowering restaurant operations through analytics. An overview of Patrick’s presentation and Perficient’s retail-focused solutions can be found in Patrick’s blog post.

Today, Wednesday, June 25, I gave my presentation on Reverse Star Schemas, a logical implementation technique that addresses increasingly complex business questions.  Here is the abstract for my presentation:

It has long been accepted that classically designed dimensional models provide the foundations for effective Business Intelligence applications. But what about those cases in which the facts and their related dimensions are not, in fact, the answers? Introducing the Reverse Star Schema, a critical pillar of business-driven Business Intelligence applications. This session will run through the what’s, why’s, and when’s of Reverse Star Schemas, highlight real-world case studies at one of the nation’s top-tier health systems, demonstrate OBIEE implementation techniques, and prepare you for architecting the complex and sophisticated Business Intelligence applications of the future.

When implemented logically in OBIEE, the Reverse Star Schema empowers BI Architects and Developers to quickly deploy analytic environments and applications that address the complex questions of the mature business user.


KScope14 Session: Empower Mobile Restaurant Operations Analytics

Perficient is exhibiting and presenting this week at KScope14 in Seattle, WA. On Monday, June 23, I presented my retail-focused solution offering, built upon the success of Perficient’s Retail Pathways but using the Oracle suite of products. To focus the discussion within a one-hour window, I chose restaurant operations to represent the solution.

Here is the abstract for my presentation.

Multi-unit, multi-concept restaurant companies face challenging reporting requirements. How should they compare promotion, holiday, and labor performance data across concepts? How should they maximize fraud detection capabilities? How should they arm restaurant operators with the data they need to react to changes affecting day-to-day operations as well as over-time goals? An industry-leading data model, integrated metadata, and prebuilt reports and dashboards deliver the answers to these questions and more. Deliver relevant, actionable mobile analytics for the restaurant industry with an integrated solution of Oracle Business Intelligence and Oracle Endeca Information Discovery.

We have tentatively chosen to brand this offering as “Crave – Designed by Perficient. Powered by Oracle.” This way we can differentiate the new Oracle-based offering from the current Retail Pathways offering.

Crave Logo


Is IT ready for Innovation in Information Management?

Information Technology (IT) has come a long way from being a pure delivery organization to one that is part of the business innovation strategy, though a lot still has to change in the coming years. Depending on the industry and the company culture, most IT organizations fall on the operational end of the spectrum, while the more progressive ones are gravitating toward innovation. Typically, IT may be consulted on executing the strategic vision. It is not IT’s role to lead the business strategy, but data and information are another story. IT is uniquely positioned to innovate in Information Management because of its knowledge of the data; if it does not take up that challenge, the business will look outside for innovation. Today’s marketplace offers tools and technologies directly to business users, and they will bypass IT organizations that are not ready for the information challenge. A good example is business users trying out third-party cloud services and self-service BI tools for slicing and dicing data, cutting down the development cycle on their own. The only way IT can play the strategic game is to get into the game.

It is almost impossible for IT to ignore data and simply bury its head in keep-the-lights-on projects. So I took a stab at cataloging the types of products and technologies that have matured over the last five years in the Data Management space. This is by no means a complete list, but it captures the essence.

[Figure: Data Management products and technologies maturing over the last five years]

An interesting phenomenon is that many companies traditionally late to adopt a data-driven approach are now using analytical tools, as the tools have become visually appealing and affordable. Cloud adoption is another trend, making technology deployment and management possible without a huge IT bottleneck.

The question every IT organization, irrespective of company size, should ask is: are we ready to take on a strategic role in the enterprise? How well can we co-lead the business solution rather than just implement an application after the fact? Data Management is one area where IT needs to take the lead in educating the business and driving innovation to solve business problems. Predictive analytics and Big Data are right on top, along with all the necessary supporting platforms, including Data Quality, Master Data Management, and Governance.

It will be interesting to know how many IT organizations leverage the Information Management opportunity.

[Figure: list of Data Management tools]

 

How to Report on Employee Utilization in OBIEE?

One of the common HR reporting needs is to determine the Utilization and Availability of employees. These metrics may also be studied at a higher level. For example, checking Workforce Utilization Percentages across a company’s different organizations provides insight into how overstaffed or understaffed each organization is. This blog describes an OBIEE design methodology to support such reporting requirements.

A quick functional overview of how Utilization is calculated

While Utilization % tells how much actual work an employee has completed compared to their overall capacity, Availability indicates the remainder of the time, during which an employee has been inactive or non-utilizable. For example, if someone’s Utilization is 80%, their Availability is 20% (100 – 80).

Utilization is defined as the ratio of Hours Worked over Capacity. Hours Worked is a function of the actual hours entered on a timecard throughout an employee’s workweek, and there may be several variations of what defines Hours Worked, depending on the organization’s specific definition of which types of timecard hours are utilizable. For instance, a consulting firm may count hours billable to a client as utilizable, but not hours spent on non-billable categories such as bench time and vacations. Capacity is typically a standard number of hours an employee is expected to work, irrespective of what gets entered on timesheets. For example, an employee who works 8-hour workdays has a capacity of 40 hours a week, whereas a part-time employee who works 3 days a week has a capacity of 24 hours. Capacity usually excludes standard holiday hours, as such hours are not expected to be utilizable in the first place.

Following is a summary of the key metrics:

Utilization % = 100 x Hours Worked / Capacity

Availability % = 100 – Utilization %

Hours Worked: Timecard Hours that are considered utilizable

Capacity: Standard Work Schedule Hours – Standard Holiday Hours
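To make the arithmetic concrete, here is a minimal Python sketch of these metrics; the function and parameter names are illustrative, not taken from any particular source system:

```python
def utilization_metrics(hours_worked, schedule_hours, holiday_hours):
    """Compute Utilization % and Availability % for one employee and period.

    hours_worked   -- timecard hours considered utilizable (e.g. billable)
    schedule_hours -- standard work schedule hours for the period
    holiday_hours  -- standard holiday hours falling in the period
    """
    capacity = schedule_hours - holiday_hours
    if capacity <= 0:
        return 0.0, 0.0  # no capacity in the period, e.g. entirely on holiday
    utilization_pct = 100.0 * hours_worked / capacity
    availability_pct = 100.0 - utilization_pct
    return utilization_pct, availability_pct

# A 40-hour week containing 8 holiday hours and 24 utilizable hours:
# capacity = 32, so utilization = 75% and availability = 25%.
print(utilization_metrics(24, 40, 8))
```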

 

Data Model

No matter what transactional system your data is sourced from, Hours Worked and Capacity are most likely stored in different tables in that system. For example, in Oracle E-Business Suite, Hours Worked is sourced from the Oracle Time and Labor timecard tables, whereas Capacity is sourced from the HR assignment tables that associate employees with their corresponding work schedules and holiday calendars.

In my data warehouse model supporting Utilization calculations, I use two facts: a Timecard Fact and a Capacity Fact. Not all the dimensions in the two star schemas are conforming. For example, the Timecard Fact has dimensions that describe the type of hours: whether they are billable or not, vacation hours or project hours, work hours performed onsite or remotely, and so on. Such timecard attributes are not relevant when we talk about capacity facts. For this reason, if we stored both metrics (Hours Worked and Capacity Hours) in the same fact table, we would end up with an incorrect capacity, since capacity does not relate to all the timecard dimensions. Following is my schema for both stars, where Project, Task, and Time Entry Status are non-conforming dimensions:

[Figure: Timecard and Capacity star schemas]

 

OBIEE Design

In the RPD business model layer, I built three logical facts, and the same facts are made available in the Presentation layer:

  1. Timecard Fact: Sourced from the timecard OLAP fact table
  2. Capacity Fact: Sourced from the capacity OLAP fact table
  3. Utilization Fact: This fact has no physical data sources as all the metrics are based on the other 2 logical facts.

[Figure: the three logical facts in the RPD]

I am now able to build a simple trend report that shows utilization broken down by Organization. Such a report is straightforward to build since both the Time and Organization dimensions are conforming between both facts: Timecard and Capacity.

[Figure: utilization trend report by Organization]
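As a rough illustration of what the derived Utilization Fact accomplishes (a simplified Python sketch, not the actual OBIEE logical layer; the column names and numbers are illustrative), each physical fact is aggregated separately along the conforming Organization dimension before the ratio is taken:

```python
from collections import defaultdict

# Illustrative fact rows: (organization, hours)
timecard_fact = [("Consulting", 1200.0), ("Support", 800.0), ("Consulting", 300.0)]
capacity_fact = [("Consulting", 2000.0), ("Support", 1000.0)]

def total_by_org(rows):
    """Aggregate a fact's hours by the conforming Organization dimension."""
    totals = defaultdict(float)
    for org, hours in rows:
        totals[org] += hours
    return totals

worked = total_by_org(timecard_fact)    # aggregate the Timecard Fact
capacity = total_by_org(capacity_fact)  # aggregate the Capacity Fact

# "Utilization Fact": derived from the two aggregates on the conforming dimension.
for org in sorted(capacity):
    utilization = 100.0 * worked.get(org, 0.0) / capacity[org]
    print(f"{org}: Utilization {utilization:.1f}%, Availability {100 - utilization:.1f}%")
```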

A more advanced reporting requirement may ask for utilization to be dynamically recalculated in the report based on additional prompts on dimensions like Time Entry Status, Project, or Task. These dimensions are not conforming and therefore cannot be added as prompts in the typical way. If you are interested in adding dynamic prompting on timecard-specific dimensions, you can see an example of how that is possible in my other blog post: OBIEE Prompting on Non-Conforming Dimensions.

More on the MDM platform…

Picking up from my earlier blog post, there are two kinds of MDM tools: those that target a specific domain (Customer and Product are the most common) and those that follow a multi-domain strategy (Customer, Product, Location, Supplier, etc., all in one). Most of the analyses I found cover either the Customer domain or the Product domain, with multi-domain tools included in both.

So, to round out the top list equitably, I looked at Gartner research as well; thanks to the vendors, most of the reports are in the public domain. There is also a report you can buy from Gartner if you need the complete analysis. I am not sure how a vendor gets onto these research lists, but I am assuming that if a tool’s market share is big enough or its technology is clearly superior, it should have made the cut. Just a disclaimer: my intention is not to write a research paper, only commentary and some observations.

I looked at the 2009, 2011/12, and 2013 Magic Quadrants for Product and Customer MDM. We see a few new companies and some missing ones. Going back to my Forrester slide from 2007 (see my earlier blog) gives an idea of the types of companies that approached MDM and then retreated.

Reading the market news, and based on my client experience, most medium to large enterprises do choose from the list of vendors we see here, though other vendors are very much in the market. My feeling is that the traditional Data Management software vendors are gaining market share through consolidation and improved product lines. I am sure the market will continue to surprise us with new products and services. Microsoft is still playing it low-key in the MDM space; a robust MDM offering from Microsoft would be a game changer.

What is your observation? What is your experience?

[Figure: Customer MDM Magic Quadrants]

[Figure: Product MDM Magic Quadrants]

OBIEE Prompting on Non-Conformed Dimensions

A report that uses multiple facts may need to be prompted on dimensions that do not conform to all of those facts. At first, one may think such functionality is not valid. This post demonstrates that such reporting requirements are common and are achievable in OBIEE, though not in a very straightforward manner.

It is a basic OBIEE reporting concept that a report using metrics from more than one fact requires all dimensional columns to be conformed across the facts used in the report. In other words, it makes no sense to look at a side-by-side comparison of revenue and cost by product if the cost information is not available by product to start with. However, it is a valid question to ask how revenue generated from certain products compares to the overall cost. Requirements like this usually leave us facing the problem of developing a report that sources data from two facts: a revenue fact supporting a product dimension, and a cost fact that does not support the product dimension. At first, one may be tempted to tell the requester that such a report is not possible, since we are dealing with a “multiple facts and a non-conforming dimension” situation. But a closer look reveals that such requirements are completely valid from a functional perspective and therefore should be doable. The remaining problem is that prompting a report on a non-conforming dimension leaves OBIEE at a loss on how to aggregate a metric along a dimension it is not linked to.
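As a minimal sketch of the functional requirement itself (not the OBIEE implementation technique, which the full post covers), the product prompt should filter only the fact that carries the product dimension, while the cost fact stays at its total level; the names and figures below are purely illustrative:

```python
# Illustrative fact rows; product is a non-conforming dimension for cost.
revenue_fact = [("Widget", 500.0), ("Gadget", 300.0), ("Widget", 200.0)]  # (product, revenue)
cost_fact = [900.0, 450.0]                                                # cost rows, no product

def report(selected_products):
    # The prompt filters only the fact that carries the product dimension.
    revenue = sum(r for p, r in revenue_fact if p in selected_products)
    total_cost = sum(cost_fact)  # cost cannot be sliced by product
    return revenue, total_cost

# Revenue for "Widget" compared against the overall cost.
print(report({"Widget"}))  # (700.0, 1350.0)
```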

MDM Tool Vendor Landscape

My exposure to Master Data Management as a tool, and to all the surrounding process, organization, and platforms, dates back to 2005 in one form or another. MDM as a tool, and its expected functionality, are evolving constantly. I was curious to see what the MDM tool and vendor landscape looked like in 2006 compared to where it stands in 2014. The MDM market has typically been fragmented, with the majority of the market share (over 50%) held by small vendors.

As with any new technology, start-ups go after market share until consolidation happens. So let’s look at the charts and see how the marketplace has changed. My quick observation is that the big companies with no core Data Management expertise vanished along with their MDM products. Some of the data-rich companies stayed within that domain (D&B still has an MDM product). The large software vendors have secured their dominance in product offerings and market share, though a lot of small vendors are still in the market. My experience is that MDM is gravitating toward tools with more bells and whistles, but two major themes remain strong: domain-specific MDM and multi-domain MDM. I also find that big vendors have multiple MDM products, which they may consolidate. I got a kick out of seeing some of the familiar but now-defunct companies. Enjoy!

[Figures: MDM vendor landscape charts, 2006 and 2014]

Stages of MDM…

MDM is a popular topic, and many organizations are at different stages of the MDM journey. Many times clients (primarily IT) want to engage consultants who can recommend an MDM tool and start the implementation, bypassing the planning and pre-planning stages. Typically this leads to an MDM solution that is not completely thought through, or to the same Master Data problems persisting even after implementing the tool.

One of my previous clients ran into the following situation after implementing MDM. The IT department had a very capable CIO and a strong technical team, and in this case IT drove the MDM implementation. The team completed the implementation successfully, but users hardly noticed the change or the improvement. The CIO recognized this right away and challenged his team to find a remedy to improve the perception. What happened?

[Figure: the three stages of the MDM journey]

Let us step back and look at the big picture of MDM and the various stages one has to go through for a successful program. The three major stages of the MDM journey are:

  • Planning / Pre-Planning stage
  • Development / Implementation stage
  • Steady state or ongoing support stage

Understanding the details of MDM helps align people, process, and technology across these stages. Taking a holistic approach and developing the overall vision requires Business and IT working together. Analyzing the situation above told us:

  • The users (Business) were not engaged in all stages, in the right roles and at the right level (applies to all stages)
  • The governance organization was not deep enough (applies to all stages)
  • Clear communication of benefits, and metrics to track them, were not in place (Planning, Steady State)
  • The overall vision did not engage the Business deeply enough in ownership and monitoring (Planning, Steady State)
  • The steady-state stage was not fully thought out (e.g., competency center) (Steady State)

 

At this point they reached outside the organization for help to improve the perception and Business participation. IT understood the underlying technical MDM issues and had even solved some of the complex data quality issues, but the problem was that some of the approaches were fundamentally wrong. Granted, this happened several years ago, and one would imagine we would learn from these case studies and approach MDM differently. But to my surprise, even today we casually get questions like “Can you suggest an MDM tool?” without the asker thinking through the implications of embarking on an MDM journey.

Understanding each of the three MDM stages, engaging the business, and communicating back are critical parts of a successful MDM program. Part of the MDM challenge is getting the Business engaged in defining policies, performance metrics, and so on, beyond just implementing the MDM tool. In my experience, nimble and agile approaches are an option, but that doesn’t mean you skip taking the time to understand the magnitude of the issue and lay out a well-thought-out strategy. Finally, MDM is more than just an IT solution; though it saves IT a lot of headaches, it is an ongoing partnership between Business and IT.

In-Memory Database Solutions

After attending many marketing sessions on in-memory databases such as HP Vertica, Oracle Exalytics, Kognitio, SAP HANA, and SQL Server 2014 with its in-memory capabilities, I finally got an opportunity to look under the hood. I attended SAP HANA training and had firsthand experience playing with the in-memory database. Gartner says at least 35% of midsize and large organizations will adopt In-Memory Computing (IMC) in some form.

In-memory databases have been on the market for a while, but considering them for serious use requires a more in-depth understanding. Here are my first impressions of what I saw beyond the marketing brochures and PowerPoints.

First, let me give you a quick definition, though not a complete one, to get you started. Imagine a database with all the standard DBA activities (creating tables, the physical model, and so on) plus analytics-related tweaks such as aggregate functions to create summary tables, and even columnar tables, all held in memory: that is an in-memory database.
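To illustrate the columnar part of that definition, here is a toy sketch (a simplification, not how HANA or any other vendor actually implements storage) contrasting a row-oriented layout with a column-oriented one, where an aggregate only needs to scan the single column it touches:

```python
# Row-oriented: each record is a dict; aggregating one measure touches every row object.
rows = [
    {"region": "East", "qty": 10, "amount": 120.0},
    {"region": "West", "qty": 4, "amount": 55.0},
    {"region": "East", "qty": 7, "amount": 80.0},
]
row_total = sum(r["amount"] for r in rows)

# Column-oriented (the in-memory analytic layout): each column is a contiguous array,
# so an aggregate scans only the one column it needs.
columns = {
    "region": ["East", "West", "East"],
    "qty": [10, 4, 7],
    "amount": [120.0, 55.0, 80.0],
}
col_total = sum(columns["amount"])

assert row_total == col_total == 255.0
```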

SAP pitches HANA primarily to its SAP ECC customers, especially for moving BW (Business Warehouse) to HANA for speed, though HANA can be used in non-SAP environments as well. The biggest benefit is data access speed. The demos and sales presentations of other in-memory database vendors such as HP Vertica also pitched mainly analytics speed, and in that regard HANA is right up there as well. One caveat is that SUSE Linux is the only supported OS for HANA, though multiple hardware vendors offer the solution.

The question is: if this database is faster, can it replace the traditional database? In general, I don’t see that happening, for many reasons; for starters, the traditional data modeling tools don’t work with it. It is far from a fully evolved database solution and still depends on the underlying relational database. Having said that, you could theoretically replace the traditional database with HANA for a given application; I just did not see many vendors pushing that use case.

Finally, should companies include in-memory databases in their EIM strategy? I think the time is ripe for using them for specific analytical solutions, especially where speed is critical. They do increase the investment in support and management, as they involve both hardware and software. But based on industry trends and the volume of data being analyzed, it makes sense to explore and invest in these technologies.

MDM starting troubles …

Today, Master Data Management (MDM) is an accepted mainstream tool, program, and application within corporate America. Large companies have multiple MDM solutions across business units and geographical units; even within the same business unit, they have different MDM solutions for different domains. It is becoming a common sight to see MDM fully functioning and operating, despite steady-state operational issues. But it is still a far-fetched idea for SMB companies that have done business a particular way for a long time.

Here is my version of the When Harry Met Sally story. I was consulting out of town when I got a call from a recruiter saying I should talk to a new director, let’s call him Bob (he is no Sally), at a $2B manufacturing company. They were in the process of launching a new ERP implementation and needed help with Data Management. I met him and we talked, and he was so thrilled with the MDM idea that he recommended to his executive leadership that they create a new role in Data Management. Nothing happened. A year later I was back in my home town attending a seminar. I met Bob again, introduced him to my company’s sales director, and we talked enthusiastically for a while. He pulled his VP of IT into the discussion, and again, nothing happened. A year later I ran into him at another event, and he said his company was merging with another company; he was still upbeat about Master Data Management. In case you are wondering, he was not having the salad. This was two years ago. The point is, some companies don’t have the culture to embrace new technologies or concepts. It is the leaders within the company who can bring it into the current century, and sometimes it is not possible to do all the internal selling without external help.

Though they are a bit dated, and you may have seen various versions of these MDM excuses, it is time to revisit them because they are still a valid roadblock, especially for companies just starting to entertain the MDM idea. I have heard these several times in the past, and surprisingly they are still in use:

  • We are implementing ERP – we don’t need MDM!
  • We have Data warehouse that should take care of it!

YES, you do!

Most ERPs can manage Master Data within their own environment. But multiple hierarchies, and working with other ERPs to manage similar Master Data, are not addressed within the ERP world. There is no matching and merging, no lineage, and no managed history. Above all, there is no overarching Data Governance to manage the quality, accessibility, and availability of data across applications. The data warehouse receives all this poor-quality Master Data from upstream applications, only to cleanse it at the end against mismatched transactions. With no match/merge or governance done in the upstream systems, the result is a data reconciliation nightmare. IT knows the situation, but selling it internally takes a different level of thinking, and IT should take all the help it can get to make it happen.
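As a rough illustration of what matching and merging involves (a toy sketch, not any vendor’s matching engine; the record layouts and thresholds are made up for the example), consider normalizing and fuzzy-matching customer records from two hypothetical upstream systems into a single golden record that keeps lineage back to both sources:

```python
from difflib import SequenceMatcher

erp_customers = [{"id": "E-101", "name": "Acme Corp.", "city": "Chicago"}]
crm_customers = [{"id": "C-9",   "name": "ACME Corporation", "city": "Chicago"}]

def normalize(name):
    # Crude normalization: lowercase, strip punctuation and common legal suffixes.
    name = name.lower().replace(".", "").replace(",", "").strip()
    for suffix in (" corporation", " corp", " inc", " llc"):
        if name.endswith(suffix):
            name = name[: -len(suffix)]
    return name.strip()

def is_match(a, b, threshold=0.85):
    # Fuzzy-compare normalized names; require the same city as a simple blocking rule.
    score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    return score >= threshold and a["city"] == b["city"]

# Merge matching records into one golden record, keeping lineage to both sources.
for e in erp_customers:
    for c in crm_customers:
        if is_match(e, c):
            golden = {"name": e["name"], "city": e["city"], "source_ids": [e["id"], c["id"]]}
            print(golden)
```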

IT is in the hot seat to drive MDM initiatives, if they do get approved. There are key strategies for making an MDM initiative successful: don’t approach it like an IT project, and get some form of Governance initiated. Listen to Einstein’s advice: “We cannot solve our problems with the same level of thinking that created them.” Get the Business involved, and leverage the Governance organization and Executive support!