
Maximizing Data Quality for Optimum EIM Results

In the planning phases of BI implementations and Big Data projects, it is not uncommon to overlook or underestimate the importance of data quality as a critical component in the success or failure of the program.  Without a strategy for ensuring the quality of the customer information being used for analysis, the results can be incomplete, indecipherable and misleading.

Many companies, even before a BI project starts, are already getting questionable results from whatever rudimentary analysis they currently perform.  Misinterpreting that data can leave an organization with a worse understanding of its customers than if it had never gathered and analyzed the data at all.  The old computing and programming adage of "garbage in, garbage out" clearly applies here.

Without a solid data quality program in place, the best algorithms and execution in the world will not deliver the competitive advantage or the increase in revenue and customer satisfaction that is the raison d'être of any BI initiative.  Unfortunately, data quality problems and their negative consequences are often only discovered once a BI project is already underway.


Whether it is storing duplicate or conflicting information, sending customers invalid bills, providing bad leads or incomplete information to the marketing and sales teams, or storing inaccurate personal information (names, addresses, phone numbers, etc.), all of these problems can lead to lost customers and lost revenue.
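As a minimal sketch of catching the inaccurate-personal-information problem above, a field-level validation pass can flag records before they reach billing or marketing systems. The field names and patterns below are hypothetical and illustrative, not production-grade rules:

```python
import re

# Illustrative patterns; real programs would use locale-aware rules.
PHONE_RE = re.compile(r"^\d{3}-\d{3}-\d{4}$")        # e.g. 314-555-0100
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # loose email shape

def validate_customer(record: dict) -> list:
    """Return a list of problems found in one customer record."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing name")
    if not PHONE_RE.match(record.get("phone", "")):
        problems.append("invalid phone")
    if not EMAIL_RE.match(record.get("email", "")):
        problems.append("invalid email")
    return problems
```

A clean record returns an empty list; anything else returns the specific defects, which can feed the lost-revenue reporting discussed here.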

To start a data quality program, organizations must have the buy-in of upper management.  That means leaders need to appreciate that the quality of the organization's data is not merely a tactical technology concern tied to a particular task, such as updating a database.  Management must see this data as the lifeblood of the company, and maintaining its quality as a strategic initiative to enhance and validate customer information, one of their most important corporate assets.  That information is the link to their customers, and it is those customers who support the business.  Accurate information allows for the most useful analysis of the customer, and also makes the customer's experience with the organization smoother and better, building confidence, trust and loyalty, and hopefully, future business.

After gaining the support of senior management, it is important to define what the organization wants to achieve with its data quality program, and how its success will be measured.  Which specific files, databases and applications are to be worked with, and most importantly, what specific business problems are these efforts meant to solve?  Which high-priority, data-related pains will provide the biggest benefits if resolved?  And what are the biggest data-related risks to the organization, those that could cost the most in revenue, customer relations, customer loyalty, bad publicity, etc.?
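Defining "how success will be measured" can be made concrete by turning a goal into a numeric metric with an agreed target. This sketch assumes a hypothetical completeness metric over required fields; the field list and target threshold are stand-ins an organization would choose for itself:

```python
REQUIRED_FIELDS = ["name", "address", "phone"]  # assumed priority fields
TARGET_COMPLETENESS = 0.95                      # assumed program target

def completeness(records: list) -> float:
    """Fraction of required field values that are present and non-empty."""
    total = len(records) * len(REQUIRED_FIELDS)
    if total == 0:
        return 1.0  # vacuously complete
    filled = sum(
        1 for r in records for f in REQUIRED_FIELDS if str(r.get(f, "")).strip()
    )
    return filled / total

records = [
    {"name": "Ada", "address": "1 Main St", "phone": "314-555-0100"},
    {"name": "Bob", "address": "", "phone": "314-555-0101"},
]
score = completeness(records)
print(f"completeness = {score:.2f}, target met: {score >= TARGET_COMPLETENESS}")
```

Tracking a metric like this over time gives both business and IT stakeholders a shared, unambiguous definition of progress.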

It is vital to ensure that the affected business units and the IT representatives are in sync on the goals, the testing methods, and how achievement will be measured.  The business users' buy-in (or at least cooperation) in deciding which data is good and which is bad is critical to the program's success.  There may be conflicts between the goals of upper management and those of the business users, with the attendant political issues to resolve.  Further, there may be disagreements over who is actually responsible for the quality of the data as it moves through the organization.  That is why establishing the business goals and the areas of responsibility up front is so important.

It is most beneficial if everyone in the organization recognizes the costs of invalid and incomplete data.  There should be periodic testing and auditing to ensure that duplication, omissions, inconsistencies and corruption are not taking place.  If this is done faithfully, it bodes well for the success of any data-related projects in the enterprise information landscape, and most especially BI.
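A periodic audit of the kind described above can be sketched as a single pass over a customer table that counts the problem classes named here. The record layout (an `id` key plus an `email` field) is a hypothetical example, not a prescribed schema:

```python
from collections import Counter

def audit(records: list) -> dict:
    """Count duplicates, omissions, and inconsistencies in one audit pass."""
    report = {"duplicates": 0, "omissions": 0, "inconsistencies": 0}
    # Duplication: more than one record carrying the same customer id.
    ids = Counter(r.get("id") for r in records)
    report["duplicates"] = sum(n - 1 for n in ids.values() if n > 1)
    # Omission: a required field is missing or blank.
    for r in records:
        if not str(r.get("email", "")).strip():
            report["omissions"] += 1
    # Inconsistency: the same id appears with conflicting email values.
    first_seen = {}
    for r in records:
        rid, email = r.get("id"), r.get("email")
        if rid in first_seen and first_seen[rid] != email:
            report["inconsistencies"] += 1
        first_seen.setdefault(rid, email)
    return report
```

Running a report like this on a schedule, and reviewing the counts with the business units, is one way to make the "periodic testing and auditing" above routine rather than aspirational.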


Neetu Shaw

As Perficient's Business Intelligence (BI) Company-Wide Practice leader, Neetu Shaw provides thought leadership in developing and implementing a common BI foundational framework for Perficient and our many BI/DW clients, including common services, methods, knowledge management and an integrated enablement plan for both sales and delivery. Neetu is a business-focused and solutions-driven information management professional with executive consulting experience. Her career has been dedicated to BI consulting, thought leadership and solution sales leadership with solid experience in all phases of program implementation from initial business visioning to ROI justification through execution.
