As more companies embark on Digital Transformation leveraging Big Data, key concerns and challenges are amplified, especially in the near term before the supply of technology and talent adjusts to demand. Looking back at the earlier post, Big Data Challenges, the top three concerns were:
Big Data skills can be broadly classified into four categories:
Creating value from, or monetizing, Big Data (see Architecture needed to monetize API’s) depends on business and analytical talent, and the talent gap is especially acute in the analytical area. Addressing the talent shortage through education and partner companies is critical for niche and must-have technologies. As tools evolve, keeping pace with the architecture becomes very important, since past tool and platform shortcomings are often addressed with new complexities.
While business continues to search for Big Data gold, system integrators and product vendors are perfecting methods to shrink time to market through best practices and modern architecture. How much of the gap can be closed depends on many factors specific to companies and their partners.
See also our webinar on: Creating a Next-Generation Big Data Architecture
As companies adapt to handle Big Data, the challenges remain. Barring the obvious applications, extracting value from the new-found data continues to top the list: ROI and potential revenues are yet to be realized. As the technology and its usage become more sophisticated, we will start to see results.
From an IT perspective, the top two challenges are governance and skills. Securing Big Data for wider use within the organization is complex, and the technology is still evolving. Securing the right people with in-depth knowledge of managing Big Data is an even bigger challenge. Both of these feed into the larger question of how to get value out of Big Data.
Organizations find that the key resources who need to drive Big Data are the same people who are vital to managing existing core enterprise applications. Balancing the precious time of key resources while leveraging external thought leadership and expertise is key to successful Big Data initiatives.
Identifying the organization's strengths and prioritizing the critical areas of investment is equally important.
IT spending is primarily focused on technologies to run the business, mainly operations. Yet with new ways of doing business, technology platforms decide the winners and losers: think brick-and-mortar retailers versus online stores. If you look at a CIO's budget, more than 70% goes to operational systems, infrastructure, and keep-the-lights-on applications; the rest is spent on customer-facing applications and systems.
With Digital Transformations happening at many enterprises, the shift in IT budgets is tracking the trend. Customer experience is one of the key strategies of successful companies. Armed with smartphones and tools for accessing information, customers are a step ahead of traditional organizations. Investments in new technologies such as Big Data, fast analytics, and proactive customer experience strategies built on converging technologies are not just futuristic; they have to be fully functional now.
CIOs are looking for ways to invest in new technologies that enhance customer experience and leverage data (internal and external) to deliver it accurately, not just to run operational systems. As more CIOs are invited to the business leadership table, business technology investment becomes a strategic asset to manage, leverage, and use to deliver a better customer experience (see the spending shift in CIOs face the "Age of the Customer").
Managing data has been a challenge irrespective of company size. Over the last couple of decades, most companies invested in leveraging enterprise data through a variety of initiatives such as Enterprise Data Warehouses and Business Intelligence.
If we divide enterprise information usage over the last two decades, it falls primarily into the following top categories:
Usage of data is typically managed in silos, and customer touch-points are not coordinated or even understood.
Today, data no longer resides within the enterprise alone. Customers are armed with smartphones, tablets, and other devices, and they engage through multiple channels. The amount of information available for immediate consumption, and the need to deliver appropriate responses, is a big challenge, especially if modernization (Digital Transformation) is not addressed.
Enterprise data continues to have its own challenges (security, growing volume, and the need for fast analytics), but customer interaction data is exploding across multiple channels. Managing all of these challenges means building a new architecture (tools, applications, platforms). Investing in Digital Transformation initiatives is key to building that modern architecture, which is vital to the survival of the enterprise. Approaching and understanding the nuances of Digital Transformation should be a top priority for 2015 for any modern enterprise.
What is broke? If I drive a pickup truck with a small, unobtrusive crack in the windshield and a few dings in the paint, it will still pull a boat and haul a load of lumber from Home Depot. Is the pickup broke if it still meets my needs?
So, when is data broke? In our legacy data integration practices, we would profile data and identify everything that was wrong with it. Orphan keys, inappropriate values, and incomplete data (to name a few) would be flagged before the data was moved. In the more stringent organizations, data needed to be near perfect before it could be used in a data warehouse. This ideal of perfect data was striven for but rarely attained: it was too expensive, required too much business buy-in, and lengthened BI and DW projects.
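The kinds of checks described above can be sketched in a few lines. This is a minimal, illustrative profiling pass; the records, column names, and valid-key set are invented for the example, not a real schema:

```python
# Minimal data-profiling sketch: flag, rather than fix, suspect records
# before deciding whether the data is "broke" for a given use.
# Table and column names here are hypothetical.

orders = [
    {"order_id": 1, "customer_id": 10, "amount": 250.0},
    {"order_id": 2, "customer_id": 99, "amount": -5.0},   # orphan key, bad value
    {"order_id": 3, "customer_id": 11, "amount": None},   # incomplete record
]
customer_ids = {10, 11, 12}  # the set of known-good keys

def profile(rows, valid_keys):
    """Collect (record id, problem) pairs instead of rejecting data outright."""
    issues = []
    for r in rows:
        if r["customer_id"] not in valid_keys:
            issues.append((r["order_id"], "orphan customer_id"))
        if r["amount"] is None:
            issues.append((r["order_id"], "missing amount"))
        elif r["amount"] < 0:
            issues.append((r["order_id"], "negative amount"))
    return issues

for order_id, problem in profile(orders, customer_ids):
    print(order_id, problem)
```

The point of returning issues rather than raising errors is that the business, not the pipeline, can then decide which flaws actually make the data unusable.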
Everyone is guilty of falling into a rut and building reports the same way over and over. This year, don't just churn out the same old reports; resolve to deliver better business intelligence. Think about what business intelligence means. Resolve, at least in your world, to make business intelligence about helping organizations improve business outcomes by making informed decisions. When the next report request lands on your desk, leave the tool of choice alone (Cognos, in my case) and think for a while. This applies even to those of you building your own reports in a self-service BI world.
Think about the business value. How will the user make better business decisions? Is the user trying to understand how to allocate capital? Is the user trying to improve patient care? Is the user trying to stem the loss of customers to a competitor? Is the user trying to find the right price point for their product? No matter what the ultimate objective, this gets you thinking like the business person and makes you realize the goal is not a report.
Think about the obstacles to getting the information. Is the existing report or system too slow? Is the data dirty or incorrect? Is the data too slow to arrive or too old to use? Is the existing system too arcane to use? You know the type: when the moon is full, stand on your left leg, squint, hit O-H-Ctrl-R-Alt-P, and the report comes out perfectly, if it doesn't time out. Think about it: if there were no obstacles, there would be no report request in your hands.
Think about the usage. Who is going to use the analysis? Where will they be using it? How will they get access to the reports? Can everyone see all the data or is some of it restricted? Are users allowed to share the data with others? How will the users interact with the data and information? When do the users need the information in their hands? How current does the data need to be? How often does the data need to be refreshed? How does the data have to interact with other systems? Thinking through the usage gives you a perspective beyond the parochial limits of your BI tool.
Think like Edward Tufte. What should the structure of the report look like? How would it look in black and white? What form should the presentation take? How should the objects be laid out? What visualizations should be used? (And those are never pie charts.) What components can be taken away without reducing the amount of information presented? What components can be added, in the same real estate, without littering, to improve the information provided? How can you minimize the clutter and maximize the information? Think about the flaws of write-once-deliver-anywhere, and the garish palettes many BI tools provide.
Think about performance. Is the user expecting an instantaneous response? A get-a-cup-of-tea-and-come-back response time? Is the user okay kicking off a job and getting the results the next morning? If you find one of those, cherish them! They are hard to find these days. Will the user immediately select the next action, or do they require some think time? Is the data set a couple of structured transactional records, or is it a chunk of a big-data lake? Does the data set live in one homogeneous source or across many heterogeneous sources? Thinking about performance early means you won't fall into a trap of missed expectations or an impossible implementation.
Think about data quality. It is a fact of life. How do you deal with and present missing data? How do you deal with incorrect values? How do you deal with out-of-bounds data? What is the cost of a decision made on bad data? What are the consequences of a decision made on incorrect data? What is the cost of perfect data? What is the value of better data? Thinking about quality before you start coding lets you find a balance between cost and value.
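One way to make those trade-offs concrete is to encode an explicit per-field policy so that bad values are surfaced in the report rather than silently hidden. A small sketch; the field bounds, sample readings, and the `n/a` sentinel are illustrative assumptions:

```python
# Sketch: explicit per-field quality policy for report data.
# Bounds, sample values, and the sentinel string are invented for illustration.

SENTINEL = "n/a"  # what the report displays for unusable values

def apply_policy(value, lo, hi):
    """Return (display_value, flag) so bad data is visible, not hidden."""
    if value is None:
        return SENTINEL, "missing"
    if not (lo <= value <= hi):
        return SENTINEL, "out of bounds"
    return value, "ok"

readings = [98.6, None, 451.0, 101.2]
rows = [apply_policy(v, lo=90.0, hi=110.0) for v in readings]
for display, flag in rows:
    print(display, flag)
```

Carrying the flag alongside the value lets the report show why a cell is blank, which is usually cheaper than chasing perfect data and safer than presenting bad data as good.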
Think about maintenance. Who is going to be responsible for modifications and changes? You know they are going to be needed; as good as you are, you won't get everything right. Is it better to quickly replicate a report multiple times and change the filters, or is it better to spend some extra time and use parameters and conditional code to have a single report serve many purposes? Is it better to use platform-specific outputs, or is it better to use a "hybrid" solution and support every output format from a single build? Are the reports expected to be viable in 10 years, or will they be redone in 10 weeks? Thinking through the maintenance needs will let you invest your time in the right areas.
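The single-report-with-parameters option can be illustrated outside any particular BI tool. A Python sketch, with an invented report body and filter fields:

```python
# Sketch: one parameterized report definition instead of N near-duplicate
# copies that differ only in their filters. Fields and data are illustrative.

def sales_report(rows, region=None, min_amount=0.0):
    """Single report definition; the filters arrive as parameters."""
    selected = [
        r for r in rows
        if (region is None or r["region"] == region)
        and r["amount"] >= min_amount
    ]
    return sum(r["amount"] for r in selected)

rows = [
    {"region": "east", "amount": 100.0},
    {"region": "west", "amount": 75.0},
]

# One build serves many requests: per-region views and the company-wide view.
print(sales_report(rows, region="east"))  # east only
print(sales_report(rows))                 # all regions
```

Every filter baked into a copied report is a future maintenance change multiplied by the number of copies; a parameter is that same change made once.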
Think you are ready to build? Think again. Think through your tool set's capabilities and match them to your needs. Think through your users' skills and match them to the tools. Think about your support team and let them know what you need. Think through your design and make sure it is viable.
Here’s to thinking better Business Intelligence throughout the year.
Everyone wants a piece of the Big Data action, whether you are part of a product company, a solution provider, IT, or a business user. Like every new technology, Big Data is confusing, complex, and intimidating. Though the idea is intriguing, the confusion begins when the techies start taking sides and tout the underlying tools rather than the solution. The fact is that picking the right architecture (tools, platforms) does matter. It involves several considerations, from understanding which technologies are appropriate for the organization to understanding the total cost of ownership.
When you look at organizations embarking on Big Data initiatives, most fall into the following three types.
Have experimented with several tools, with multiple deployments on multiple platforms by multiple business units or subsidiaries. Own several tool licenses, and have built several data applications or are experimenting now. Many data management applications are in production.
Loosely Centralized /Mostly De-centralized
Has an enterprise focus, but BU and departmental data applications are in use, along with several tools purchased over the years across various BUs and departments. Many data management applications are in production.
No major Data Applications
Yet to invest in major data applications; mostly relies on reports and spreadsheets.
In all of the above scenarios, IT leaders can make a big difference in shaping the vision for the Big Data journey. For many organizations, Big Data projects have been experimental, and the pressure to deliver tangible results is high, so an optimal tools strategy and standards typically take a back seat. At some point, however, they become a priority. The opportunity to focus on vision and strategy is easiest to sell when leadership changes within the organization. If you are the new manager brought in to tackle Big Data, use your first 90 days to formulate the strategy rather than get sucked into business as usual. Using that window to formulate a platform and tools standardization strategy is not only prudent but also stands a greater chance of approval. This strategic focus is critical for continued success and for avoiding investments with low returns.
The options within Big Data are vast. Everyone from vendors with legacy products to startup companies offers solutions. Traversing the maze of products without the right partners can lead to false starts and big project delays.
A few years ago, Big Data/Hadoop systems were generally a side project for either storing bulk data or for analytics. Now, as companies pursue a data unification strategy leveraging the Next Generation Data Architecture, Big Data and Hadoop systems are becoming a strategic necessity in the modern enterprise.
Big Data and Hadoop are technologies with great promise and a broad, deep value proposition. So why are enterprises struggling to see real-world results from their Big Data investments? Simply put, it is governance.
Data integration has changed. The old way of extracting data, moving it to a new server, transforming it, and then loading it into a new system for reporting and analytics now looks quite arcane. It is expensive, time-consuming, and does not scale to handle the volumes we now see in the digitally transformed enterprise.
We saw this coming with pushdown optimization and the early incarnations of Extract, Load, and Transform (ELT). Both of these architectural approaches were used to address scalability.
Hadoop took this to the next step: the whole basis of Hadoop is to process the data where it is stored. And this is bigger than Hadoop; the movement to cloud data integration will likewise require processing to happen where the data is stored.
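The contrast between moving data to the processing and pushing the processing down to the data can be shown with any SQL store. In this sketch, SQLite stands in for a warehouse, Hadoop SQL engine, or cloud data store; the table and values are invented:

```python
# Sketch: push the transform down to where the data lives (ELT style)
# instead of extracting every row and aggregating client-side (ETL style).
# SQLite here is a stand-in for a warehouse or Hadoop/cloud SQL engine.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount REAL)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# ETL style: move every row to the client, then transform there.
rows = con.execute("SELECT region, amount FROM sales").fetchall()
etl_totals = {}
for region, amount in rows:
    etl_totals[region] = etl_totals.get(region, 0.0) + amount

# ELT / pushdown style: the engine aggregates; only the results move.
elt_totals = dict(
    con.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)

print(etl_totals == elt_totals)  # same answer, far less data moved
```

With three rows the difference is invisible; at warehouse scale, the ETL variant ships the entire table across the network while the pushdown variant ships one row per region.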
To understand how a solution may scale in a Hadoop or cloud centric architecture, one will need to understand where processing happens with regards to where the data is stored. To do this, one needs to ask vendors three questions:
Of course, there is much more to evaluate; however, choosing technologies that keep the processing close to the data, instead of moving the data to the processing, will smooth the transition to the next-generation architecture. Follow Bill on Twitter @bigdata73
The year 2014 has just ended, and 2015 is already shaping up to be another year of data-intensive initiatives. Judging by the initiatives and investments of 2014, Big Data will continue to be at the top, and so will the cloud.
Enterprise data investment continues to grow as the technology laggards warm up to information governance in general. Big Data and the cloud bring additional compliance and security issues. While the products and offerings are still evolving, it is more important than ever to address data governance.
Some of the biggest trends in information management will be:
Data quality is the underlying thread in all of these data initiatives, if not the prime reason for them. Approaching information governance like any other project creates silos of information, or information that is less trustworthy. Having the right strategy and approach is the key to a successful implementation and transformation.
Many companies overlook the pockets of investment already made in information governance, or are stuck with the limitations of what they already have even when it is not the right approach. Creating the right vision for a future-state architecture that leverages existing investment is possible only if the total cost of ownership is analyzed. If these three trends dominate, we will see more MDM systems deployed in 2015 than ever before.
Happy New Year!