Thrilling our clients with innovation and impact – it’s not just rhetoric. This belief is instrumental for our clients’ success. In 2018 we introduced our Chief Strategists, who provide vision and leadership to help our clients remain competitive. Get to know each of our strategists as they share their unique insights on their areas of expertise.
Big data has significantly impacted today’s leading enterprises, helping them detect patterns, spot consumer trends, and enhance decision making. In fact, the big data and analytics market is estimated to reach $49 billion this year, with a CAGR of 11 percent.
However, big data is often too broad and complex to analyze with traditional data processing software. To get the most out of their data, businesses must deploy a strategy that transforms their data management and analytics practices.
We recently spoke with Bill Busch, Big Data Chief Strategist, to learn more about creating value with big data and developing a data strategy to achieve meaningful results.
Bill Busch: Since joining Perficient, I have acted as an evangelist for big data, machine analytics, resource management, and the business development of those functions. My new role as Chief Strategist is a continuation of those efforts. This role allows me to be a resource for our clients and help them gain value from strategically using their data.
What do you hope to accomplish as a Chief Strategist?
BB: I want to help clients understand how to best use their own data with AI, machine learning, analytics, and cloud to inform their business decisions. To get there, business leaders must transform their data collection platforms in a way that addresses the mistakes and challenges organizations typically experience.
Many of our implementations involve several areas of expertise, which provides an opportunity to collaborate with other Chief Strategists, whether they’re focused on a particular industry or a specific technology.
Collectively, we help clients change processes and train teams to approach their duties differently. By addressing technology, processes, and people, they can take full advantage of [data collection] platforms’ speed and still maintain the quality and visibility of data.
“The breadth and depth of the Chief Strategists’ expertise enables the creation of comprehensive solutions that resolve our clients’ most critical issues.”
Chief Strategists in Action
A recent example of our collaboration involved developing a unified data view using APIs. Historically, application data integration and API management were handled separately when companies sought to integrate data. Now, these two ecosystems are merging into one, which creates the challenge of building a single solution that enables both application integration and the consolidation of disparate data sources. To meet this challenge, we partnered with Erich Roch, Chief Strategist for IT Modernization and Integration, to create comprehensive solutions for our clients.
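As a toy illustration of the unified-data-view pattern described above, a single access layer can merge records from an application database and an external API behind one call. This is a minimal sketch under assumed names; the sources, fields, and functions here are hypothetical stand-ins, not the actual client solution.

```python
# Minimal sketch of a unified data view: one access layer merging
# records from two otherwise separate ecosystems. The two "sources"
# below are hypothetical stand-ins for an application database and
# an externally managed API.

def fetch_app_db(customer_id):
    # Stand-in for a query against the application's own database.
    app_db = {42: {"name": "Acme Corp", "region": "US-East"}}
    return app_db.get(customer_id, {})

def fetch_partner_api(customer_id):
    # Stand-in for a call to an external API.
    api_data = {42: {"credit_tier": "A", "open_orders": 3}}
    return api_data.get(customer_id, {})

def unified_customer_view(customer_id):
    """Merge both sources into a single consolidated record."""
    record = {"customer_id": customer_id}
    record.update(fetch_app_db(customer_id))
    record.update(fetch_partner_api(customer_id))
    return record

print(unified_customer_view(42))
```

The point of the design is that callers see one consolidated record and never need to know how many underlying systems were queried.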
Why does strategy matter for big data?
BB: Big data relies on a series of systems or platforms to gain meaningful insights, such as cloud data warehousing or cloud data lakes. Developing a strategy [around people and processes working with data] is what ultimately helps businesses align decision making and gain value from their data.
Why is it important for businesses to be strategic with their data?
BB: Data is the heartbeat of any modern organization, but having information readily available for decision making doesn’t happen quickly or seamlessly. The most productive strategy for achieving speed-to-market value is funneling your information into a database that can generate high-level insights.
Data strategy also shouldn’t end at the executive level. Businesses set other departments up for success when they make data accessible for broader analytics by following best practices, establishing predefined processes, and creating a strategy to integrate the data.
How does big data affect the ability to remain competitive?
BB: If you don’t have a data strategy in place, then you’re operating reactively and setting up your [business] for displacement. It’s easy for businesses to rely on low-quality data that is readily available to influence their decisions. However, the issue with that approach is the lack of depth of that information. Your enterprise may have the best AI in the world, but those AI models aren’t meaningful if you lack robust data for the technology to analyze.
To shift from a reactive to proactive stance, you should invest in a strategy and a data management infrastructure to manage the information. Investing in both yields deeper insights than surface-level information, provides greater context with existing knowledge, and identifies emerging trends to reveal unknown information.
When helping clients create a strategy, we identify a scalable, analytical use case that could benefit from a data warehouse. Then, we help our clients establish a process with dedicated people and resources to assess existing data and develop ideas about the data they want to collect. From there, we create a road map to bridge the gap, so our clients are well-positioned to make strategic decisions relatively quickly.
Think Like a Chief Strategist
What trend(s) have you observed that influence how businesses manage big data?
BB: The open-source approach to data processing is relatively new. Companies previously used on-premises data lakes for their processing, and then the trend shifted to cloud-driven data lakes. Now, we’re migrating data away from mainframes and high-dollar, appliance-based platforms to open source, cloud-based platforms.
For example, we’re currently working with a major healthcare payer to migrate its costly claims processing from a mainframe to an open source, cloud-based platform. This project will enable our client to do more with its data – at a faster pace – while reducing costs. I think many companies realize the possibilities and are considering this model and similar solutions. It’s a sign of the times and underscores the need for digital transformation.
However, if your organization isn’t quite ready for data processing in the cloud, data lakes are a great intermediate step, allowing you to consolidate your data for the analytical use case. Using a cost-effective enterprise data warehouse (EDW), you can take advantage of the data lake and use the data for operational processing.
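To make the consolidation idea concrete, here is a minimal sketch that uses Python’s built-in sqlite3 as a stand-in for an enterprise data warehouse: raw records landing from multiple source systems are loaded into one table, and a single analytical query runs over the combined data. The source names, schema, and figures are invented for illustration only.

```python
import sqlite3

# Toy "data lake": raw claim records landing from two source systems.
# (Hypothetical data, invented for illustration.)
source_a = [("2024-01", 120.0), ("2024-02", 95.5)]
source_b = [("2024-01", 300.0), ("2024-02", 210.0)]

# Stand-in for an EDW: consolidate everything into one queryable table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (month TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)", source_a + source_b)

# One analytical use case over the consolidated data: spend per month.
rows = conn.execute(
    "SELECT month, SUM(amount) FROM claims GROUP BY month ORDER BY month"
).fetchall()
print(rows)  # [('2024-01', 420.0), ('2024-02', 305.5)]
```

The same consolidated table can then serve more than one use case, which is the multi-use-case point made below.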
While I’m not saying that we’re moving away from data lakes, I think we’re seeing a need to extend data lakes beyond a single use case. By doing so, we’re architecturally enabling multiple areas of a business to take advantage of the data that suits their goals or objectives.
When creating a big data strategy with clients, what are the top considerations?
BB: At the start of any client engagement, we identify the business drivers for a data strategy. Whether the use case is analytical or scientific, the supporting data architecture and data strategy must be grounded in the [business] need. We also ensure the business need isn’t so tactical that it limits future iterations.
Organizational culture is another major consideration when developing a big data strategy. Enabling self-service within the finance industry is a prime example. Industry regulations limit the level of self-service capabilities that a firm can offer, and these limitations can greatly influence a financial organization’s culture and its reception of a data-focused strategy.
Contrast this scenario with the life sciences industry, which is composed of people who are hard-wired to interact with data in a more analytical way. Regardless of the industry or the culture, incorporating an organizational change management component helps ease the transition.
Using these insights, I try to establish a vision with clients. Many of the issues businesses encounter when creating a data strategy and data lake platform aren’t related to the technology; instead, the hard part is overcoming the mindset of the people involved. If you start with a universal outlook, develop agile concepts, and deploy lean processes, a data strategy can inform much of the analysis upfront. The strategy then establishes a data pipeline to begin moving information from the source to its destination in a relatively short timeframe.
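The data pipeline mentioned above can be sketched as three stages wired together: extract from a source, transform the records, and load them into a destination. This is a minimal illustration; the function names and records are hypothetical, not a description of any client system.

```python
# Minimal extract-transform-load sketch: move records from a "source"
# to a "destination" with one cleanup step in between.

def extract():
    # Stand-in for reading raw records from a source system.
    return [{"id": 1, "amount": "120.50"}, {"id": 2, "amount": "95.00"}]

def transform(records):
    # Normalize types so downstream analytics can rely on them.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in records]

def load(records, destination):
    # Stand-in for writing to a warehouse or data lake table.
    destination.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'id': 1, 'amount': 120.5}, {'id': 2, 'amount': 95.0}]
```

Keeping the stages as separate functions is what lets a team swap a source or destination later without rewriting the whole flow.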
Learn more about each of our Chief Strategists by following this series.