Perficient Business Intelligence Solutions Blog


Archive for the ‘Emerging BI Trends’ Category

“Accelerate your Insights” – Indeed!

I have to say, I was very excited today as I listened to Satya Nadella describe the capabilities of the new SQL Server 2014 data platform during the Accelerate your Insights event. My excitement wasn’t piqued by the mechanical wizardry of working with a new DB platform, nor was it driven by a need to be the first to add another version label to my resume. Considering that I manage a national Business Intelligence practice, my excitement was fueled by seeing Microsoft’s dedication to providing a truly ubiquitous analytic platform that addresses the rapidly changing needs of the clients I interact with on a daily basis.

If you’ve followed the BI/DW space for any length of time, you’re surely familiar with the explosion of data, the need for self-service analytics, and perhaps even the power of in-memory computing models. You probably also know that the Microsoft BI platform has several new tools (e.g., PowerPivot and Power View) that run inside Excel while leveraging the latest in-memory technology.

But… to be able to expand your analysis into the Internet of Things (IoT) with a new Azure Intelligent Systems Service and apply new advanced algorithms, all while empowering your ‘data culture’ through new hybrid architectures… that was news to me!

OK, to be fair, part of that last paragraph wasn’t announced during the keynote; it came from meetings I attended earlier this week that I’m not at liberty to discuss. But suffice it to say, I see the vision!

What is the vision? The vision is that every company should consider what their Data Dividend is.


Diagram: Microsoft Data Dividend Formula

Why am I so happy to see this vision stated the way it is? Because for years I’ve evangelized to my clients that they should think of their data as a ‘strategic asset’. And like any asset, given the proper care and feeding, you should expect a return on it! Holy cow and hallelujah, someone is singing my song!! :-)

What does this vision mean for our clients? From a technical standpoint it means the traditional DW, although still useful, is an antiquated model. It means hybrid architectures are our future. It means the modern DW may not be recognizable to those slow to adopt.

From a business standpoint it means that we are one step closer to being constrained only by our imaginations on what we can analyze and how we’ll do it. It means we are one step closer to incorporating ambient intelligence into our analytical platforms.

So, in future posts and an upcoming webinar on the modern DW, let’s imagine…

Qlik leadership – vision, guts and glory… hopefully

“It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is most adaptable to change.” – Supposedly Darwin from ‘Origin of Species’… or NOT

According to the most recent report from Gartner, no single vendor is fully addressing the critical market need for “governed data discovery” – meeting both business users’ requirements for ease of use and enterprises’ IT-driven requirements. So, who will be the most adaptable to change and embrace the challenges of an ever-changing and increasingly demanding BI and Analytics market?

This year, Qlik plans to release a completely re-architected product – QlikView.Next – that will provide a new user experience, known as ‘Natural Analytics’. The lofty goal of QlikView.Next and this Natural Analytics approach is to provide both business-user-oriented and IT-friendly capabilities. According to Gartner, this approach has ‘the potential to make Qlik a differentiated and viable enterprise-standard alternative to the incumbent BI players.’

Will QlikView.Next be able to deliver the combination of business-user and IT capabilities that is currently lacking in the market? Will Qlik be able to reinvent itself with Natural Analytics and deliver the “governed data discovery” solution that the market so desperately needs? Only time will tell; however, Qlik is definitely showing all the traits of a real leader in the BI and Analytics space – once again setting the bar pretty high. The vision and the guts are definitely present and accounted for. Will glory follow? That will depend on execution and delivery.

However, QlikView.Next is more than a year behind its scheduled release… so I guess we’ll have to rely on past behavior for now. Back in 2006, when Qlik carved out its space on Gartner’s Magic Quadrant for Analytics and BI platforms (BusinessWire article), it positioned itself in the ‘Visionaries’ quadrant, and it has been delivering on its vision ever since. For about eight years, Qlik has been delivering on its vision for Business Intelligence, i.e., user-driven BI – Business Discovery. Given this track record, I have reason to believe that Qlik will be able to deliver on its vision once again.

I also believe that leadership is all about having a vision, along with the guts and ability to execute on that vision. That is probably one of the reasons why Gartner came up with quadrants that organize technologies along two dimensions – ‘Completeness of Vision’ and ‘Ability to Execute’. For the past few years, by demonstrating excellence in both, QlikView has worked its way into the Leaders quadrant and secured its position there (GMQ 2014). So, how is Qlik planning to execute on its vision over the next few months – what’s .Next?

Well, there are several features worth mentioning… but we’ll only be able to review a few here, namely:

Read the rest of this post »

Three Big Data Best Practices

One of the benefits of Hadoop is that it can be configured to address a number of diverse business challenges and integrated into a variety of different enterprise information ecosystems. With proper planning, these analytical big data systems have proven to be valuable assets for companies. However, without significant attention to data architecture best practices, this flexibility can result in a crude April Fool’s joke: a system that is difficult to use and expensive to maintain.

At Perficient, we typically recommend a number of best practices for implementing Big Data. Three of these practices are:

  1. Establish and Adhere to Data Standards – A data scientist should be able to easily find the data he/she is seeking without worrying about converting code pages, changing delimiters, or unpacking decimals. Establish a standard, stick to it, and convert the data to the standard encoding and delimiter during the ingestion process.
  2. Implement a Metadata-Configured Framework – Remember when ETL was all hand-coded? Don’t repeat the sins of the past by creating a vast set of point-to-point custom Sqoop and Flume jobs; that will quickly become a support nightmare. If the cost of a COTS ETL tool is prohibitive, build a data ingestion and refining framework from a small number of components that are configured using metadata. The goal is for a new data feed to be added by configuring a few lines of metadata rather than scripting or writing code for each feed (see the sketch after this list).
  3. Organize Your Data – This practice may seem obvious; however, we have seen a number of Hadoop implementations that look like a network file share rather than a standards-driven data environment. Establish a directory structure that accommodates the different flavors of data: incremental data (a.k.a. deltas), consolidated data, transformed data, user data, and data stored in Hive should each live in their own directory structure. Adopt a directory naming convention, then publish the standard so that data scientists and users can find the data they are seeking.
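To make the first two practices (and the directory standard from the third) concrete, here is a minimal Python sketch of a metadata-configured ingestion step. The feed registry, zones, and metadata fields are illustrative assumptions, not a reference implementation:

```python
import csv
import io
from pathlib import Path

# Hypothetical feed metadata: adding a new feed means adding an entry
# here rather than writing a new point-to-point Sqoop/Flume job.
FEEDS = {
    "orders": {
        "source_encoding": "cp1252",   # code page of the incoming file
        "source_delimiter": "|",       # delimiter of the incoming file
        "target_zone": "incremental",  # landing zone per the directory standard
    },
}

# Standard layout: /data/<zone>/<feed>/ (one zone per flavor of data)
DATA_ROOT = Path("/data")
STANDARD_ENCODING = "utf-8"
STANDARD_DELIMITER = ","

def ingest(feed_name: str, raw_bytes: bytes) -> Path:
    """Convert a raw feed to the standard encoding/delimiter and land it
    in the directory dictated by its metadata."""
    meta = FEEDS[feed_name]
    # Decode with the source code page, reparse with the source delimiter.
    text = raw_bytes.decode(meta["source_encoding"])
    rows = csv.reader(io.StringIO(text), delimiter=meta["source_delimiter"])
    # Re-emit in the house standard so data scientists never worry about
    # code pages or delimiters downstream.
    target_dir = DATA_ROOT / meta["target_zone"] / feed_name
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / f"{feed_name}.csv"
    with target.open("w", encoding=STANDARD_ENCODING, newline="") as out:
        csv.writer(out, delimiter=STANDARD_DELIMITER).writerows(rows)
    return target
```

With a framework like this, onboarding a new feed is a matter of adding a few lines of metadata to the registry, not writing and supporting another custom job.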

Addressing these three best practices will ensure that your Big Data environment is usable and maintainable. If you are implementing or considering a Big Data solution, Perficient has the thought leadership, partnerships, and experience to make your Big Data program a success.

QlikView… QlikTech… Qlik…

Several years ago, when I started using QlikView (QlikTech’s flagship product), I had a strong preference for more traditional BI tools and platforms, mostly because I thought that QlikView was just a visualization tool. But after some first-hand experience with the tool, any bias I had quickly dissipated, and I have been a QlikView fan – fulfilling the role of Senior QlikView Architect on full-lifecycle projects – for a while now.

Today, Qlik Technologies (also known as QlikTech or simply Qlik) is the 3rd fastest-growing tech company in the US (according to a Forbes article), but my personal journey with QlikView, and probably QlikTech’s journey as well, has not always been easy – a paradigm shift in the way we look at BI is required. Most importantly, I came to understand, along with many others, that this isn’t a matter of QlikView or SAP BI, of QlikView’s agile approach to BI or traditional BI – it is NOT a matter of ORs, but rather a matter of ANDs.

It is a matter of striking the right balance with the right technology mix and doing what is best for your organization, setting aside personal preferences. At times QlikView may be all that is needed. In other cases, the right technology mix is a must. At times ‘self-service’ and ‘agile’ BI are the answer… and at times they aren’t. Ultimately, it all revolves around the real needs of your organization and creating the right partnerships.

So far, QlikTech has been able to create a pretty healthy ecosystem with many technology partners, from a wide variety of industries and with a global reach. QlikTech has been able to evolve over time and has continued to understand, act on and metabolize the needs of the market, along with the needs of end-users and IT – I wonder what’s next.

That’s one of the reasons why Qlik has been able to trail-blaze a new approach to BI; user-driven BI, i.e. Business Discovery. According to Gartner ‘Qlik’s QlikView product has become a market leader with its capabilities in data discovery, a segment of the BI platform market that it pioneered.’

Gartner defines QlikView as ‘a self-contained BI platform, based on an in-memory associative search engine and a growing set of information access and query connectors, with a set of tightly integrated BI capabilities’. This is a great definition that highlights a few key points of this tool.

In coming blogs, we’ll explore some additional traits of QlikTech and its flagship product QlikView, such as:

• An ecosystem of partnerships – QlikTech has been able to create partnerships with several Technology Partners and set in place a worldwide community of devotees and gurus

• Mobility – QlikView was recently named a ‘Hot Vendor’ for mobile Business Intelligence and ranks highest in customer assurance (see WSJ article here) with one of the best TCO and ROI profiles

• Cloud – QlikView has been selected as a cloud-based solution by several companies, and it has also created strong partnerships with leading Cloud Computing technologies, such as Amazon EC2 and Microsoft Azure

• Security – provided at the document, row, and field levels, as well as at the system level, utilizing industry-standard technologies such as encryption, access control mechanisms, and authentication methods

• Social Business Discovery – co-create, co-author, and share apps in real time, share analysis with bookmarks, and discuss and record observations in context

• Big Data – Qlik has established partnerships with Cloudera and Hortonworks. In addition, according to the Wall Street Journal, QlikView ranks number one among BI and Analytics offerings in Healthcare (see WSJ article here), mostly in connection with healthcare providers seeking “alternatives to traditional software solutions that take too long to solve their Big Data problems”

In future posts, I am going to examine and dissect each of these traits and more! I am also going to make sure we have some reality checks set in place in order to draw the line between fact and fiction.

What other agile BI or visualization topics would you like to read about or what questions do you have? Please leave comments and we’ll get started.

Three Attributes of an Agile BI System

In an earlier blog post I wrote that Agile BI is much more than just applying agile SDLC processes to traditional BI systems. That is, Agile BI systems need to support business agility. To do so, BI systems should address three main attributes:

  1. Usable and Extensible – In a recent TDWI webinar on business enablement, Claudia Imhoff said, “Nothing is more agile than a business user creating their own report.” I could not agree more with Ms. Imhoff’s comment. Actually, I would go further. Today’s BI tools allow users to create and publish all types of BI content, such as dashboards and scorecards. They allow power users to conduct analysis and then storyboard, annotate, and interpret the results. Agile BI systems allow power users to publish content to portals, web browsers, and mobile devices. Finally, Agile BI systems do not confine users to data published in a data warehouse; they allow users to augment IT-published data with “user” data contained in spreadsheets and text files. Read the rest of this post »

Top 5 Best Practices for an Actionable BI Strategy

In an earlier blog post, I pointed out that a number of companies complete a BI Strategy only to shelve it shortly after its completion. One main reason is that companies let their BI Strategy atrophy by not maintaining it; the other main cause is that the BI Strategy was not actionable. That is, it did not result in a roadmap that would be funded and supported by the organization. As you formulate your BI Strategy, there are five best practices that will help produce a BI Strategy that is actionable and supported by your business stakeholders. These best practices are:

  1. Address the Elephants in the Room – Many times, when management consultants are brought in to help with a BI Strategy, their objectivity is needed to resolve one or more disagreements within the organization. For example, the disagreement could concern DW platform selection, the architectural approach for data integration, or the determination of business priorities for the BI program. The BI Strategy needs to resolve these issues, or they will continue to fester within the organization, eventually undermining support for the BI Strategy. Read the rest of this post »

Business Intelligence – Implementation Challenges

Business Intelligence should help organizations improve business outcomes by making informed decisions. The problem is that Business Intelligence is the overarching term applied to the tools, technologies, and best practices that supposedly help organizations make sense of data. Where should you start? What tools should you use? What are the best practices? How do you manage the mass of data flowing into your organization? To which buzzwords should you pay attention? Perficient’s Enterprise Information Solutions group helps organizations determine how to put business and intelligence back into Business Intelligence.

In previous posts, I looked at Business Intelligence and what it means today and at Business Intelligence future trends. In this post, I look at implementation challenges and how they might be addressed. The current state of business intelligence and the future trends present a significant set of challenges to organizations trying to improve and leverage the data they have. Perficient’s Enterprise Information Solutions Company Wide Practice helps organizations do just that. The challenges that organizations face fall into a number of key areas.

In many Business Intelligence environments, the biggest challenge has nothing to do with technology; it lies with the organization itself. How does the organization enable cross-organization collaboration? How are business sponsors recruited and engaged? How is dedicated business representation aligned and maintained? How is focus on the top priorities established and preserved throughout? What is the right team and how is it put in place? Enterprise Information Management (EIM) Strategy Sessions help organizations achieve confidence in the data and information they produce and use it to improve top-line performance. The EIM Strategy Sessions should drive out the following actions in order to prepare for future success.

  • Developing and aligning the vision with the business’ mission, goals and priorities
  • Mapping business drivers to analytical needs and capabilities
  • Identifying opportunities for creating new business value
  • Documenting the common themes of business pain and obstacles, and identifying comprehensive solutions
  • Assessing organizational and technical readiness
  • Generating a business-driven roadmap
  • Clarifying and communicating the strategic direction, and establishing a program sponsor
  • Establishing the initial return on investment justification

Once an organization understands what it wants Business Intelligence to provide, the next challenges become more technical. How should the exploding user base be managed? How should the increasing volumes of data be handled? How should all the new and different types of data be controlled? How should the integration of new data sources be governed? Central to managing these items is a solid, well-planned architectural foundation. Many Business Intelligence environments have evolved organically, but they typically cannot advance beyond a certain point. What is needed to establish a platform that will perform, scale, and adapt to future needs is a Foundational Solution Architecture design. The process to establish this includes:

  • Creating a conceptual model that supports the business needs and objectives, mapping business requirements to architectural capabilities
  • Mapping current capabilities to the conceptual model, identifying gaps to be addressed and filled
  • Specifying a best-practice-based architectural solution that will address the organization’s business needs and requirements
  • Mapping application components onto the platforms and infrastructure systems to create a complete solution architecture
  • Identifying the key entities, tools, and technologies needed for the initial delivery phases and the minimum value set
  • Adding detail to the return-on-investment plans in order to justify moving forward

The strategic objectives and tactical components identified lead to the next hurdles for an organization to face. How should cost be controlled in order to meet the return-on-investment justification? How can business value be created quickly? How can target delivery dates be guaranteed? How should shifting priorities and business needs be managed? The best way to meet these challenges is to establish a formal Business Intelligence Program. The program can leverage centers of excellence for data governance, information delivery, and so on, or it can be fully self-contained and manage the whole process. The Business Intelligence Program should:

  • Stand up program governance by establishing a program governance team, protocols, and charter
  • Build a high-level, iteration-based program plan based on priorities, ease of implementation, and time to value
  • Determine scope, deliverables, and success criteria for each iteration
  • Determine the detailed plan, resource requirements, and costs for the initial and subsequent iterations
  • Communicate the plans and gain approval to move into the implementation phase

Once the key initiatives are identified and plans are in place, it is time for the organization to translate strategy into action. Implementation should follow an organization’s typical implementation cycle, whether that is waterfall, agile, or hybrid. The methodology should be rigorous, planned, and monitored. It should include the following activities:

  • Project management and progress reporting
  • Detailed solution design
  • Database, data warehouse, data mart implementation
  • Data integration, data cleansing, data transformation services
  • Extract, Transform and Load (ETL), Report, and Dashboard development
  • Predictive model development
  • Testing
  • End-user training, coaching, enablement, and support
  • Support-staff training and mentoring

Perficient’s Enterprise Information Solutions Company Wide Practice helps organizations identify and tackle all of the areas discussed above. We apply a business focused, technology agnostic approach to help organizations rapidly realize business value.

The Case for Data Virtualization (aka IaaS)

I remember several years ago when I was working with a company’s CIO. His comment to me was, “Why do I need to move any data or create a data warehouse? Why can’t I just virtualize all of my operational data from its source and use it where it is? That way, I get real-time results and a lot of flexibility when our business requirements change.”

This is a typical statement/question from a business executive, and I’m sure most of you have heard something like it before. Why can’t we just virtualize all of our operational data through integration rules and business logic? We may be able to, but that wouldn’t necessarily be the right answer. The right answer is: it depends on the use case.

Forrester’s view of Data Virtualization (also known as Information-as-a-Service) is: “Data virtualization has many use cases including providing a single version of the truth; enabling real-time business intelligence (BI), enterprise-wide search, or high-performance scalable transaction processing; exposing big-data analytics; federating views across multiple domains; improving security and access; integrating with cloud and partner data and social media; as well as delivering information to mobile apps”.

Data virtualization makes data look like it is physically there when, in reality, it is just code in the form of integration rules, data quality rules, formatting rules, match/merge rules, and so on.
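As a toy illustration of that idea (not any vendor’s actual engine), the Python sketch below exposes a ‘virtual’ customer view over two hypothetical operational sources. Nothing is copied into a warehouse; the integration and formatting rules run only when the view is queried:

```python
import sqlite3

# Two hypothetical operational sources, left where they live.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, ' acme corp ')")

billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.execute("INSERT INTO invoices VALUES (1, 1200.0)")

def customer_360(customer_id: int) -> dict:
    """A 'virtual' view: integration, data-quality, and formatting rules
    expressed as code, evaluated at query time against the live sources."""
    name = crm.execute(
        "SELECT name FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()[0]
    total = billing.execute(
        "SELECT SUM(amount) FROM invoices WHERE customer_id = ?",
        (customer_id,),
    ).fetchone()[0]
    return {
        "customer_id": customer_id,
        "name": name.strip().title(),   # data-quality/formatting rule
        "lifetime_billing": total,      # integration rule (cross-source join)
    }

print(customer_360(1))  # {'customer_id': 1, 'name': 'Acme Corp', 'lifetime_billing': 1200.0}
```

The point of the sketch is that the “single version of the truth” lives in the rules, not in a persisted copy of the data – which is exactly why those same rules can later be reused to populate a physical structure, as discussed below.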

Here are some examples where virtualizing data may be warranted:

  • Business needs to instantly combine & access fresh & accurate data
  • Complete view of the business is required on-demand
  • Business needs to be agile to innovate
  • High-value operational reporting is critical
  • Complex and heterogeneous IT environment
  • Data Mart proliferation

According to Forrester’s “Wave” for Data Virtualization, Q1 2012, Informatica is the leading provider of data virtualization solutions. Its solution is called Data Services, and it was built around the same concepts as its data integration solution (PowerCenter) and its data quality solution.

What differentiates Informatica’s solution from some of the others is that data integration rules and data quality rules can be built in its Data Services (virtualization) solution and then reused in its data integration solution (PowerCenter).

What this means is that a company can leverage the benefits of data virtualization for rapid development and deployment of a virtual data integration solution, and if or when the solution requires instantiation of a physical data structure (e.g., a data warehouse), this can be done without a code rewrite. The rules developed for data virtualization are simply reused to populate the physical data structure.

Informatica Data Services provides a single environment for data integration and data federation along with role-based tools that share common metadata. It allows analysts to access and merge data directly across systems and to collaborate with IT to create sophisticated business rules that leverage the data profiling, complex transformations, data quality, and data masking capabilities of the Informatica platform. With Informatica Data Services, a company benefits from a single scalable architecture for both data integration and data federation, creating a data virtualization layer that hides and handles the complexity of accessing underlying data sources—all while insulating them from change. As a result, analysts get the data they need and trust while IT retains control of the process. IT can deploy data services that can be instantly reused for all applications without rework.

Informatica Data Services offers on-the-fly data quality and profiling, a model-driven approach to provisioning data services, performance enhancements, cloud integration, common metadata, and role-specific tools.

Figure: Forrester Wave for Data Virtualization, Q1 2012

What is Agile Business Intelligence?

Agile Business Intelligence is a term that is thrown around a lot. It is interesting to ask people, “What is Agile BI?” Since I work mostly with technology people, I generally get an answer like “Agile BI is the application of agile SDLC methodologies, like Scrum, to deliver BI capabilities to business stakeholders.” To that answer I usually ask a follow-up question:

Do you really think that when your company is not making its sales numbers and a business analyst sits down to answer the “Why” question, he really cares whether a Scrum process was used to deliver his BI application?

It does not take long for the light bulb to come on. The reality is that most users care about the flexibility of the BI system, not how the system was delivered. The reason is simple: a flexible, responsive BI system makes an analyst’s job easier. The value of an agile SDLC methodology is that it reduces risk in the delivery process. Once a system is delivered, that reduction of risk has already been realized; it is of little value to the business user trying to use the system to answer a critical business question.

More questions. In your current BI environment, what is the process to get a million-row table added to your data warehouse? Do you require a project? Does this project have to go through a funding mechanism, business case development, and then the typical requirements-driven delivery process (either agile or waterfall) that includes conceptual, logical, and physical modeling, ETL, semantic layer design, and unit, integration, system, and user acceptance testing? Yikes, what is agile about that?

It is no wonder data discovery tools like QlikView, Tableau, and Spotfire (to name a few) are lighting it up in the BI marketplace. These tools all allow users to take large tables (in spreadsheets, Microsoft Access, staging areas, or flat files) and integrate them with IT-“published” data. Once the data is integrated, they can perform the analysis to answer the “Why” question. No extended funding cycles. No extended delivery lifecycles. No IT involvement. This “agility” is what your users desire and require to enable an agile company. Time is money, and these answers cannot wait for a lengthy funding and delivery cycle. In other words, if your bonus is predicated on your company making its sales numbers, do you want your business users to have to go through an extended process to answer the “Why” question and maximize sales?
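To see what that agility looks like in practice, here is a minimal sketch, using pandas and made-up file names, of the kind of ad-hoc blend a discovery tool performs in minutes rather than in a project cycle:

```python
import pandas as pd

# IT-"published" data, e.g. an extract from the warehouse (hypothetical file).
sales = pd.read_csv("published_sales.csv")        # columns: region, product, revenue

# "User" data the analyst keeps in a spreadsheet (hypothetical file).
targets = pd.read_excel("regional_targets.xlsx")  # columns: region, target

# Blend them and answer the "Why are we missing our numbers?" question.
blended = sales.groupby("region", as_index=False)["revenue"].sum().merge(
    targets, on="region"
)
blended["gap"] = blended["target"] - blended["revenue"]
print(blended.sort_values("gap", ascending=False))  # worst regions first
```

No project, no funding cycle, no modeling phase – the analyst joins a spreadsheet to published data and has an answer the same afternoon. That is the agility the discovery tools are selling.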

Business Intelligence – Future Trends

Business Intelligence should help organizations improve business outcomes by making informed decisions. The problem is that Business Intelligence is the overarching term applied to the tools, technologies, and best practices that supposedly help organizations make sense of data. Where should you start? What tools should you use? What are the best practices? How do you manage the mass of data flowing into your organization? To which buzzwords should you pay attention? Perficient’s Enterprise Information Solutions group helps organizations determine how to put business and intelligence back into Business Intelligence.

In a previous post, I looked at Business Intelligence and what it means today. In this post, I examine some of the trends we are seeing today.

Many of the complaints that echo around the halls of Business Intelligence relate to the lack of agility and responsiveness of IT-driven implementations. To address this, users will increasingly gravitate toward Business Intelligence tools that allow Data Discovery. These tools impose no hard-and-fast, rigid data sources and structures. They allow the end user to quickly plug in, model, and analyze new data sources while still leveraging enterprise metadata and data.

As the Internet has grown, so have users’ expectations of simple mechanisms for finding information. Search-based business intelligence tools will become the norm, allowing users to bring together both structured and unstructured data using search terms. Search-based business intelligence tools have a “Google-like” interface that lets users explore data with little formal training; they gather data from disparate sources with little need for a previously constructed semantic layer; and they use RAM and specialized indexing to improve query performance. The user interfaces of these tools use text and natural language to help users find the information they need.
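As a rough illustration of that indexing idea (a deliberately tiny stand-in for a real search engine), the sketch below builds an in-memory inverted index over some hypothetical report titles and answers a multi-term query against it:

```python
from collections import defaultdict

# Hypothetical report metadata a search-based BI tool might index.
documents = {
    1: "quarterly revenue by region",
    2: "revenue forecast for europe",
    3: "customer churn by region",
}

# Build an in-memory inverted index: term -> set of document ids.
index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query: str) -> set:
    """Return documents containing every query term (AND semantics)."""
    results = [index[term] for term in query.lower().split()]
    return set.intersection(*results) if results else set()

print(search("revenue region"))  # {1}
```

Because the whole index lives in RAM, a query is a handful of set intersections rather than a scan of the underlying sources – which is why these tools feel instantaneous even with little formal modeling up front.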

Another consumer trend that will drive Business Intelligence forward is social media and collaboration. Collaborative Business Intelligence allows users to find, discuss, and rank the data, reports, and analyses that they find the most useful. These tools have simple portal-based interfaces and rely on the search capabilities previously discussed. They provide ranking mechanisms and recommendations based on a user’s previous consumption, “likes”, and profile. They allow users to share information, ask for feedback, provide commentary, and be notified based on the preferences they set.

The previous three trends all point towards Business Intelligence’s Holy Grail, Self-service Business Intelligence. IT will continue to provide the platforms and integration components for Business Intelligence. IT will also be heavily involved in the mass distribution of standardized information. Business users and analysts will become explorers and “data scientists” looking for the insights Business Intelligence can provide.

Taking the current real-time monitoring of dashboarding into the future is Operational Intelligence. Complex event processing is used to combine data from multiple sources to identify events or patterns. These are fed through Operational Intelligence applications, which analyze and respond to them in near real-time. Operational Intelligence allows organizations to identify anomalies, opportunities, and threats, initiating the “best next action” for optimal business impact. It also allows organizations to enable Pervasive Business Intelligence with Context Awareness: front-line employees are given Business Intelligence that is directly connected to the applications they are using, without the need to formulate queries or request information.

Organizations will continue to adopt and deploy Mobile Business Intelligence. The latest tools will allow them to move beyond the delivery of simple descriptive Business Intelligence and on to full interaction. Users will be able to explore the data on their devices, drilling up and down and slicing and dicing. Mobile Business Intelligence will allow remote users, and those away from their desks, to access information wherever they are and make decisions immediately. The debate will continue as to whether to deliver native or web-based applications. No matter what, interactivity is the true key to Mobile Business Intelligence.

As it always has, Business Intelligence will depend on the consumption of data. This includes the consumption of Big Data. No matter how many “V”s come to define Big Data, the initial three – volume, variety, and velocity – will always play a part. The standard tools will continue to be adapted to ingest data from the likes of Hadoop, Cassandra, and BigQuery. Business Intelligence solutions that do not include this capability will quickly be superseded by those that do.

Enterprises will adopt Cloud Business Intelligence in its many forms. Some organizations will adopt the Software-as-a-Service model simply using the provider’s Business intelligence applications running in the cloud. Other organizations will adopt the Platform-as-a-Service model leveraging the provider’s platform to build their own Business Intelligence applications. Yet other organizations will leverage the Infrastructure-as-a-Service model, deploying their own platforms and applications on top of the hosting infrastructure. Initially, the focus will be on consuming cloud based data sources and rapid deployment of development environments but as comfort and security protocols improve, more internal data and critical applications will be moved to the cloud.

Analytic Appliances will feature heavily in future Business Intelligence implementations. Analytic Appliances bring together multiple tools and technologies into a single, highly integrated and optimized machine. They provide users with transparent access to multiple data sources – including historical data warehouses, real-time operational databases, and Big Data sources – allowing them to perform in-depth analysis. Often, Analytic Appliances offer pre-packaged, ready-to-run analytical functions such as digital marketing optimization, social network analysis, fraud detection, and financial analysis.

Prescriptive Analytics moves Predictive Analytics into the future. It helps organizations decide the best action to take based on the current situation, the business’s requirements and goals, and any constraints that exist. Prescriptive Analytics takes in structured and unstructured data and uses business rules along with mathematical and computational models to predict what lies ahead and prescribe how to take advantage of it without compromising other objectives. Prescriptive Analytics continuously and automatically tries to anticipate the what, when, and why of unknown future events.
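As a toy example of that predict-then-prescribe pattern (with entirely made-up numbers and a stand-in for a real predictive model), the sketch below prescribes the candidate price that maximizes expected revenue subject to a simple business rule:

```python
# A toy prescriptive step: given a predictive model's demand forecasts for a
# set of candidate prices, prescribe the price that maximizes expected revenue
# subject to a business-rule constraint (all numbers are illustrative).

candidate_prices = [9.99, 12.99, 14.99, 19.99]

def predicted_demand(price: float) -> float:
    """Stand-in for a predictive model: demand falls as price rises."""
    return 1000 - 40 * price

def allowed(price: float) -> bool:
    """Business rule: never price below the hypothetical 10.00 floor."""
    return price >= 10.00

best = max(
    (p for p in candidate_prices if allowed(p)),
    key=lambda p: p * predicted_demand(p),  # expected revenue
)
print(f"Prescribed price: {best}")  # 12.99 under these made-up numbers
```

The prediction says what lies ahead; the constraint and the objective turn that prediction into a recommended action, which is the essential difference between predictive and prescriptive analytics.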

I will close out this trilogy with another post on how Perficient is helping organizations achieve these Business Intelligence objectives.