Shiv Bharti, Author at Perficient Blogs
https://blogs.perficient.com/author/sbharti/

Oracle Simplifies Branding and Packaging of its Analytics Product
https://blogs.perficient.com/2019/06/26/oracle-simplifies-branding-and-packaging-of-its-analytics-product/
Wed, 26 Jun 2019

Oracle is simplifying its cloud, on-premises, and prebuilt analytics applications under a single brand, ‘Oracle Analytics’. At its analytics summit this week, Oracle announced a new customer-centric vision for its analytics products. Under the new product direction, more than 18 products, including Oracle Analytics Cloud, on-premises platforms such as OBIEE, and Oracle BI Applications, will be consolidated under the single ‘Oracle Analytics’ brand.

There are essentially only three products for customers to choose from:

  1. Oracle Analytics Cloud – For customers considering a cloud analytics platform and looking to leverage AI-powered, self-service analytics functionality for data preparation, visualization, enterprise reporting, augmented analysis, and natural language processing/generation, Oracle Analytics Cloud is the platform to choose. Autonomous Data Warehouse, along with the Oracle Data Integration platform, allows customers to build a standalone analytics platform as well as a pop-up data mart for business users.
  2. Oracle Analytics Server – Oracle recognizes that several on-premises Oracle Business Intelligence customers on OBIEE are not considering migrating to the cloud but would still like to leverage augmented analytics capabilities; Oracle Analytics Server meets these requirements. Those customers will now be able to take advantage of augmented analytics and world-class data discovery capabilities as part of Oracle Analytics Server, with no additional cost for the upgrade.
  3. Oracle Analytics for Applications – Oracle BI Applications, the prebuilt analytics suite for Oracle ERP customers, was adopted by a large number of Oracle ERP customers and was a very successful analytics product for Oracle. Building on that success, as part of Oracle Analytics for Applications, Oracle will roll out personalized business application analytics, including benchmarks and machine learning-driven predictive insights. All line-of-business users gain real-time access to prebuilt, auto-generated business content and insights, starting with solutions for Fusion ERP and then expanding to Fusion HCM, SCM, CX, and NetSuite.

With the new pricing strategy, there is no longer a barrier for customers considering the Oracle Analytics platform. Customers will have the option to choose pricing by user (Professional) or by OCPU (Enterprise). Both prices are competitive with both low-cost and leading analytics vendors. I believe the new strategy will allow Oracle not only to retain its existing OBIEE and Oracle BI Applications customer base, but also to take some share from other leading vendors. The simpler product range under this new direction will make it easier for customers to choose and will likely lead to more adoption and sales for Oracle Analytics.

Welcome to the Period of Augmented Analytics
https://blogs.perficient.com/2019/02/26/welcome-to-the-period-of-augmented-analytics/
Tue, 26 Feb 2019

According to Gartner, augmented analytics is the future of data and analytics – the next disruption in the analytics and business intelligence space. It is the capability of automating insights using machine learning and natural-language generation. If we look back over the last few decades, there have been three distinct waves in the analytics and business intelligence space.

Analytics Industry Waves

The first wave of disruption was 25-30 years ago, when organizations were building centralized data warehouse platforms. At that time, analytics platforms were expected to provide the ability to access predefined dashboards and reports via a common semantic layer, ensuring a single source of truth for all the reports and metrics across the enterprise. A data model was designed to fulfill a well-outlined set of business requirements. Similarly, data consolidation largely involved pulling data from ERP and CRM platforms, mostly relational table structures. It was very time consuming and involved a lot of manual effort even to automate the data consolidation and cleansing process.

During this period, analysis meant looking back at what had already happened, and biases about what we should look at and how we should analyze our data were the norm, since users had to outline exactly what they needed to see before a project even started.

Also, this approach was very IT-centric and required a long turnaround to fulfill any new insights requested by business users, making it difficult to support timely decisions.

Then came the second wave of disruption, about a decade ago. The gold rush among various players in the market was to fill the void within the business user community: the need to empower business users with more self-service and ad-hoc analytics capabilities.

These tools empowered users to create their own visualizations, mash up data, and build their own reports by connecting to various sources, all without the need to learn any kind of coding. Essentially, they helped democratize self-service analytics.

There was one challenge, though, that continued to grow: the complexity of data structures. Data complexity grew exponentially in both variety and velocity, which meant these tools could not truly empower all business users.

Which brings us to the third wave of disruption: the period of augmented analytics. This is the period we are in now and will be in for the next decade. In this wave, we rely on machines and algorithms to do the job for us.

  • The job of understanding which data sets to connect to and to automatically establish the connection.
  • The job of preparing the data for us without any manual intervention.
  • The job of scaling the environment to support the performance needs so it can complete its own tasks.
  • Most importantly, the job of finding the insights at the speed that will have an impact on business.

Data Visualization Does Not Equal Analytics

Data visualization alone is not analytics. While it does empower users, you need more depth beyond that. With so much data today, there are far too many variables and relationships between them to process manually and extract insights. With digital transformations underway, the speed, volume, and variety of data is not getting any smaller; it continues to grow exponentially. Modern business analytics platforms need to understand these growing data complexities and simplify the process of getting timely insights into the hands of decision makers.

Let’s explore a very simple HR use case: finding insights into employee attrition. We need capabilities not only to build visualizations that tell a data story, but also for the machine to find correlations and tell us where clusters exist in employee profiles that we are not thinking about but that hold powerful insights. For example, with Oracle Analytics Cloud, you can certainly analyze all your workforce profile data, but you can also ask the platform to ‘Explain Attrition’, and it automates the process of connecting all your HR data, finding relationships, and discovering powerful insights into the employee attributes and behaviors that directly impact attrition, all without writing a single line of code. Take it even further and use the tool to make predictions based on historical data and all these powerful insights with lightning speed. This goes beyond visualization and has much more depth.
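To make the idea concrete, here is a minimal sketch of what “finding the correlation” with attrition looks like under the hood. This is not OAC itself (which requires no code); the column names and data are hypothetical.

```python
# Hypothetical workforce data; in OAC, "Explain Attrition" automates this
# kind of analysis across all attributes without any code.
import pandas as pd

employees = pd.DataFrame({
    "years_at_company": [1, 2, 8, 1, 10, 3, 1, 12],
    "overtime_hours":   [30, 25, 5, 28, 2, 10, 32, 4],
    "attrition":        [1, 1, 0, 1, 0, 0, 1, 0],  # 1 = employee left
})

# Correlate each attribute with attrition to surface the strongest drivers
correlations = employees.corr()["attrition"].drop("attrition")
print(correlations.sort_values(ascending=False))
```

In this toy data, heavy overtime correlates positively with attrition and tenure correlates negatively; an augmented analytics platform surfaces these relationships, and clusters of at-risk profiles, automatically across hundreds of attributes.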

Augmented Analytics, one of the key strengths of Oracle Analytics Cloud (OAC)

Oracle is leading the wave of augmented analytics by democratizing machine learning for everyone, not just data scientists or programmers. With powerful machine learning, natural language query, and autonomous data warehouse capabilities, Oracle Analytics Cloud can do the job of finding powerful insights for you and your business. In Gartner’s 2019 Magic Quadrant for Analytics and Business Intelligence Platforms, Oracle Analytics Cloud was cited as an innovator in augmented analytics, which encompasses machine learning, AI technologies, natural language processing, and autonomous data management. Other strengths of Oracle Analytics are its cloud presence, with optimizations and integrations to Oracle enterprise applications, and its product vision around augmented analytics.

With this platform, you can continue to get a single source of truth for static dashboards and reports. You can continue to empower your users with self-service analytics and data visualization, while simplifying data consolidation by leveraging machine learning to automate data preparation and insight generation.

Many users are not aware of what’s possible with OAC, or of how to architect and adopt this new platform. This is clear from the OAC workshops we have been running, as well as the assessments and strategy engagements we are driving right now. We are seeing very positive feedback, and adoption is growing as users become aware of these augmented analytics capabilities around machine learning, data preparation, and data visualization on OAC.

With Oracle Analytics Cloud and its autonomous data warehouse capabilities, you can today, not tomorrow, not in the future, but right now, leverage the powerful machine learning capabilities of automated data preparation and insight generation, and use natural language to interact with the platform and get valuable and timely insights.

Reflections on Hosting Multiple Oracle Analytics Cloud Workshops
https://blogs.perficient.com/2019/02/07/reflections-on-hosting-multiple-oracle-analytics-cloud-workshops/
Thu, 07 Feb 2019

Our team has been partnering with Oracle to run Oracle Analytics Cloud (OAC) workshops across North America. To date we have partnered on more than 10 workshops; the most recent was held yesterday in Houston, TX. We had a record number of attendees, more than double the capacity we had planned for. I would like to give a big shout-out to both the Perficient and Oracle teams, including Alliance, Marketing, and Sales Consultants, for pulling off these very successful workshops. Also, a big thank you to our customers for carving out time to attend these four-hour sessions.

These workshops allow our customers, both business and IT, to test-drive the powerful capabilities of a next-generation business analytics platform, from building cool visualizations and mashing up data from multiple sources to using machine learning to cleanse and structure data and build predictive models. We also explore building Essbase cubes from spreadsheets and using them as a source to mash up and build very interactive and dynamic visualizations. There are always ‘aha’ and ‘wow’ moments as attendees explore the new capabilities and functionality of this powerful cloud analytics platform.

In this post, I would like to reflect on my conversations with customers, commonly asked questions, and share feedback from our customers on the capabilities they think will have an immediate impact in their organization.

Oracle Analytics can do that?

Common across all the workshops, I get the same feedback from business users who have long wanted self-service analysis capabilities: “I didn’t know that we could do this with Oracle Analytics!” Business users take what IT has created for them, structured data, and mash it up with other sources such as flat files to get the insights they need to make timely decisions. Exploring other technologies to meet user needs has led some organizations to own multiple siloed business intelligence platforms. The fact that they can use OAC not only to access all the existing content in their production Oracle Business Intelligence (OBI) environment, but also to empower business users with visualization capabilities, is a key feature most users believe would deliver immediate value within their organization.

How can you migrate from Oracle Business Intelligence to OAC?

The second most commonly asked question is how a customer with an existing OBI platform can migrate to OAC. This can be done in phases. First, immediately rolling out Oracle Data Visualization is a quick win; business users can start building ad-hoc analyses. Next, the existing on-premises data warehouse can be accessed by OAC, and at this stage we would also migrate on-premises OBIEE metadata, dashboards, and reports to OAC. Third, we would migrate the on-premises data warehouse and ETL process to cloud platforms. There are various approaches customers have taken to migrate from on-premises OBI platforms to OAC.

What about Machine Learning and OAC?

Machine learning capabilities in OAC are among the top three questions asked. In the workshop, users take employee workforce profile data, including each employee’s tenure with the company, and use the historical data to understand what correlates with attrition simply by asking OAC to explain ‘Attrition’. Additionally, we take the historical data to build predictive models and use them to predict the typical attributes of employees who are likely to leave the organization. We discussed a number of use cases in the workshop around customer segmentation, predictive maintenance, and operational efficiency where this capability could have an immediate impact in the organization.
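As a rough illustration of the predictive-model step, the following sketch trains a classifier on historical profiles and scores a new one. The features and data are made up for illustration; this is not the workshop’s actual dataset or OAC’s internal method.

```python
# Hedged sketch: predict attrition risk from historical workforce profiles.
# Features [years_at_company, overtime_hours_per_month] are hypothetical.
from sklearn.linear_model import LogisticRegression

X = [[1, 30], [2, 25], [8, 5], [1, 28], [10, 2], [3, 10], [1, 32], [12, 4]]
y = [1, 1, 0, 1, 0, 0, 1, 0]  # 1 = employee left

model = LogisticRegression().fit(X, y)

# Score a new hire with low tenure and heavy overtime
new_profile = [[1, 35]]
print(model.predict(new_profile))        # predicted class
print(model.predict_proba(new_profile))  # class probabilities
```

A profile with low tenure and heavy overtime scores as high attrition risk, mirroring the pattern in the historical data; OAC builds comparable models without code.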

Who is using OAC today?

Attendees are always interested in learning about customers who are using these capabilities. In the workshop, we discuss eight different case studies where customers have adopted Oracle Analytics Cloud to build a modern business analytics platform, ranging from Data Visualization success stories to enterprise analytics built on OAC. A number of the success stories involve customers using OAC to integrate Oracle Planning and Budgeting Cloud (PBCS), a data warehouse, and Enterprise Resource Planning (ERP) into a single reporting platform for executive dashboards and mobile analytics. Users come to a single reporting platform to get all the insights they need to make critical business decisions, such as a consolidated view of actuals, budget, and forecast data, and can drill down from a summary report to detailed reports showing the underlying ERP transactions.

Closing thoughts

Overall, I believe the OAC workshops enable our customers to see how they can leverage Oracle Data Visualization, with its numerous visualization types (heat map, donut, tag cloud, bar, circle pack, maps), to get insights by telling a powerful story. One exercise mashes up data from three different sources to get to the bottom of an issue, all in 20 minutes. Next, users test-drive data preparation and machine learning capabilities in a workshop exercise, using them to clean and structure data, view correlations, and build predictive models. There is also an exercise where users take a spreadsheet and build an Essbase cube, then use it as a source for visually appealing and insightful visualizations. The feedback has been very positive and the conversations thought-provoking.

We look forward to running more Oracle Analytics Cloud hands-on workshops in partnership with Oracle. If you have not attended one, I strongly recommend registering for the next OAC workshop coming to your city. I look forward to having more thoughtful and engaging conversations with you.

Key Differences between a Traditional Data Warehouse and Big Data
https://blogs.perficient.com/2018/09/17/key-differences-between-a-traditional-data-warehouse-and-big-data/
Mon, 17 Sep 2018

Traditional data warehouse solutions were originally developed out of necessity. In order to run the business, every company uses enterprise resource planning (ERP) and CRM applications to manage back-office functions like finance, accounts payable, accounts receivable, general ledger, and supply chain, as well as front-office functions like sales, service, and call center. The data captured from these traditional data sources is stored in relational databases comprised of tables with rows and columns and is known as structured data. These databases are optimized for online transaction processing (OLTP) and are not easily queried for ad-hoc reporting and analysis.

So how do you make the data gathered more useful? Microsoft Excel! While Excel can be a useful tool, there are limitations and problems with the freshness, consistency, and integrity in using Excel to perform analysis. That’s where business intelligence comes into play. Gartner defines business intelligence as “an umbrella term that includes the applications, infrastructure and tools, and best practices that enable access to and analysis of information to improve and optimize decisions and performance.”[1]

The traditional approach to providing business intelligence on the data collected from business applications involves extracting the data from the transactional systems and moving it into a data warehouse which is optimized for reporting, not transaction processing. This process begins with data consolidation tools like Informatica or Oracle Data Integrator. These tools extract the data from the relational database or source system, transform it into a useable format for querying and analysis, and then load it into a final target database such as an operational data store, data mart, or data warehouse. These tools, commonly referred to as ETL (Extract, Transform and Load) tools, allow organizations to move and transform the data to build very complex enterprise data warehouse platforms.

Once the data is in the data warehouse, data rendering tools, with prebuilt dashboards and reports for users to access, pull data to provide insights into business performance for true data-driven decisions. Some reporting tools allow power users to build their own ad-hoc reports as well as various visualizations.

While a tabular report can prove useful for a sophisticated user who wants to review all the detail, less detail-oriented users may benefit from a presentation of the data in a more visually stimulating manner that contrasts the data using sizes, shapes, colors, and position to indicate relative values and potentially, make the data more meaningful.

Although both representations of traditional data warehouse content are information rich, neither version addresses the changing variety of data that organizations are accumulating to support their eCommerce or social platforms. While the path to building a data warehouse for the structured data coming out of source systems such as ERP and CRM is clear, organizations must look at other technologies to be able to provide business intelligence on the data that is not stored on relational table sources.

What is Big Data?

Big data refers to the modern architecture and approach to building a business analytics solution designed to address today’s diverse data sources and data management challenges. With the exponential rate of growth in data volume and data types, traditional data warehouse architecture cannot solve today’s business analytics problems. With a big data architecture, you can perform business analytics on large volumes of data stored in different applications, whether structured in relational tables or unstructured in files. The most important and complex part of a big data initiative is deciding which business problems you can solve today to help your organization increase revenue or reduce costs and inefficiencies.

Multi-Structured Data

Taking a step away from traditional, transactional data sources, you will find multi-structured data sources. A common example of a multi-structured data source is online commerce. The sheer volume of data created by customers through online interactions is staggering. Think of eBay and your shopping behavior. Those personal recommendations that eBay displays for you are directly related to your search and purchase history on its site. Think about Priceline and your search pattern for a trip. Priceline makes recommendations based on your viewing history. Your online search behavior is being watched and tracked and is extremely valuable to retailers. All of this information is stored in a web log and could also include a combination of images and video logs. These multi-structured data types require a different approach to storage, cleansing, and analysis.
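To see why web logs need a different approach than relational tables, here is a sketch of pulling structure out of a single log entry. The log line and field names are illustrative, in the style of the common Apache combined-log format.

```python
# Hedged sketch: extracting structured fields from a raw clickstream log line.
import re

log_line = '203.0.113.5 - - [17/Sep/2018:12:24:11 +0000] "GET /product/123 HTTP/1.1" 200 5120'

pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d+) (?P<bytes>\d+)'
)

event = pattern.match(log_line).groupdict()
print(event["path"], event["status"])
```

Multiply this by billions of lines per day, plus embedded images and video logs, and the need for dedicated storage, cleansing, and analysis infrastructure becomes clear.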

Unstructured Data

While commerce is a great example of multi-structured data and its inherent challenges, unstructured data fits even less into the traditional BI data warehouse model.

A prime example is the data resulting from our interactions on social media, like Twitter and Facebook. Comments, likes, and trending hashtags are all different forms of unstructured data that are growing every day. When you add to this machine and sensor data, log files created by servers, and other data points captured by the Internet of Things (IoT), the scope of unstructured data available to analyze is mind boggling. These types of data are not stored in traditional databases. In fact, they are different file types altogether.

Data stored in the web, weather data, research data, and consumer data created by market research firms like Nielsen and IRI are all examples of unstructured data. Combining these data sets together can be a very powerful tool to perform predictive analytics.

The variety and volume of data that the C-suite is challenged to manage call for a different approach to storing, cleansing, and processing the data. The end goal of performing real-time analytics for data-driven decisions demands a new way of thinking. Big data is the modern approach to storing petabytes, exabytes, and, very soon, zettabytes of data.

If your unstructured data is growing exponentially, you need big data platforms to support your organization’s analytics needs.

[1] Gartner

Approaches to Embrace Big Data
https://blogs.perficient.com/2018/09/13/approaches-to-embrace-big-data/
Thu, 13 Sep 2018

Not every organization starts its big data journey from the same place. Some have robust business intelligence functions and capabilities, while others are doing great things with Excel. However, to drive efficiencies, support expected future growth, and continue evolving into a data-driven company, most organizations are reviewing their current suite of software solutions and platforms, documenting processes and areas of improvement, and devising and executing a strategy to deploy modern business intelligence capabilities. Here are three ways organizations can leverage a big data platform to become a more data-driven company that empowers business users to make fact-based decisions.

  1. Baby Steps from EDW to Big Data

Organizations that already have an enterprise data warehouse (EDW) built to provide insights into their structured data are expanding this platform to incorporate unstructured data, which is stored using open source software. It’s entirely feasible to use open source software to set up a big data platform. There are also vendors like Oracle and IBM who offer both hardware and software to set up a big data platform.

  2. EDW on Big Data

Companies that did not have an EDW in the past and are looking to build a modern business intelligence platform are considering a big data platform to reduce software costs and lower their total cost of ownership (TCO) to build an EDW platform.

In a traditional data warehouse, all the required data must be structured before being stored in a relational database; this is called “schema-on-write.” This approach significantly increases storage costs and adds the cost of designing data models to store the information. The ETL tools required to move and store such data add licensing costs. Since a big data platform stores data in a data lake and creates schema-on-read, using ML algorithms instead of manual effort, it significantly reduces the cost to store and structure data, thereby reducing the TCO of building an EDW.
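The schema-on-write versus schema-on-read distinction can be sketched in a few lines. Here, hypothetical raw JSON events land in a “data lake” with no predefined schema, and the schema is derived only when the data is read for analysis.

```python
# Hedged sketch of schema-on-read: no table design happens before storage;
# structure is inferred at query time from whatever fields appear.
import json

# Raw events as they might land in a data lake, with inconsistent fields
raw_events = [
    '{"user": "a", "action": "click", "page": "/home"}',
    '{"user": "b", "action": "purchase", "amount": 49.99}',
]

records = [json.loads(line) for line in raw_events]

# The schema is derived at read time, not enforced at write time
schema = sorted({key for record in records for key in record})
print(schema)
```

Contrast this with schema-on-write, where both events would have to fit one predesigned relational table before they could be stored at all.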

  3. Big Data Cloud Platforms

Cloud-based big data projects offer the opportunity to start a big data initiative, without the upfront capital investment and with the benefit of support from a cloud partner. Google, Amazon, Oracle, and Microsoft offer various cloud services for organizations to validate their proof-of-value ideas by leveraging big data cloud infrastructure.

Regardless of the approach taken, one thing is abundantly clear: big data is not going anywhere, and AI and ML will only increase the need for a big data solution. Every enterprise, and particularly the C-suite, needs to understand big data and how it can add value to the organization in order to drive change and lead projects from the top down.  Each of the approaches outlined above can be incorporated into your organization’s big data roadmap, the development of which should be every organization’s first step.


Creating a Big Data Platform Roadmap
https://blogs.perficient.com/2018/09/07/creating-a-big-data-platform-roadmap/
Fri, 07 Sep 2018

One of the most frequently asked questions by our customers is the roadmap to deploying a Big Data Platform and becoming a truly data-driven enterprise. Although the roadmap is different from customer to customer, at a high level, I would break it down into three phases:

Phase – 1 Building the Foundation

Just as you can’t build a house without a foundation, you can’t start down a big data path without first establishing the groundwork for success. There are several key steps to prepare the organization to realize the benefits of a big data solution with both structured and unstructured data. The first step is to start with the structured data. To build the foundation, you need to do the following:

  • Acquire and organize structured data into a high-performance and scalable data warehouse platform
  • Consolidate structured data to provide an integrated and consistent enterprise-wide view of the business
  • Shorten the time necessary to analyze structured data, and decrease the number of participants involved in processing this information
  • Build a common enterprise information model (semantic model), which can also be accessed via an open API, making it available to any delivery channel, thus providing a single source of truth
  • Define KPIs and metrics by subject area (data mart) to allow users the ability to create complex ad-hoc queries
  • Set up a scalable security infrastructure to automate authentication (access) and authorization (visibility) process
  • Provide training and workshops to facilitate user adoption
  • Build a business intelligence support team
  • Build a business intelligence competency center


Phase – 2 Setting Up the Big Data Infrastructure

Once the foundation is complete, you need to frame the house before running the electrical and plumbing.  For your big data journey, Phase 2 means shifting the focus to the big data infrastructure. This includes the following:

  • Equip the business intelligence layer with advanced analytics, in-database statistical analysis, predictive analysis, and advanced visualization, on top of the traditional components such as reports, dashboards, and queries
  • Deploy information discovery tools to understand “why” and provide users the ability to investigate and understand cause by exploring relationships between structured, semi-structured, and unstructured data sources
  • Set up a scorecard and strategy management to provide users the capabilities to communicate strategic goals and monitor progress over time


Phase – 3 Predicting the Future

The finishing touches are what make a house your home – the paint, the carpet, the fixtures. To get the most out of your big data investment, you need to leverage the core information architecture principles and practices implemented in previous phases and further enhance them to:

  • Bring together different data sources with large data sets, which are challenging to store, search, share, visualize, and analyze, for timely and valuable analysis
  • Provide data integration capability that will cover the entire spectrum of velocity and frequency and be able to handle the extreme and ever-growing volume requirements
  • Set up the architecture so that business users do not see a divide; they do not even need to be aware that there is a difference between traditional transactional data and big data. Make the data and analysis flow seamless so users can navigate through various data and information sets, test hypotheses, perform pattern analysis, and make informed decisions.
  • Provide the ability to correlate with other structured and unstructured enterprise data using powerful statistical tools, allowing users to find the needle in the haystack and ultimately helping to predict the future
  • Provide a Complex Event Processing engine to analyze stream data in real time
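The last bullet’s idea, evaluating rules over streaming data in real time, can be sketched with a simple sliding window. The threshold, window size, and readings are illustrative; real CEP engines handle this at scale with declarative rules.

```python
# Hedged sketch of complex event processing: fire an alert when the
# average of a sliding window of stream readings crosses a threshold.
from collections import deque

def detect_spikes(stream, window_size=3, threshold=100):
    """Yield an alert whenever the window average exceeds the threshold."""
    window = deque(maxlen=window_size)
    for reading in stream:
        window.append(reading)
        if len(window) == window_size and sum(window) / window_size > threshold:
            yield ("spike", list(window))

readings = [90, 95, 98, 120, 130, 140, 80]
alerts = list(detect_spikes(readings))
print(alerts)
```

Each alert carries the window of readings that triggered it, so a downstream consumer can act on the event as it happens rather than after a batch load.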


Are You Valuing Data as an Asset on Your Balance Sheet?
https://blogs.perficient.com/2018/07/19/are-you-valuing-data-as-an-asset-on-your-balance-sheet/
Thu, 19 Jul 2018

The average age of a company listed on the S&P 500 has fallen from almost 60 years old in the 1950s to less than 20 years old today. Innovative companies that are willing to embrace transformative technologies make the list today, while businesses that are hesitant to embrace change risk becoming obsolete.

Thriving companies, innovators, value their data as an asset. They use big data solutions as a competitive advantage to increase revenue, reduce cost, and improve cash flow. Data is woven into the fabric of every organization.  It records what happened, but increasingly, it’s being used to drive change and transformation at unprecedented rates.

Any business leader looking to maximize their data needs to ask themselves: Does your organization have a comprehensive data strategy? Does that strategy address both structured and unstructured data? Do you have a platform that allows your organization to analyze transactional data and social sentiment?

If you answered “No” to any of these questions, chances are you have untapped data resources or, at the very least, under-utilized data resources.

Experts claim a 10x return on investment in analytics. For some organizations, that's the low-end estimate of the value they've created. Industry analyst firm IDC has even estimated a $430B economic advantage for organizations that analyze all data and deliver actionable insights. The bottom line is that the opportunity is big, and growing.

 

How Valuable Are Your Data Assets?

Data has been doubling every couple of years for a while now. With the exponential growth in data volume and data types, traditional data warehouse architecture cannot solve today's business analytics problems. You need new approaches to handle the growing complexity while containing expenses and staying ahead of the competition.

Your customers, channels and competitors are digital. So are your employees and, increasingly, even your products. Digital transformation is critical: according to Forrester, 89% of executives see it impacting their business in the upcoming 12 months, and that survey was taken in 2017!

Also, machine learning is more than just a buzzword; it's a core part of the solution. For many, even most, companies, it's the most important part of the solution. With the arrival of big data, the data itself is quite complex, as are the interactions between different data sets or types of data. Machine learning algorithms are able, at some level, to figure things out for themselves.

The point is that there are new types of business challenges that organizations are facing today. To get the most return from your organization’s data capital, you need to be well versed in transformative technologies that are available today and approaches that you can use to reduce cost and yield valuable business insights. You should plan to invest more in advanced analytics tools to get the most value out of big data that you continue to accumulate over time.

 

What Steps Are Organizations Taking?

Organizations building a modern analytics platform are demanding access to all the data they need: data to inform every decision, when and where it matters. They want to rely on modern algorithms to crunch their data. Pretty much any ML algorithm is likely to give better or more accurate results when there's more data to work with. Whether you are trying to build a better view of your customers' wants and needs, or figure out why a component is breaking, you've got to start with as much data as possible.

Data science is becoming key to enabling organizations to capitalize on their data. Companies are looking to use data science and to figure out how to incorporate it into their businesses.

Finally, they want all the data, all of these algorithms and this modern technology to be put to work in support of the applications that run their business. For example, Oracle's Adaptive Intelligent Applications combine artificial intelligence, machine learning, and decision science with data captured from Oracle SaaS applications and third-party data. The unique value of these learning-enabled applications is that they learn from results, which increases their accuracy as they are used over time.

Business Analytics with AI, ML and Blockchain https://blogs.perficient.com/2018/06/19/business-analytics-with-ai-ml-and-blockchain/ Tue, 19 Jun 2018

Artificial intelligence is already being used to contextualize data and significantly reduce the effort required to transform data sets for better insights. Adaptive intelligence is being infused into cloud applications by vendors like Oracle to drive data-driven intelligence based on machine learning algorithms at vastly increased speeds.

To quote Angela Zutavern, coauthor of The Mathematical Corporation: “We are moving from a time when people would look at the past to interpret what that meant for the future and moving toward the true predictive power of machines, where people take action based on the results.”

So how will AI, ML and Blockchain change Business Analytics?

While predictive analytics can tell you what products and services you should be targeting to customers before they need them, predicting the future of business analytics requires a quick review of the new sources of data that will continue to increase over the next five to ten years: artificial intelligence, machine learning, and blockchain. Each can be a source of data as well as a mechanism to improve the analytics you are utilizing in your organization.

Artificial Intelligence

Artificial intelligence begins with creating algorithms that replicate human thought and judgment to produce insights or recommendations for users to make decisions. Theoretically, those algorithms can be run over volumes of data and at speeds that would make it impractical for individuals to analyze on a timely basis. For companies to leverage AI as a competitive advantage, they need to invest in developing the algorithms that are making those recommendations.

Unfortunately, there is no “easy” button to do this, but the right partner and technology can help you analyze and develop the rules and logic to incorporate into your AI solution to improve the quality of the results. A great example of this is a problem IBM ran into while preparing Watson to compete on the game show Jeopardy! The team continued to see improved accuracy in answering questions as they incorporated more information into Watson's memory; that is, until they loaded one book that decreased the accuracy of the whole system. What book was that? The Bible. Despite being the most widely published book in history, loading The Bible decreased the accuracy of Watson's artificial intelligence. The takeaway is that a book largely based on parables involving faith does not help a machine determine the answers to factual questions. In fact, it has the exact opposite effect. For business, that means that incorporating unverified information into your AI, such as inflated reviews by trolls or bots on Amazon, may lead to the wrong recommendations coming from your AI system.

Machine Learning

In contrast to AI, which can be viewed as having a wider purview, machine learning takes a narrower view, incorporating specific data about specific items to improve the performance of a task outcome. A good example and application of machine learning is the accuracy of the iPhone in recognizing words and phrases before you finish typing. The improvement in recognition comes from a very specific data set (what you type) over the course of time, with the intent of improving the accuracy of your typing in the future. From an analytics perspective, this represents a very targeted, and often structured, data set that can and should be incorporated into your data lake. If an organization could view the trending words that Apple is autocorrecting on the iPhones of its field service staff, it may be in a better position to address challenges related to those words proactively, improving customer service across a service organization distributed across the country.
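The field-service scenario above boils down to aggregating a very specific data set and surfacing what is trending in it. A minimal sketch of that aggregation step follows; the word list and function are hypothetical illustrations, not any Apple or vendor API:

```python
from collections import Counter

# Hypothetical feed of terms that field-service staff's phones autocorrected.
corrections = ["compressor", "valve", "compressor", "leak",
               "compressor", "valve", "firmware"]

def trending(words, top_n=2):
    """Return the most frequently corrected terms, a rough proxy
    for the field issues staff are writing about most often."""
    return Counter(words).most_common(top_n)

print(trending(corrections))  # [('compressor', 3), ('valve', 2)]
```

Fed into a data lake, a rising count for a term like "compressor" could flag an emerging equipment issue before it shows up in formal service tickets.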

Blockchain

Blockchain potentially represents the most transformational transaction recording innovation since EDI. The core concept of Blockchain is a distributed ledger that is stored and validated across numerous locations that are all connected and have a validation mechanism that prevents anyone from making an unvalidated change. Unlike Wikipedia where any person can edit the page on Abraham Lincoln to say he lived until 1901 and post it, only to have it corrected after the fact, Blockchain proactively validates the transactions before they post.

For example, if Company A sells a machine to Company B and the transaction is recorded in a Blockchain ledger, both parties need to validate the transaction via the proper protocol across the chain. Since the entire ledger is distributed, anyone joining the Blockchain network gets a complete copy of all the ledgers on their machine. You can see the transactions of your competitors that are on the Blockchain network, just as they can see yours, to determine go-forward strategies; it's perceived as a win-win for both parties. If there is sensitive data, there could be concern about moving it out of the Blockchain, so there is an opportunity to run analytics, ML or AI engines inside the Blockchain, on the same servers that execute the transactions.
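The tamper-evidence described above comes from hash-chaining: each block commits to the previous block's hash, so editing an earlier entry invalidates everything after it. This toy Python ledger sketches only that core idea (no distribution or consensus protocol), and the transactions are invented:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, excluding its own hash field.
    payload = json.dumps({k: block[k] for k in ("index", "tx", "prev")},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain, tx):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "tx": tx, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def is_valid(chain):
    """Any edit to an earlier block breaks the hash links after it."""
    for i, b in enumerate(chain):
        if b["hash"] != block_hash(b):
            return False
        if i and b["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append(chain, "Company A sells machine M-100 to Company B")
append(chain, "Company B pays Company A")
print(is_valid(chain))             # True
chain[0]["tx"] = "tampered record"
print(is_valid(chain))             # False: the edit is detected
```

Real networks add the consensus protocol that lets many parties agree on which chain is authoritative; the hash-chain is just the part that makes unvalidated edits detectable.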

Since Blockchain allows all participants to see all data across the network, it enables BI, AI and ML at every node. Until reporting protocols, encryption/decryption or data aggregation services emerge for Blockchain ledgers, companies may need new infrastructure and approaches before they can feed insightful Blockchain ledger data into their AI or ML systems.

Cloud Transcends Tech – New Strategy and Business Model for CEOs https://blogs.perficient.com/2018/04/18/cloud-not-just-a-technology-but-a-strategy-and-a-new-business-model-for-ceos/ Wed, 18 Apr 2018

Cloud adoption will need to be a key part of the strategy for CEOs looking to grow revenue quarter over quarter and year over year while cutting costs and taking market share from competitors. Kicking off Modern Finance Experience 2018, Oracle CEO Mark Hurd hammered home why cloud is not just a modern technology but a generational shift. Hurd made some bold predictions: by 2020, even highly regulated industries will shift 50% of their production workloads to the cloud, and 90% of all enterprise applications will feature integrated AI capabilities.

Hurd talked about how macroeconomic conditions affect companies' ability to grow revenue. With worldwide GDP growth at 2%, and the majority of that growth coming from China, it's not easy to grow revenue. Yet growing revenue is the priority of every CEO of a publicly traded company. Therefore, companies have to lay out a new business model to achieve revenue growth. Cloud technology can enable CEOs to invest in modern applications while cutting expenses and improving cash flow and earnings. Companies that don't modernize their applications will fall behind, losing customers to the competition. Only 50% of the companies that were in the S&P 500 in 2000 are still around.

Why will CEOs direct their CFO and CIO to move to the Cloud?  Key takeaways:

  • Most enterprise applications date from around 1998, a long time ago given that the internet was still evolving and social and mobile weren't around. Maintaining these legacy applications, old applications running on old infrastructure, exhausts about 80% of the IT budget simply to keep the status quo. With cloud, operating expenses go down.
  • Companies would rather outsource competencies such as maintaining a data center and data security to vendors like Oracle, and focus on retaining their customers and gaining market share.
  • With the advent of the autonomous database, ongoing maintenance and patching can be automated with no manual intervention.
  • New companies in almost every sector are emerging with new business models and modern IT applications to improve customer service and take away customers from their competition. Companies need to continue to invest in modernizing their applications to stay relevant.
  • Cloud infrastructure and applications cost less, and you get the latest innovation without having to do any work. Emerging technologies around machine learning (ML) and AI are engineered directly into these cloud applications.
  • AI and ML in these modern applications will continue to advance productivity.
  • Lastly, data is more secure, and you largely shift the risk to vendors like Oracle by moving to Oracle Cloud.

 

Five Common Use Cases of Big Data Adoption by Organizations https://blogs.perficient.com/2018/04/02/five-common-use-cases-of-big-data-adoption-by-organizations/ Tue, 03 Apr 2018

Big Data Analytics Platforms continue to be adopted by different organizations to get unique insights for their business. In this post, I will cover five common use cases that we are seeing with our customers who are adopting Big Data for their Business Analytics need.

1 ) Data Warehouse Modernization

As organizations look to modernize their business analytics platform, whether it be moving away from spreadsheet processes to a fully automated data warehouse solution, or simply to modernize their legacy data warehouse and reporting platform, the requirements remain the same – a technology, which is cost-effective, open source, highly scalable, and relatively fast to deploy. Big data solutions are a leading contender to deploy a modern enterprise data warehouse.

Building a traditional data warehouse starts with documenting business requirements, designing a data model to support those requirements, and using a data cleansing and consolidation tool to organize data. A significant amount of time is spent on requirements gathering, data modeling, ETL and reporting. Organizations are demanding technology that simplifies data storage and data management. Business users want to be able to answer their own questions as needs change, without relying on IT.

A big data solution can not only fulfill your business analytics requirements, but also modernize your platform, ensuring you are future proofing your architecture to address the growing volume, velocity and variety of data that your organizations may face as the need for data insights continues to grow.
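The modeling and consolidation steps described above can be sketched with a tiny in-memory star schema: a dimension table of conformed customers and a fact table of sales keyed to it. SQLite stands in for a real warehouse here, and all table, column, and customer names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE fact_sales (customer_key INTEGER, amount REAL);
""")

# Raw source extract: (customer name, sale amount) rows.
source_rows = [("Acme", 1200.0), ("Acme", 800.0), ("Globex", 500.0)]

for name, amount in source_rows:
    # Conform the dimension, then load the fact keyed to it.
    con.execute("INSERT OR IGNORE INTO dim_customer(name) VALUES (?)", (name,))
    (key,) = con.execute(
        "SELECT customer_key FROM dim_customer WHERE name = ?", (name,)).fetchone()
    con.execute("INSERT INTO fact_sales VALUES (?, ?)", (key, amount))

# Business users query the conformed star, not the raw extract.
rows = con.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.name ORDER BY d.name
""").fetchall()
print(rows)  # [('Acme', 2000.0), ('Globex', 500.0)]
```

The point of the modernization argument is that big data platforms let much of this schema-first ceremony happen later (schema-on-read) while still supporting this kind of conformed reporting.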

2 ) Customer 360°

Imagine the possibility of getting all your customer data from your CRM applications to marketing and sales to commerce, service and social. The ability to tie together these disparate data sources enables cross-functional analysis between different departments. A unified customer experience means the service department has historical sales data at its fingertips and knows long-term customers from one-time buyers. It means the customer service representatives know what your customer is calling about. And you know the value of every customer as does your retention team.

The platform that addresses this need should be scalable to allow for exponential growth, given that customers create data with laptops, mobile applications and a myriad of different devices. The platform should be designed to store petabytes of data, allowing for scalability to zettabytes. Traditional data warehousing can't do that.

A modern approach is the path to getting insights out of the data you have without spending significant time on data management, data structure or data cleansing. There's rapid adoption of big data platforms by organizations looking to transformational technologies to fulfill these types of requirements. Moreover, the ability to perform predictive analytics on the data can provide insight into investments like never before. Should you open a new store in southern California? Would offering a new color of widget impact sales?

With big data architecture, you can capture data on consumer behavior, perform advanced analytics, and predict the return on investment (ROI) even before product launch. Promote your product at a price and place where the solution predicts the greatest lift or ROI. This allows organizations to formulate a logical marketing strategy, quantify benefits and increase its ROI. With big data platforms, you can achieve a 360-degree view of the customer in a cost effective and highly scalable manner in a shorter amount of time.
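At its simplest, the 360-degree view described above is a merge of per-source customer records on a shared key. The sketch below illustrates only that unification idea, not any particular product's data model; the sources, fields, and customers are invented:

```python
# Hypothetical extracts from CRM, commerce, and service systems,
# each keyed by the same customer ID.
crm      = {"C1": {"name": "Dana Lee", "segment": "enterprise"}}
commerce = {"C1": {"lifetime_value": 48000}, "C2": {"lifetime_value": 150}}
service  = {"C1": {"open_tickets": 2}}

def customer_360(*sources):
    """Merge all attributes for each customer into one profile."""
    merged = {}
    for source in sources:
        for cid, attrs in source.items():
            merged.setdefault(cid, {"customer_id": cid}).update(attrs)
    return merged

profiles = customer_360(crm, commerce, service)
# A service rep now sees value, segment, and history in one record:
print(profiles["C1"])
```

The hard parts a big data platform addresses are scale and entity resolution (recognizing that two records are the same customer when there is no clean shared key), but the target shape is this single merged profile.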

3 ) Internet of Things

The Internet of Things (IoT), broadly defined as a network of internet-connected objects that collect and exchange data using embedded sensors, is unquestionably a leading source of an incredible volume of data.

What can we do with all the data generated by these sensors? Turns out, a lot. There is significant value in being able to collect the data via everyday objects like machines, server logs, weather sensors, manufacturing plants, to name a few and analyze them. Manufacturing, energy, construction, agriculture and transportation are some of the industries that are realizing tangible benefits by leveraging IoT for data-driven decisions.

According to IDC in 2018, “IoT spending among manufacturers will be largely focused on solutions that support manufacturing operations and production asset management.” Manufacturing plants use sensors to monitor the performance of machines and systems, ensuring maintenance issues are handled proactively without having to shut down operations, thereby maintaining throughput and meeting customer expectations. There are significant cost savings in fixing faulty parts in expensive equipment when problems are detected early, before a minor issue becomes a major concern.

Tracking product movement by monitoring the weather is another area that could use IoT to get valuable insights. If a company’s sales are impacted by inclement weather, like a nor’easter, a manufacturer of small generators may want to align its supply chain to be able to manage the expected spike in sales for the day or a given period.

Using predictive algorithms built on historical data, and tying them to current sensor data to make more accurate predictions that impact profitability, is what separates forward-thinking companies from those entrenched in dated systems and processes. Big data platforms are widely used today to fulfill these types of business requirements.
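A crude stand-in for the sensor-monitoring idea is a simple statistical outlier check over recent readings; real predictive-maintenance models are far richer, and the vibration numbers below are invented:

```python
import statistics

def flag_anomalies(readings, z=2.0):
    """Flag readings more than `z` standard deviations from the mean,
    a toy proxy for 'this machine is behaving abnormally'."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    return [i for i, r in enumerate(readings)
            if stdev and abs(r - mean) > z * stdev]

# Hypothetical vibration readings; the spike might indicate bearing wear.
vibration = [0.51, 0.49, 0.50, 0.52, 0.48, 1.90]
print(flag_anomalies(vibration))  # [5]
```

In practice the detection would run against streaming data with per-machine baselines and trained models, but the payoff is the same: flag the abnormal reading before the minor problem becomes a shutdown.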

IDC forecasts worldwide spending on the Internet of Things to reach $772 billion in 2018, with manufacturing, transportation and utilities making up 45% of the projected spend.

4 ) Recommendation Intelligence

When you search for an item on Amazon, the web commerce platform learns about your interests from your shopping and purchase behavior. It starts making recommendations for similar products as well as complementary products you may want to buy based on what other customers bought together with a given product. You may not have been shopping for the additional item from the onset, but the mere suggestion that others bought batteries when purchasing the same electronic toy may be enough for you to add it to your cart.

Netflix is another great example of recommendation intelligence in action. When you select a movie or TV show to stream, Netflix learns from your likes and dislikes via thumbs up and down and uses predictive models to make recommendations around your interests. Very successful companies are capitalizing on this kind of algorithm. Pandora is yet another example: based on a single song or a selection of artists, the platform intelligently starts playing songs to our liking. The more feedback given regarding preferences, the more intelligent it gets, playing songs likely to suit the desired mood or ambience.

This engine can also be used to handle customer calls and make appropriate recommendations that may result in upsell or cross-sell opportunities. Today's customers interact with these platforms expecting the intelligence to be built in; it starts to become second nature. The key technology behind this intelligence is a big data analytics platform: a cost-effective way to store huge volumes of data, perform advanced analytics on the collected data sets, and build predictive models. Those models drive the recommendations and, most importantly, continue to improve based on user feedback.
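The "customers who bought this also bought" mechanic reduces, at its simplest, to co-occurrence counting over order baskets. The baskets below are made up, and production recommenders use far more sophisticated models, but the sketch shows where the battery suggestion comes from:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical order baskets.
baskets = [
    {"toy_robot", "batteries"},
    {"toy_robot", "batteries", "gift_wrap"},
    {"toy_robot", "charger"},
    {"headphones", "batteries"},
]

# Count how often each pair of items is bought together.
co_counts = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(item, top_n=2):
    """Suggest the items most often bought alongside `item`."""
    ranked = sorted(co_counts[item].items(), key=lambda kv: -kv[1])
    return [other for other, _ in ranked[:top_n]]

print(recommend("toy_robot"))  # batteries co-occur most often
```

Feedback (clicks, purchases of the suggested item) flows back in as new baskets, which is the "model improves with use" loop the post describes.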

What are use cases relevant to your industry? Deployment of a solution capable of recommendation intelligence can directly result in increased revenue and hence the bottom-line growth. It’s a competitive differentiator.

5 ) Sentiment Analysis

United Airlines lost almost $1 billion in market cap after a passenger was removed by force. A video of the encounter was widely shared not only by cable news, but also on social media. The airline was boycotted by frequent flyers and new passengers globally. The backlash from the sharing of the video taken by other passengers was swift and painful for the world's third largest airline.

In the age of social media, it only takes one improperly handled situation and a disgruntled customer sharing their story for it to start trending; before you know it, everyone globally is talking about it. In the case of United, it was the bystanders who took to social media, not the passenger who was ejected.

Snapchat lost $1.3 billion in market value because of a single tweet shared by Kylie Jenner. It wasn't something bad about the platform; it was a simple question posed about platform usage. Social media can be an extremely powerful channel to build a brand and maintain a reputation. If there are problem areas resulting in angry customers sharing their frustration and concern, there has to be a strategy to address those concerns and problems early on, before they catch on and start trending.

For visibility into these insights, organizations need tools that allow customer representatives or marketing to monitor and analyze various social media platforms such as Facebook, Twitter, Instagram, and Snapchat in real time. The traditional approach to analyzing these types of data is not going to work; the analysis has to be real time, and the technology should handle very large and growing volumes of data. Organizations are using big data platforms to perform sentiment analysis and structuring customer representative teams to address such concerns as they arise.
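At its core, routing angry posts to a response team is a classification problem over streaming text. The toy lexicon-based scorer below illustrates the idea only; the word lists and posts are invented, and real deployments use trained models on streaming infrastructure:

```python
# Hypothetical sentiment lexicons.
POSITIVE = {"great", "love", "fast", "helpful"}
NEGATIVE = {"angry", "boycott", "worst", "delay", "broken"}

def sentiment(post):
    """Score a post by counting lexicon hits."""
    words = post.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "negative" if score < 0 else "positive" if score > 0 else "neutral"

stream = [
    "worst airline ever boycott",
    "love the helpful crew",
    "flight on time",
]
# Negative posts get routed to the response team before they trend.
alerts = [p for p in stream if sentiment(p) == "negative"]
print(alerts)
```

The big data platform's job is to run this kind of classification continuously across millions of posts and surface spikes in negative volume fast enough for the team to respond.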

Moving Off Discoverer https://blogs.perficient.com/2017/09/22/moving-off-discoverer/ Fri, 22 Sep 2017

If you missed our webinar, A Customer’s Take on Moving from Discoverer to Oracle Business Analytics, the on-demand recording is now available for immediate viewing. If you are currently using Discoverer, hear directly from a Perficient customer, Gary Aragon with Group Voyagers, who made the move from Discoverer to Oracle Business Intelligence Enterprise Edition (OBIEE) in just nine weeks!

During the webinar we asked a couple of polling questions about why companies have delayed moving off Discoverer and about their future plans. The majority of webinar attendees, 67%, cited the reason they have not moved off Discoverer is that it works. Although I can understand that perspective, Extended Support for Discoverer ended in June 2017. Companies that are still running Discoverer may find themselves in a bind in the near future, as Oracle will no longer provide new patches. The Discoverer environment may be stable today, but new Windows updates could wreak havoc on Discoverer, and Oracle won't be there to fix such issues.

Another common reason customers delay moving off Discoverer is that they have thousands of reports in their production environment. The length of time it would take to migrate all those reports is a valid concern. In my experience, business users often make minor changes to a given report and save many different versions of the same report for a different time period or a different business unit. The various filter criteria available have led to multiple iterations of essentially the same base report. As a rule of thumb, if a customer tells us they have thousands of reports to migrate, when we take a closer look at those reports and conduct a fit-gap analysis, we generally find we can condense that number to around 200 or fewer distinct reports that are actually needed. We can effectively recreate those 1,000 reports by applying different filters or selection criteria on top of the unique reports.
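The fit-gap consolidation described above amounts to grouping saved reports by their base definition while ignoring filter values. A minimal sketch, with entirely hypothetical report metadata:

```python
# Hypothetical saved-report metadata: same columns, different filter values.
reports = [
    {"name": "AP Aging - East 2016", "columns": ("vendor", "balance"),
     "filters": {"region": "East", "year": 2016}},
    {"name": "AP Aging - West 2017", "columns": ("vendor", "balance"),
     "filters": {"region": "West", "year": 2017}},
    {"name": "Open POs", "columns": ("po", "amount"),
     "filters": {"status": "open"}},
]

def base_signature(report):
    # Same columns + same filter *fields* => same base report;
    # the filter *values* become runtime prompts in the new platform.
    return (report["columns"], tuple(sorted(report["filters"])))

distinct = {base_signature(r) for r in reports}
print(f"{len(reports)} saved reports -> {len(distinct)} distinct base reports")
```

Both AP Aging variants collapse into one base report, which is exactly how thousands of saved Discoverer reports shrink to a couple hundred migrations.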

From a timing perspective, it could take months or a matter of weeks to migrate Discoverer reports if a customer already owns pre-built Oracle BI solutions, which provides a foundation we can use to migrate their Discoverer reports.

Customers often ask me how we can build, on an Oracle BI platform, the real-time reports business users have access to today in Discoverer. Often the Discoverer reports are built using materialized views directly on top of Oracle E-Business Suite (EBS) tables. Business users are accustomed to real-time reports, so when we move from Discoverer to Oracle BI we typically apply data warehousing best practices. We are essentially going into Oracle E-Business Suite, pulling the data, transforming it and loading it into different star schemas. The benefit of that approach on the EBS side is that the BI platform does not tax your EBS platform, because we're pulling the data on a nightly basis. Granted, the data is essentially a day old, but we try to understand from the business the justification behind the real-time need. We approach it in one of several ways. One approach is to build materialized views for just those key reports where there is a clear justification for real-time data, and combine that with the BI Applications data warehouse and data model. Another approach applies where you have a real-time reporting need for a large number of reports. In that instance, a tool called Oracle GoldenGate, which can replicate between two different instances, might be advisable. There are a number of different architecture options that customers could consider.

If you are using Discoverer today and are exploring your options, you have choices.  Some customers will opt to move to another on-premises solution.  Others will look at a hybrid solution or make the move to the cloud. If you’re unclear as to what decision is right for your company, you’re not alone.  Look to an implementation partner that has broad experience with OBIEE and Oracle BI Applications as well as Oracle BI Cloud Service (BICS) and Oracle GoldenGate.

How to Empower Your Business Users with Oracle Data Visualization https://blogs.perficient.com/2017/05/03/how-to-empower-your-business-users-with-oracle-data-visualization/ Wed, 03 May 2017

Many shifts are happening today. With technology changing at such a pace, business users are relying on data to keep up with the industry and beat the competition. Business users expect more than dashboards and reports. They don't want to be handed a set of reports and dashboards pre-built for them by IT; they are demanding seamless self-discovery and dashboarding. They want to be able to take what is given, combine it with other data sources, and tell a more powerful and compelling visual story with narrative and snapshots. IT is being challenged to provide the business what it wants. Flexible analytics that meet the needs of many types of users are key.

As business users rely on tools and data, they are achieving higher levels of proficiency and therefore demanding more self-service analytics. Organizations, especially the business user communities that had purchased standalone data visualization tools, are now moving away from standalone tools that cannot provide data governance, a single source of truth or enterprise-level security. Organizations realize that to protect the business, you have to empower your users.

Although cloud adoption is accelerating, most organizations still have hybrid environments with both on-premises and cloud data sources. Business users want to be able to combine data sources, regardless of where they reside, fairly quickly so that timely decisions can be made and acted upon.

IDC predicts that by 2020, spending on the self-service visual discovery and data preparation market will grow 2.5x faster than spending on traditional IT-controlled tools with similar functionality. Those that have a plan will see great rewards over their competitors; IDC projects a $430 billion advantage.

However, doing so will come with its own set of challenges. It is not going to be easy: IT has to adapt to the digital transformation.

Enabling business users to focus more on strategy means streamlining everyday activities to free up resources.

What is Oracle’s strategy? Oracle understands that when starting out, you need a system that will fit your needs right away without causing major disruption to your other systems or architecture while adding significant value to your business users every step of the deployment and adoption process.

Oracle's strategy is to support whichever capabilities and deployment choices customers need. As they grow, they can easily expand to the next level without any loss of work or content. There are mainly three products that are part of the overall strategy:

First, Data Visualization Cloud Service, or DVCS, is for smaller deployments, where teams or departments most often own the analytics themselves without reliance on IT. Oracle has introduced groundbreaking visual analytics in the cloud with the new Data Visualization Cloud Service, allowing users to perform better data visualization in the cloud while Oracle manages the environment. Customers also have the option to use Oracle Data Visualization Desktop. This is what is covered in this blog post for the most part.

Next is BI Cloud Service, or BICS, which is for medium deployments that require a more sophisticated analytics solution. Here the business cannot operate entirely on its own, requiring IT to manage things like the semantic layer. A semantic layer is built by automating a process to pull data from your various on-premises or cloud applications, such as ERP or CRM. You need functional knowledge to pull the data and structure it in a format that simplifies reporting. Again, Oracle manages the environment for you.

Third, there is Oracle Analytics Cloud, or OAC, which is for larger deployments where IT owns the deployment, as it is customer managed. Customer management introduces more flexibility in how IT can deploy, manage, and administer their analytics. With IT managing their OAC platform, they can provide analytics as a service to the rest of the organization.

Each solution builds on the previous one(s), each introducing another level of analytics, ending with Oracle Analytics Cloud, or OAC, which provides everything from data visualization through governed metrics and dashboards to advanced analytics and scenario modeling.

Liberate All Data From All Sources: Oracle Data Visualization Cloud Service eliminates the complexity typically associated with blending and correlating data sets. Easy, automatic data blending allows you to combine data from a variety of sources—Oracle and other SaaS applications, on-premises systems, external sources and personal files—and immediately gain new insights using visual analysis.

The quality of decisions is always affected by data and time. Having the right data, data that accounts for as many possibilities as possible, gives you a greater chance of choosing the right direction. That data must, of course, be current and accurately reflect reality at the time of the decision.

The data landscape, however, is changing fast. Data is no longer located in a single, well-defined view; instead it is fragmented across the enterprise and even outside it, in third-party or public sources. New technologies continue to arise, introducing new and different approaches to data storage, organization, and access.

The tools required for connecting to data sources, retrieving and enriching data, and visualizing the results have been fragmented, often spread across multiple vendors and requiring multiple skill sets simply to retrieve and visualize the results needed.

With Oracle Data Visualization you can connect to the right data from any source.

  • That means access to a vast number of supported data sources, on-premises and in the cloud, databases and applications.
  • It also means that as new industry-leading sources arise, Oracle will continually add connectors to them.
  • Once those data sources are connected, users can blend and enrich the data with powerful built-in data preparation capabilities.
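To make the blending idea concrete, here is a minimal Python sketch, using pandas and an in-memory SQLite table as a stand-in for an enterprise source. All table, column, and file names here are hypothetical; this illustrates the concept of joining a governed source with a personal file and deriving a new measure, not Oracle's actual implementation.

```python
import sqlite3
from io import StringIO

import pandas as pd

# Hypothetical "enterprise" source: an in-memory SQLite table standing in
# for an on-premises database connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 120.0), ("West", 95.0)])
conn.commit()

# Hypothetical "personal" source: a CSV file the analyst brings along.
targets_csv = StringIO("region,target\nEast,100\nWest,110\n")

sales = pd.read_sql_query("SELECT region, revenue FROM sales", conn)
targets = pd.read_csv(targets_csv)

# Blend the two sources on their shared key and derive a new insight:
# revenue attainment against target.
blended = sales.merge(targets, on="region")
blended["attainment"] = blended["revenue"] / blended["target"]
print(blended)
```

The point of self-service blending is exactly this join-and-derive step, performed visually instead of in code.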

Improve All Data with Powerful Data Prep: Simply being able to connect data is not enough. You need a tool with artificial intelligence and machine learning built in that can automatically infer connections between data sets and display them in an easy interface, so you can see how data is related and add or change connections as necessary.
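One way to picture "automatically infer connections between data sets" is a simple heuristic: propose join keys where two data sets share a column name and their values overlap. The sketch below is only an illustration of that idea in Python/pandas, with hypothetical data; Oracle's actual matching logic is more sophisticated.

```python
import pandas as pd

def infer_join_keys(left: pd.DataFrame, right: pd.DataFrame,
                    min_overlap: float = 0.5):
    """Suggest join-key candidates: same-named columns whose value
    sets overlap by at least min_overlap of the smaller set."""
    candidates = []
    for col in set(left.columns) & set(right.columns):
        lvals = set(left[col].dropna())
        rvals = set(right[col].dropna())
        if not lvals or not rvals:
            continue
        overlap = len(lvals & rvals) / min(len(lvals), len(rvals))
        if overlap >= min_overlap:
            candidates.append((col, overlap))
    # Strongest candidates first.
    return sorted(candidates, key=lambda c: -c[1])

# Hypothetical data sets from two different sources.
orders = pd.DataFrame({"customer_id": [1, 2, 3], "amount": [10, 20, 30]})
crm = pd.DataFrame({"customer_id": [2, 3, 4], "segment": ["A", "B", "A"]})
print(infer_join_keys(orders, crm))
```

A tool that surfaces such candidates in the UI lets the user confirm or adjust the proposed relationship rather than specify it from scratch.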

The solution is to make sure that you can always connect to the right data at the right time.  That means accessing all relevant data that is needed to support your activity, decision or process, regardless of where it may be stored.

It means you can stay ahead of the curve with the help of proactive, smart analytics: knowing where you are, or who might be calling you, can trigger your analytics to ensure you have your numbers ready to go.

Finally, it means a single process, in a single tool, that requires little training, can be driven by a standard user, and performs everything required from data access to data visualization, and everything in between.

Find Hidden Patterns with Smart Data Discovery: With Oracle Data Visualization you can automatically detect different data patterns. Visual analytics masks complexity, enabling very sophisticated data exploration: simply click to apply filters and drill, and drag and drop to create compelling dashboards. You can take your analysis even further with an easy expression editor, where you can use prebuilt functions to visually create custom filters and calculations that quickly extend and enrich your data.
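As a rough analogy for what an expression-editor calculation and custom filter accomplish, the following Python/pandas sketch derives a custom measure and filters on it. The product data, column names, and 25% threshold are all hypothetical; in the tool, the same result is achieved visually with prebuilt functions rather than code.

```python
import pandas as pd

# Hypothetical data set behind a visualization.
df = pd.DataFrame({
    "product": ["A", "B", "C"],
    "revenue": [200.0, 150.0, 90.0],
    "cost":    [120.0, 140.0, 30.0],
})

# A custom calculation, analogous to an expression-editor formula:
# margin as a percentage of revenue.
df["margin_pct"] = (df["revenue"] - df["cost"]) / df["revenue"] * 100

# A custom filter: keep only products with healthy margins.
healthy = df.query("margin_pct > 25")
print(healthy[["product", "margin_pct"]])
```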

A unified user experience blurs the lines between dynamic discovery, dashboarding, and presentation, creating a seamless and richly contextual environment that keeps your exploration fast and fluid.

Talk to Your Data and Tell Better Stories: Sometimes you know the data you are working with, and sometimes you don't. You need some guidance and the ability to search your data. Oracle Data Visualization has powerful search, guided navigation, and sophisticated filtering that work together intelligently to provide an easy, interactive path through your data, helping you find exactly what you're looking for.

You can also use your voice to search through your datasets and ask it to generate a report for you.

Dashboards and reports add more value when they tell a story. Oracle Data Visualization gives you the ability to tell your story and share your content with other users. With built-in algorithms and artificial intelligence, it is easy to capture insights as visual stories, saving story points (snapshots of the analytical moment in time) and adding comments to highlight key points and discoveries. Stories are live and can be securely shared with anyone who has permission; co-workers can click a link and be brought into the story, build upon it, and share it in turn, enabling the rapid, dynamic collaboration that drives improved decision-making and faster action.

Governed Self Service at Enterprise Scale: Balancing user autonomy with governance and policy is part of building a BI competency center for every organization. With Oracle Data Visualization, organizations can find their own balance between users' self-service and governance; the scale is now adjustable by you.

This allows you to combine all relevant information, including governed systems of record and unmodeled personal data, third-party data, or other external information, into the custom combination that best supports your business activity. And you won't be kept waiting for your results, because intelligent queries optimized for the specific data sources keep performance high.

The Pivot to Self Service: In the past, deploying an enterprise data warehouse platform meant the data requirements had to be known before the start of the engagement; only then could you roll out an analytics solution for each business user group, whether Finance, Sales, or HR, covering only what was captured. With the adoption of business intelligence platforms, users are now asking for self-service discovery tools so they can mash up the data sets provided by IT with other data sources and perform self-service analytics. They want to be less reliant on IT so they can analyze and make decisions on time-sensitive information: for example, combining the enterprise data warehouse with flat files or social media data to perform more intelligent marketing analytics. It is a clear shift away from the purely top-down approach of the past.

Today, with Oracle Data Visualization you can combine enterprise data, with its single source of truth, data governance framework, and security infrastructure, with other data sources to perform departmental self-service analytics. This lets business users spend more time on analysis instead of filing a service request with IT to add a new column or field to the existing data warehouse.

For customers who own prebuilt BI Applications, this means allowing your business users to use Oracle Data Visualization to source from subject areas that are in production, along with flat files, Essbase cubes, and other data, to mash up the data, perform rich visual exploration, and tell a more powerful story.

Where Do We Start? We hear from many analytics customers that they understand WHY they should move to the cloud, but the immediate follow-up is: WHERE DO WE START?

There are three important steps on your journey to the cloud, starting with the easiest and quickest to deploy; each successive step builds on the previous one.

First, the quickest and easiest step is known as going "data hybrid": analytics in the cloud with live, on-demand access to existing data sources, both cloud AND on-premises. This is the popular starting point our customers are talking about for moving to the cloud.

Second, lift and shift is the next logical step; it requires a little more effort but also brings even more value. It means transporting analytic applications, including content, security, users, and data, to and from the cloud. When considering a move to the cloud, you are faced with a key decision: re-architect the application(s) or employ a "lift-and-shift" approach. Re-architecting the application requires extensive planning, time, significant effort, and costly resources to implement. The lift-and-shift approach, however, is gaining popularity because it offers hybrid cloud business intelligence without having to worry about data security or recreating metadata in the cloud.

Third, the final step is to expand your analytics to realize the greatest potential to innovate in the cloud. This includes enabling machine learning and artificial intelligence and adding cloud capabilities like modeling, big data discovery, and mobile access, or extending into other SaaS, PaaS, and IaaS options.

You want to be able to roll out new applications more quickly and enable self-service on existing applications.

ABOUT THE AUTHOR:

Shiv Bharti is the Practice Director of Perficient’s (NASDAQ:PRFT) National Oracle Business Intelligence Practice. Shiv has solid experience building and deploying Oracle Business Intelligence products. He has successfully led the implementation of more than 75 Business Analytics and custom data warehouse projects.

Shiv is a thought leader in the business analytics space and a frequent speaker at large conferences such as Oracle Open World, Collaborate, and KScope, as well as Oracle User Group events, sharing his perspectives and thoughts on industry trends, strategies, best practices, and case studies.
