Top 10 New Features in Oracle Analytics Cloud 5.9
Myles Gilsenan, Perficient Blogs | Mon, 01 Feb 2021

We all know, or have heard, that the Cloud brings many benefits for both software development companies and consumers of Cloud software. Nowhere is this more evident than in the pace of enhancements and upgrades. In the old days of on-premises computing, software companies had to make sure that their software and all upgrades worked with many operating systems, databases, middleware, and security platforms. Managing this chaos was often done via a spreadsheet jokingly referred to as the ‘matrix of death’. How times have changed, and we, the consumers, reap the benefits!
Oracle Analytics Cloud (OAC) is automatically upgraded every quarter. These upgrades are not just for fixing known issues; they often include significant enhancements, and the 5.9 update to OAC is no different.
With the 5.9 update Oracle continues the tradition of improving and simplifying the end user experience through easy-to-use but powerful features that let users do more while doing less.
Top 10 New Features of Oracle Analytics Cloud 5.9
1. Text Tokenization – with this new feature you can easily analyze text fields. For instance, let’s say you have a database with complaints about different car models and the complaint descriptions are in a free form text field.  In OAC, you can create a data flow and use the text token capability to analyze how many times different words occur for different car models.  There is no coding required for this feature. 
https://bit.ly/39y1Z9S
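No coding is required in OAC, but the idea behind text tokenization is easy to illustrate. The sketch below is a simplified, hypothetical stand-in (the complaint data and stopword list are made up): it splits free-form complaint text into tokens and counts occurrences per car model, roughly the kind of output the data flow step produces.

```python
import re
from collections import Counter

# Hypothetical sample data: free-form complaint text per car model.
complaints = [
    ("Model A", "Engine stalls at idle; engine noise on cold start"),
    ("Model A", "Transmission slips, engine warning light on"),
    ("Model B", "Brakes squeal; brake pedal feels soft"),
]

STOPWORDS = {"at", "on", "the", "a", "an", "of", "feels"}

def tokenize(text):
    """Lower-case, split on non-letter characters, and drop stopwords."""
    return [t for t in re.findall(r"[a-z]+", text.lower()) if t not in STOPWORDS]

# Count token frequency per model, analogous to OAC's token output.
counts = {}
for model, text in complaints:
    counts.setdefault(model, Counter()).update(tokenize(text))

print(counts["Model A"]["engine"])  # "engine" appears 3 times for Model A
```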
2. Improved Sorting Capability – when analyzing data there are few utilities that are more frequently used than sorting.  The correct sorting makes data much easier to understand.  OAC 5.9 has added a number of sorting enhancements:

  • changing the secondary sort of a chart will have no impact on the primary sort
  • sorting by a measure will take priority over sorting by attributes
  • it is now possible to sort by measures or attributes that are not directly displayed on the chart but rather are in the ‘Size’ area or ‘Tooltip’ area.
    https://bit.ly/3t994FL

3.  Data Preparation Enhancements – you might be aware that Oracle Analytics Cloud comes with a powerful set of capabilities related to data preparation/wrangling.  These days business analysts are doing a lot of the data preparation work that previously might have been done by IT and tools/shortcuts are greatly appreciated.  OAC 5.9 includes the following data prep enhancements:

  • automatically, with one click, fix missing leading zeroes in US zip codes
  • single click trim function – remove leading or trailing extra spaces
  • validation of formulas – can quickly validate formulas created during the data preparation phase.
    https://bit.ly/3tctzRR
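For readers curious what these one-click fixes do under the hood, here is a rough pure-Python equivalent. The function names and behavior are illustrative assumptions, not OAC's actual implementation.

```python
def fix_zip(value):
    """Restore leading zeros that spreadsheets often strip from US zip codes."""
    digits = str(value).strip()
    return digits.zfill(5) if digits.isdigit() else digits

def trim(value):
    """Remove leading/trailing extra spaces, like OAC's single-click trim."""
    return value.strip()

print(fix_zip(2134))       # -> "02134" (leading zero restored)
print(trim("  Boston  "))  # -> "Boston"
```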

4. Filter Improvements – OAC allows users to establish filters at the dashboard level, which will apply to all the visualizations on the dashboard, as well as at the individual visualization level.  With the new 5.9 enhancements, users can now easily drag filters established at the visualization level to the dashboard level and vice versa.  This allows users to move more quickly as they analyze data sets.
https://bit.ly/3cqxZ1l
5. Frequent Itemset Analytics (Market Basket) with Data Flows – this enhancement is a great example of increased integration between Oracle Analytics Cloud and Autonomous Data Warehouse.  For many years the Oracle database has had advanced analytics like market basket analysis built into it; however, previously you had to be a database programmer to access these capabilities.  With the integration of OAC and ADW, business analysts can easily take advantage of the advanced analytics capabilities built into the Oracle database through a simple point-and-click interface in OAC called ‘Data Flows’.  With this enhancement a business analyst will launch the market basket analysis from within OAC but the actual processing will be done leveraging the power of the Oracle database (this is called ‘function shipping’).  The results of the market basket analysis are then available in OAC for analysis.
https://bit.ly/2NWLaND
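In OAC the frequent itemset computation is shipped to the database, but the underlying idea is easy to illustrate. The sketch below is a naive pure-Python pair-counting version, not the database's algorithm; the baskets and support threshold are made up.

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions: items bought together in one basket.
baskets = [
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "milk", "butter"},
]

# Count how often each pair of items co-occurs across baskets.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Pairs seen in at least 3 baskets ("frequent" at support >= 3).
frequent = {pair for pair, n in pair_counts.items() if n >= 3}
print(frequent)  # {('bread', 'butter')}
```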
6. Ability to Visualize Oracle Machine Learning Model Metadata in OAC – this enhancement is another great example of leveraging the increased integration between ADW and OAC.  First a little background.  With Oracle Machine Learning, Oracle has built into the database over 30 machine learning algorithms that can be used by professional data scientists to create machine learning (ML) models (https://www.oracle.com/data-science/machine-learning/).  With this enhancement available in OAC 5.9, machine learning models created in the Oracle database can now be registered for use within OAC.  Additionally, all the metadata related to those ML models can now be visualized in OAC through the standard OAC interface.
https://bit.ly/3tfuMrC
7. Explain Predictions on Your Data Using OAC Machine Learning Model Output Options – being able to explain the predictions made by machine learning models has emerged as a critical requirement.  When machine learning models developed in the Oracle database are processed, they create output that explains how the prediction was arrived at for each and every record (i.e., what attributes contributed to the prediction with the weighting factor identified).  This enhancement allows that output to be automatically available for analysis in OAC.  This puts business analysts in the position of being able to explain how a model arrived at its predictions (rather than just saying “the model said so”).
https://bit.ly/3r5hNa3
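To make the idea of per-record prediction explanations concrete, here is a minimal sketch using a hypothetical linear model, where each attribute's contribution is simply its weight times its value. This illustrates the concept only; the feature names and weights are invented, and Oracle's actual prediction-detail output has its own format.

```python
# Hypothetical linear model: prediction = intercept + sum(weight * feature).
weights = {"income": 0.5, "age": -0.2, "tenure": 0.3}
intercept = 1.0

def explain(record):
    """Return this record's prediction plus each attribute's contribution,
    so an analyst can see *why* the model predicted what it did."""
    contributions = {name: weights[name] * value for name, value in record.items()}
    prediction = intercept + sum(contributions.values())
    return prediction, contributions

pred, contrib = explain({"income": 4.0, "age": 5.0, "tenure": 2.0})
print(pred)     # 2.6
print(contrib)  # income contributes +2.0, age -1.0, tenure +0.6
```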
8. Add New Map Backgrounds Using Web Map Service (WMS) or XYZ Tile Maps – with this enhancement it is now possible to add new map backgrounds from the Console of Oracle Analytics Cloud (you need to have access to the Console).  Spatial analytics can add very important context to visualizations and the map background can bring visualizations to life.
https://bit.ly/3t99CLP
9. Increased Limits on the Number of Rows Returned for Queries – great to see the power of OAC increasing. The maximum number of rows returned when you query data for visualizations, analyses, and dashboards has been increased.

10. Improved Progress Bar Behavior – the OAC interface and user experience continues to be enhanced.  Previously when data was refreshed a horizontal blue bar would appear across the screen.  Now a small circle in the upper right corner indicates data refresh progress (like when you download an app on a mobile phone).  It might seem like a small thing but it is more subtle and improves the overall visual experience.

We hope you enjoy these new features of OAC 5.9!

Turbocharge Your ERP Analytics with Fusion Analytics Warehouse
Myles Gilsenan, Perficient Blogs | Sat, 24 Oct 2020

In late 2019 Oracle released Fusion Analytics Warehouse (FAW). FAW is built on Oracle Analytics Cloud (OAC) and powered by Oracle Autonomous Data Warehouse (ADW). It provides personalized application analytics, benchmarks, and machine learning-powered predictive insights across all line-of-business job functions and business processes for Oracle Cloud Applications. It consists of prebuilt analytic applications that include a data pipeline, data warehouse and prebuilt KPI’s, metrics, reports, and dashboards. FAW is purpose-built for Oracle’s Cloud SaaS applications: ERP Cloud, HCM Cloud, SCM Cloud, and CX Cloud.  The first two modules of FAW have been released: Fusion ERP Analytics and Fusion HCM Analytics.  Additional modules for SCM Cloud and CX Cloud will follow.

In today’s data-driven world, no process can be considered complete or fully optimized without addressing the analytics related to the process. Although FAW can be deployed at any time – either during the ERP Cloud implementation or after – there are significant benefits to be realized from deploying FAW concurrent to the ERP Cloud deployment. Some of these benefits are:

  1. Improved design for ERP Cloud – especially in the areas of Chart of Accounts and flexfield design
  2. Improved ROI of ERP Cloud deployment through better analytics
  3. Increased visibility of ERP Cloud deployment by providing key metrics, KPI’s, and dashboards to senior executives and business managers
  4. Take advantage of the machine learning and predictive analytics available within Fusion ERP Analytics
  5. Leverage economies of scale from common team members for ERP and Analytics
  6. Distribute critical KPI’s and metrics via mobile devices
  7. Create a data foundation for future analytics and leveraging of machine learning

Fusion Analytics Warehouse Goes Beyond Transactional Reporting

The modules of FAW complement Oracle Transactional Business Intelligence (OTBI, which is part of ERP Cloud) and address the need for analytics beyond transactional reporting. Whereas OTBI is focused on transactional reporting, with limited historical reporting and small volumes of data sourced from Oracle Fusion apps only, Fusion ERP Analytics is geared towards strategic and advanced analytics, supporting deep historical analysis, non-Oracle Cloud data sources, and large volumes of data. Fusion ERP Analytics helps executives and decision-makers improve business performance and gives analysts the tools to uncover hidden insights.

Fusion ERP Analytics includes 15 subject areas and over 100 data warehouse tables covering General Ledger, Accounts Payable and Accounts Receivable. It comes with over 50 Financial KPI’s that are prebuilt to leverage the data in your ERP Cloud system. With Fusion ERP Analytics you can do the following:

  1. Quickly access critical Financial KPI’s and metrics with comparison to target values
  2. Leverage machine learning to drive alerts and discover relationships among metrics
  3. Analyze historical trends
  4. Gain insights into drivers of profitability
  5. Understand your cost structure and expenditure patterns
  6. Automatically gain access to key Financial ratios like ROE, ROA, Debt to Equity, Current Ratio and more
  7. Extend existing KPI’s and metrics or define your own KPI’s
  8. Drill directly from KPI’s and metrics into underlying detail
  9. Take immediate action on insights by linking back into the Cloud SaaS applications to execute transactions (e.g., purchase more inventory)
  10. Integrate non-Oracle Cloud data sources for true enterprise-level analytics

Oracle ERP Cloud and Fusion ERP Analytics represent a powerful combination delivering an integrated platform for flawless business process execution and deep, machine-learning powered analytics to drive improved business performance and continuous improvement. Perficient’s credentialed Oracle expertise and team of highly skilled implementation specialists can guide you on your journey to world-class business process execution and analytics.

Top 10 Things You Didn’t Know About Data & Analytics in the Oracle Cloud
Myles Gilsenan, Perficient Blogs | Thu, 15 Oct 2020

The world of modern data and analytics continues to evolve and is very exciting. The change really began in earnest about 10 years ago with the introduction of Hadoop and big data processing. Suddenly corporations could analyze much larger data sets than before and could extract insights from data that could transform companies and industries. While this explosion of data use cases started on premises, it is most certainly migrating to the Cloud as the primary platform.

Oracle already had a world-class database. Over the last several years Oracle has upgraded its Oracle Cloud infrastructure. They started from scratch and rebuilt their Cloud, improving on the lessons learned from other public cloud providers. They also quietly built out a robust set of services to support any and all use cases related to data and analytics.

Outlined below are the Top 10 Things You Didn’t Know about Data and Analytics in the Oracle Cloud:

1. Full Data Lake Capability – Either Hadoop-based or Object Storage-based – it is easy to quickly provision a data lake in the Oracle Cloud using either Hadoop/HDFS or Object Storage as the primary storage mechanism.

Oracle Big Data Service  – click here for more information about setting up a Hadoop-based data lake on Oracle Cloud Infrastructure (OCI)

Object Storage-based Data Lake – this is a recent YouTube video from Oracle demonstrating how to set up a data lake on OCI using object storage

2. Data Catalog – Oracle provides a data catalog service to allow easy access to all your data – regardless of location.  Whether it is in a data lake or a data warehouse, structured or unstructured, in a relational database, in object storage or in Hadoop, the data catalog can help you keep track of your data assets.

OCI Data Catalog – this is a recent YouTube video from Oracle describing use cases for the OCI Data Catalog and how to set it up

3. Support for Streaming Data – the Oracle Cloud supports streaming data use cases via the OCI Streaming service and Kafka Connect.  Perhaps you want to stream data from social media to perform sentiment analysis; take in machine sensor data in real time to perform diagnostics and run machine learning models for predictive maintenance; or, as a financial services company, analyze high-volume transactions in real time for fraud detection.  OCI Streaming and Kafka Connect support these use cases and many more.  The OCI Streaming service is fully managed, so companies don’t have to worry about the complexity and operational burden of running all their data streams.

OCI Streaming Service and Kafka Connect – an excellent Oracle blog on use cases, setup, and benefits of OCI Streaming and Kafka Connect

Demo of Setting up OCI Streaming – a short, recent YouTube video explaining OCI Streaming with a demo on how to set it up (non-Oracle video)
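OCI handles the streaming plumbing, but the kind of real-time screening described above can be sketched with a toy example. The sliding-window threshold below is a hypothetical, deliberately simplified stand-in for a real fraud model consuming a Kafka/OCI stream.

```python
from collections import deque

def stream_fraud_flags(transactions, window=3, factor=3.0):
    """Flag amounts far above the recent rolling average, as a simple
    fraud screen might do over a live transaction stream (illustrative only)."""
    recent = deque(maxlen=window)
    for amount in transactions:
        # Flag once we have a full window of history to compare against.
        if len(recent) == window and amount > factor * (sum(recent) / window):
            yield amount  # suspiciously large vs. recent history
        recent.append(amount)

# Simulated feed: steady small payments, then one outlier.
feed = [20, 25, 22, 21, 400, 23]
print(list(stream_fraud_flags(feed)))  # [400]
```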

4. Serverless Spark Service – OCI Data Flow is a fully managed, serverless Spark service that lets you run Apache Spark applications with no infrastructure to deploy or manage.  You can run Spark jobs against your data in Hadoop or Object Storage without worrying about provisioning a server, and you pay only for what you use.

OCI Data Flow Service – recent YouTube video explaining the OCI Data Flow service

5. Big Data SQL Cloud Service Allows SQL Query Access Regardless of Underlying Storage – perhaps you have data in an object storage-based data lake, some data in Hadoop/HDFS, some data in a NoSQL database, and some data in a relational data warehouse, and you want to query across all those data sets using SQL.  Oracle Big Data SQL Cloud Service supports exactly that.

Oracle Big Data Cloud SQL – this is Oracle documentation on using the Big Data Cloud SQL service

6. World Class Cloud Data Warehouse – you have almost certainly heard of Snowflake (if only for its recent IPO) and you may have heard that Cloud data warehouses are a hot technology category.  You may not be aware that Oracle has a world-class cloud data warehouse called ‘Autonomous Data Warehouse’ (ADW).  It is a full-blown Oracle autonomous database that has been optimized for analytic workloads.  For instance, the data is stored in a columnar format on disk to support high-performance analytic processing.  ADW can be provisioned easily, you pay for what you use, it runs on Exadata machines, and it supports autoscaling.

Autonomous Data Warehouse Technical Deep Dive – recent Oracle YouTube video discussing the technical differentiators of ADW

7. Machine Learning Capability Built into the Database – Oracle’s autonomous database includes 30+ machine learning algorithms that can be modified using Python or R.  Oracle’s mantra in this area is “move the algorithms, not the data”.  Previously, it was necessary to separately purchase the ‘Advanced Analytics’ option to access the machine learning capabilities of the Oracle database, but that is no longer necessary – all the machine learning, data mining, and advanced analytics capabilities come with the base license/subscription for the Oracle database.

Machine Learning in the Oracle Database – recent YouTube video from Oracle explaining how machine learning works in the database, including how to use the built-in notebook feature

Machine Learning in the Oracle Database – Short Summary – this is a 3-minute YouTube video that quickly summarizes the basics of Oracle Machine Learning in the database

8. Data Science Platform for Professional Data Scientists – Does your company have an in-house team of professional data scientists whose job it is to extract value from the vast amount of data in the data lake and data warehouses?   The Oracle Cloud includes a data science platform with the tools and frameworks most used by professional data scientists.  This platform also focuses on deploying and operationalizing ML models, including ongoing tuning of the models.

OCI Data Science Platform – this is a playlist of 5 short videos explaining how to set up and use the OCI Data Science platform

9. Oracle Data Integrator is Free in the Oracle Cloud Marketplace – Oracle Data Integrator (ODI) is a top-rated data integration and ETL platform.  It is used by some of the largest companies for their most complex ETL tasks.  ODI is on Oracle’s strategic roadmap and continues to be enhanced and supported.  ODI is currently free on the Oracle Cloud Marketplace; there is no license or subscription cost.  You pay only for the Oracle Cloud compute that ODI consumes (and compute is very inexpensive in the Oracle Cloud – e.g., running a standard VM with 2 OCPU’s for 10 hrs/day will cost about $40/month, or about $480 per year).

Oracle Data Integrator on the Oracle Cloud Marketplace
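The cost arithmetic in the ODI example above can be sanity-checked in a few lines. This is an illustrative sketch only; the per-OCPU-hour rate below is backed out from the article's own $40/month figure, not an official Oracle price.

```python
# Rough cost model for the ODI compute example in the text.
# Assumed rate: backed out from $40/month for 2 OCPUs at 10 hrs/day
# over a 30-day month (roughly $0.067 per OCPU-hour).
RATE_PER_OCPU_HOUR = 40 / (2 * 10 * 30)

def monthly_cost(ocpus, hours_per_day, days=30, rate=RATE_PER_OCPU_HOUR):
    """Estimated monthly compute cost in dollars."""
    return ocpus * hours_per_day * days * rate

print(round(monthly_cost(2, 10)))       # 40  dollars/month, as in the text
print(round(monthly_cost(2, 10) * 12))  # 480 dollars/year
```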

10. Prebuilt Analytics Leveraging Oracle Cloud SaaS Applications – this is a differentiator between Oracle and the other public cloud providers.  Unlike its competitors, Oracle has top-rated Cloud applications for ERP, Supply Chain Management (SCM), Human Capital Management (HCM), and Customer Experience (CX).  Oracle has developed “Fusion Analytics Warehouse” (FAW), a set of prebuilt analytic applications that run in the Oracle Cloud and work with Oracle’s Cloud SaaS applications.  Oracle has prebuilt a data pipeline to extract data from the Cloud SaaS applications into a Cloud-based data warehouse, along with prebuilt KPI’s, reports, and dashboards.  Fusion ERP Analytics was one of the first modules of FAW to be released; it works with Oracle’s Cloud ERP SaaS application.  For more information on Fusion ERP Analytics, please see my blog titled “Best Practices for Implementing Oracle Fusion ERP Analytics”.

What is Fusion Analytics Warehouse – this is a YouTube video from Oracle that introduces and explains Fusion Analytics Warehouse

Perficient’s Oracle Analytics practice is a team of seasoned, dedicated and passionate data and analytics professionals. They have worked with numerous clients to successfully extract value from their data, helping them become data-driven organizations.

Best Practices for Implementing Oracle Fusion ERP Analytics
Myles Gilsenan, Perficient Blogs | Tue, 29 Sep 2020

Oracle Fusion Enterprise Resource Planning (ERP) Analytics is a module of Oracle Fusion Analytics Warehouse (FAW).  FAW was formerly known as ‘Oracle Analytics for Applications’ (OAX).  This name change is permanent, and going forward we will use the following terms:

  • Fusion Analytics Warehouse (FAW) – this refers to the full suite of analytic applications that Oracle has developed to be used in conjunction with Oracle Cloud applications
  • Oracle Fusion ERP Analytics – this refers specifically to the module of FAW that pertains to Oracle Cloud ERP
  • Oracle Fusion Human Capital Management (HCM) Analytics – this refers to the module of FAW that pertains to Oracle Cloud HCM.

The modules of Fusion Analytics Warehouse have been developed by Oracle to work with Oracle Cloud SaaS applications – aka Fusion Applications (ERP, Supply Chain Planning/SCM, HCM, Customer Experience/CX).  Each module of FAW developed by Oracle contains the following components:

  1.  A data pipeline to extract data out of Oracle Cloud applications and load it into the FAW data warehouse
  2.  A prebuilt data warehouse data model that is instantiated in Oracle’s Autonomous Data Warehouse in the Oracle Cloud
  3.  Prebuilt report metadata (i.e., an RPD) for the KPI’s, reports and visualizations that come with FAW
  4.  Prebuilt KPI’s, reports and visualizations.

Here is the URL for Fusion Analytics Warehouse for further information:

https://www.oracle.com/business-analytics/fusion-analytics.html

Some Important Points to Remember About Fusion Analytics Warehouse

Oracle has a long history of success with packaged analytics applications.  They are continuing that tradition with FAW but with some important differences enabled by the Oracle Cloud platform.

  1. Oracle owns the data pipeline – yes, Oracle is taking responsibility for the error-free functioning of the process of extracting data from Cloud ERP and loading it into FAW.  If you set up descriptive flexfields in Cloud ERP, they will be included in the extract.  Oracle is on the hook to make sure the data is fed correctly into FAW and that the delivered KPI’s, reports and visualizations work properly.
  2. FAW modules are fully extensible and customizable – it is possible to implement customizations at every layer of the FAW technology stack:
    • database
    • ETL (for adding non Oracle data sources or extending the Oracle-provided fact and dimension tables)
    • RPD
    • KPI’s, reports and visualizations.
  3. Never do an upgrade again – Oracle has taken on the responsibility to ensure that modules of FAW continue to work after updates and enhancements are made to the Oracle Cloud source applications.  Similarly, updates and enhancements made to FAW will be compatible with Oracle Cloud applications.  You might be thinking to yourself “Ok, that is great news actually – but you just said FAW was customizable.  What about the customizations that I make?  What happens to them when Oracle updates the Oracle Cloud source application? Will my customizations stop working?”.  The short answer is your customizations will continue to work.  The reason is that Oracle has provided a framework and set of wizards through which customizations to FAW will be made.  By forcing FAW customizations to be made in a certain manner, Oracle is able to guarantee that your customizations will survive upgrades to either the Oracle Cloud source applications or FAW itself.
  4. Focus on Key Performance Indicators (KPI’s) – with FAW, Oracle is focusing on KPI’s in addition to reports and dashboards.  This is helpful because the KPI’s allow instant access to critical information about the core underlying financial data without users having to do the calculations themselves.  Each KPI ‘card’ has a built-in data element for a user-defined target, and the card will indicate progress against the target to provide even more context.  In FAW, KPI cards are combined into ‘Decks’ (which you can think of as dashboards – see the image below, which contains a Deck of KPI Cards).
  5. FAW is NOT the latest version of Oracle BI Applications – those that are familiar with Oracle BI Applications may be wondering if there is any relationship between FAW and BI Apps.  The short answer is no – FAW is a completely separate product line built to work with Oracle Cloud SaaS applications.  There is no upgrade path from BI Apps to FAW.  If you are running BI Apps and would like further information on your options, please review my whitepaper titled “A CIO’s Guide – Options for Existing Customers of BI Applications”.

KPI’s Prebuilt in Oracle Fusion ERP Analytics – Using Cloud ERP Data

Outlined below are the KPI’s prebuilt into Fusion ERP Analytics:

[Image: KPI's prebuilt in Fusion ERP Analytics]

Implementing Oracle Fusion ERP Analytics

Oracle Fusion ERP Analytics can be implemented into production in about 10 weeks.   The KPI’s and metrics in FAW are based on configurations and decisions made during the implementation of Oracle Cloud ERP.   It is therefore important to make those Cloud ERP configuration decisions with an understanding of the KPI’s and metrics that are required as analytic output.  Perficient has developed a methodology that provides a roadmap for simultaneous deployment of Cloud ERP and Fusion ERP Analytics – highlighting and emphasizing the important areas for coordination.

When implementing Oracle Cloud ERP it is important to pay attention to the following values and how they are defined as they will impact the deployment of Fusion ERP Analytics.

  1. Financial Categories in Cloud ERP – The metrics for Fusion ERP Analytics are driven primarily by Financial Categories established in Fusion Cloud ERP.  Although simplifying a bit, GL natural accounts are mapped into Financial Categories and then those categories are used to calculate the metrics in Fusion ERP Analytics.  It is important to properly define the Financial Categories in Cloud ERP to ensure the KPI’s and metrics in Fusion ERP Analytics are accurately calculated.
  2. Chart of Accounts Design – Segments and Values – during the load of data from Cloud ERP to Fusion ERP Analytics, all the segments and values of the Chart of Accounts (COA) are fed into the FAW data warehouse and are available for use in calculating and analyzing KPI’s and metrics. The COA must be designed with an eye towards the metrics that will be calculated and tracked in Fusion ERP Analytics.
  3. Financial Hierarchies in Cloud ERP – Financial hierarchies established in Fusion Cloud ERP can also be used to drive the calculation of KPI’s and metrics in Fusion ERP Analytics.  When defining the Chart of Accounts (COA) to be used in Cloud ERP, special attention should be paid to the hierarchies for each segment of the COA.  The hierarchies defined in Cloud ERP for the COA segments will have a direct impact on the analytic views that can be obtained from Fusion ERP Analytics.  While the hierarchies for all COA segments are important, the most important segment for calculation of KPI’s and metrics is the natural account segment.  We suggest the hierarchy for the natural account segment should not be finalized until it has been reviewed and validated in light of its ability to support the KPI’s and metrics in Fusion ERP Analytics.  A best practice is to build the hierarchy where all the financial statement line items are on the same level of the hierarchy.
  4. Descriptive and Extensible Flexfields in Cloud ERP (DFF’s and EFF’s) – in order for DFF’s and EFF’s to be available in Fusion ERP Analytics, they must be BI-enabled during deployment of ERP Cloud.   This is an option in Cloud ERP that must be selected.

Perficient’s Oracle Analytics professionals have deep expertise in leveraging the full portfolio of Oracle Analytics solutions to extract value from data and transform companies into data driven organizations.

Why Fusion Analytics Warehouse (FAW) is the Platform for the New Norm of Today’s World
Myles Gilsenan, Perficient Blogs | Sat, 18 Apr 2020

In today’s data-driven world, speed and agility can be the deciding factor for survival in the marketplace and we know that data and analytics are the enablers of agility.  No longer can companies afford to maintain data and analytics platforms that can be outdated shortly after they are deployed.  Time cannot be spent on time-consuming and expensive upgrades. The modern data and analytics platform needs to be a living, breathing organism that is continuously improved, does not require traditional upgrades, and can rapidly respond to changes in analytical needs (the COVID-19 situation of our present-day has made that point in a big way).  Oracle has built such a platform in the Cloud-based Fusion Analytics Warehouse (FAW) suite.

For those that are not aware, FAW is a Cloud-based data and analytics platform developed by Oracle, leveraging Oracle Autonomous Data Warehouse (ADW) and Oracle Analytics Cloud (OAC), that is built to work ‘hand-in-glove’ with Oracle’s Cloud SaaS applications like Enterprise Resource Planning (ERP) Cloud, Human Capital Management (HCM) Cloud, Supply Chain Management (SCM) Cloud and Customer Experience (CX) Cloud.

FAW consists of a suite of analytics modules for ERP, HCM, SCM, and CX Cloud.  Each module includes the following:

  1. a prebuilt data pipeline to extract data from the Oracle Cloud SaaS application
  2. a prebuilt database data model
  3. prebuilt semantic layer for support of analytics and reporting
  4. prebuilt metrics and KPI’s
  5. prebuilt reports and dashboards.

Please see the link below for information from Oracle’s website:

https://www.oracle.com/business-analytics/fusion-analytics.html

Important Benefits of Fusion Analytics Warehouse (FAW):

  1. Extensions and Customizations are Expected in FAW – While Oracle is providing about 50 ‘out of the box’ metrics, Oracle does not know exactly how each company looks at its business, calculates financial metrics, analyzes its supply chain, etc. Oracle is, therefore, providing easy-to-use tools to extend and customize the out-of-the-box data model, semantic layer, metrics, and reports.
  2. Pre-built Data Pipeline Saves Time and Cost – this is a huge time and cost saver! No need to worry about understanding all the tables and views in Oracle Cloud apps and how to extract data from them.  That has been done for you.
  3. Pre-built Data Model and Semantic Layer Shifts Focus to Analytics – another huge time saver! It allows you to worry about your metrics and the value your analytics can bring. No need to worry about complex modeling of core ERP, SCM, HCM, or CX data.
  4. FAW as Enterprise Data and Analytics Platform – FAW can be extended to include non-Oracle data sources. Those data sources can be loaded into a different schema in the same data warehouse as Oracle provides with FAW to store the data of Oracle Cloud SaaS apps.  Also, those non-Oracle data sources can share the same semantic model as Oracle Cloud SaaS apps for reporting and analysis. For companies where a good bit of their enterprise data is running through Oracle Cloud SaaS applications, this makes a lot of sense.
  5. Never Perform an Upgrade Again! – FAW is Cloud-based so Oracle can roll out updates and enhancements automatically without requiring customers to perform an upgrade process. You might ask “what if I have made customizations”? Oracle is providing wizards to allow your customizations to be made in a way that will survive upgrades!
  6. Machine Learning and Predictive Analytics Built In – FAW already includes machine learning and predictive analytics but because FAW is a Cloud-based platform, Oracle can and will continuously update the analytics modules with additional machine learning and predictive analytics capabilities.

As a Cloud-based data and analytics platform that is purpose-built for integration with Oracle Cloud SaaS apps, FAW allows companies to maximize their Cloud investment while also deploying a data and analytics platform that supports continuous improvement and that can quickly adapt to changes in analytical needs – without an expensive and time-consuming upgrade.

Perficient’s Oracle Analytics team has deep expertise in Oracle Analytics, including FAW, and has helped numerous companies define and deploy Oracle Cloud-based data warehouses and analytic systems.

Turbocharge Your ERP Analytics with Oracle Analytics for Applications
https://blogs.perficient.com/2020/02/18/turbocharge-your-erp-analytics-with-oracle-analytics-for-applications/
Tue, 18 Feb 2020 23:49:36 +0000

In late 2019 Oracle released Oracle Analytics for Applications (OAX). OAX is built on Oracle Analytics Cloud (OAC) and powered by Oracle Autonomous Data Warehouse (ADW). It provides personalized application analytics, benchmarks, and machine learning-powered predictive insights across all line-of-business job functions and business processes for Oracle Cloud Applications. It consists of prebuilt analytic applications that include a data pipeline, a data warehouse, and prebuilt KPIs, metrics, reports, and dashboards. OAX is purpose-built for Oracle’s Cloud SaaS applications: ERP Cloud, HCM Cloud, SCM Cloud, and CX Cloud. The first module released for OAX was Oracle Analytics for Fusion ERP (OAF), which focuses on financial analytics. Additional modules for HCM Cloud, SCM Cloud, and CX Cloud will follow.

In today’s data-driven world, no process can be considered complete or fully optimized without addressing the analytics related to that process. Although OAF can be deployed at any time – either during the ERP Cloud implementation or after – there are significant benefits to deploying OAF concurrently with the ERP Cloud deployment. Some of these benefits are:

  1. Improved design for ERP Cloud, especially in the areas of Chart of Accounts and flexfield design
  2. Improved ROI on the ERP Cloud deployment through better analytics
  3. Increased visibility of the ERP Cloud deployment by providing key metrics, KPIs, and dashboards to senior executives and business managers
  4. The ability to take advantage of the machine learning and predictive analytics available within OAF
  5. Economies of scale from common team members across ERP and Analytics
  6. Distribution of critical KPIs and metrics via mobile devices
  7. A data foundation for future analytics and machine learning

Oracle Analytics for Applications Goes Beyond Transactional Reporting

OAF complements Oracle Transactional Business Intelligence (OTBI, which is part of ERP Cloud) and addresses the need for analytics beyond transactional reporting. Whereas OTBI is focused on transactional reporting with limited historical reporting and works with small volumes of data sourced from Oracle Fusion apps only, OAF is geared towards strategic and advanced analytics, supporting deep historical analysis, non-Oracle Cloud data sources, and large volumes of data. OAF helps executives and decision-makers improve business performance and gives analysts the tools to uncover hidden insights.

OAF includes 15 subject areas and over 100 data warehouse tables covering General Ledger, Accounts Payable, and Accounts Receivable. It comes with over 50 Financial KPIs that are prebuilt to leverage the data in your ERP Cloud system. With OAF you can do the following:

  1. Quickly access critical Financial KPIs and metrics with comparison to target values
  2. Leverage machine learning to drive alerts and discover relationships among metrics
  3. Analyze historical trends
  4. Gain insights into drivers of profitability
  5. Understand your cost structure and expenditure patterns
  6. Automatically gain access to key financial ratios like ROE, ROA, Debt to Equity, Current Ratio, and more
  7. Extend existing KPIs and metrics or define your own
  8. Drill directly from KPIs and metrics into underlying detail
  9. Take immediate action on insights by linking back into the Cloud SaaS applications to execute transactions (e.g., purchase more inventory)
  10. Integrate non-Oracle Cloud data sources for true enterprise-level analytics

Oracle ERP Cloud and Oracle Analytics for Applications represent a powerful combination delivering an integrated platform for flawless business process execution and deep, machine-learning powered analytics to drive improved business performance and continuous improvement. Perficient’s credentialed Oracle expertise and team of highly skilled implementation specialists can guide you on your journey to world-class business process execution and analytics.

Oracle Named Visionary for Analytics and BI in Gartner Magic Quadrant
https://blogs.perficient.com/2020/02/11/oracle-named-visionary-for-analytics-and-bi/
Tue, 11 Feb 2020 20:50:38 +0000

It was recently announced that Oracle Analytics Cloud (OAC) has been named a Visionary in the 2020 Gartner Magic Quadrant for Analytics and Business Intelligence Platforms. This is great news, as it validates what we have been seeing in the marketplace for quite some time. It comes as no surprise to firms like Perficient, which have been helping clients design and deploy Oracle Analytics Cloud (and its Cloud-based precursor) for the last 4-5 years. We have seen many clients with very different analytical needs, from departmental to enterprise-level, get great benefits from OAC.

Best of Both Worlds: Enterprise Governance AND Self Service

For a while now, OAC has been the premier analytics platform for combining enterprise governance and control with world-class self-service capabilities and advanced visualizations. The analytics world has realized it is not “either/or” when it comes to enterprise governance and self-service – it is both. With OAC, it is possible to connect to any supported data source and immediately begin to query the data without any modeling. Advanced visualizations are available with one click and can be swapped in and out quickly by selecting from a palette of visualization choices. On the other hand, OAC’s enterprise semantic modeling capabilities make it possible to provide a curated and governed user experience for critical metrics where ‘single source of truth’ issues cannot be tolerated. The security model for OAC is very mature, allowing integration with on-premises or cloud-based single sign-on providers and the definition of permissions by role and by type of data.

Augmented Analytics and Machine Learning

With OAC, Oracle has embraced the concept of augmented analytics whereby machine learning is built into the platform to enhance and improve the capabilities of business analysts.  With a single click, the OAC platform will explain key drivers of attributes, identify outliers, provide recommendations for improving a data set or generate a forecast.  OAC also provides prebuilt machine learning algorithms that can be used for predictive analytics and other data science use cases without any coding.  In order to leverage data to drive competitive advantage, we are seeing many companies take advantage of OAC’s no code machine learning capabilities to improve their analytics and gain new insights without having to hire a full team of data scientists.

Mobile and Natural Language Query

According to many experts, the predominant user interface of the future for analytics will be our voice and, increasingly, we will be speaking to some form of mobile device. With OAC, the future is now, as it natively supports mobile devices without any additional setup. Natural language is also supported in OAC: you can speak naturally into your mobile device to request the analytics that you want. Additionally, OAC will generate a natural language description of any visualization with a single click (which can be very helpful with complex visualizations).

If you would like further information about some of the great features of OAC please see my recently published blog.

Perficient’s OAC-based Prebuilt Solution Templates

Having seen the value of OAC, Perficient decided to create a number of prebuilt solution templates that leverage both OAC and Oracle’s Autonomous Data Warehouse (ADW) as their foundation architecture.  OAC is tightly integrated with ADW and together they provide a comprehensive analytic solution at any scale – from departmental level to enterprise, petabyte-scale data warehouses.

Perficient’s prebuilt solution templates allow companies to get up and running quickly on OAC and provide speed to value to companies looking to improve their analytics without long custom development project cycles.  Prebuilt solution templates based on OAC and ADW have been created for the following functional areas:

  1. Finance
  2. HR
  3. Sales
  4. Service
  5. Marketing
  6. Procurement
  7. Supply Chain

Whatever your data and analytics environment looks like today, you can be sure Perficient has seen it before. We can help you identify your critical KPIs and metrics, untangle your data integration issues, and develop your future state data and analytics roadmap.

Using No Code ML in Oracle Analytics Cloud to Predict Housing Prices
https://blogs.perficient.com/2020/01/15/using-no-code-ml-in-oracle-analytics-cloud-to-predict-housing-prices/
Wed, 15 Jan 2020 19:31:37 +0000

First, a quick summary of machine learning (ML). At a high level, and simplifying a bit, there are two basic types of ML:

  • Supervised learning – a labeled data set is used to train an ML model to make predictions. The ‘trained’ ML model is then applied to a data set to make the predictions it has been trained to make. There are two types of supervised learning: (1) classification, where a non-numeric prediction is made (e.g., whether a person will leave the company), and (2) regression, where the prediction is a value on a continuum (e.g., housing prices).
  • Unsupervised learning – the data that is run through the ML model is not labeled. The ML model is used to find patterns and clusters in the data that otherwise would be very hard to detect.
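To make the distinction concrete, here is a minimal scikit-learn sketch of all three cases (regression, classification, and clustering). The data is a tiny made-up example (house square footage), not the Boston data set used in the walkthrough below.

```python
# A minimal sketch of the ML flavors described above, using scikit-learn
# with tiny made-up data (square footage of a house as the single input).
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeClassifier

X = [[1200], [1500], [2000], [2400]]                 # house square footage

# Supervised learning, regression: labels are values on a continuum.
y_price = [150_000, 190_000, 250_000, 300_000]
reg = LinearRegression().fit(X, y_price)
predicted_price = reg.predict([[1800]])              # predict an unseen house

# Supervised learning, classification: labels are non-numeric categories.
y_churn = ["stays", "stays", "leaves", "leaves"]
clf = DecisionTreeClassifier(random_state=0).fit(X, y_churn)

# Unsupervised learning: no labels; the model finds structure on its own.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```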

In today’s blog post I will demonstrate how to use the machine learning capability in Oracle Analytics Cloud to predict housing prices. This is an example of supervised learning and regression, because the ML model will be trained using a data set with housing prices (i.e., a labeled data set). I will proceed step by step so you can follow along and try it yourself.

At a high level, here are the steps we will cover:

  1. Obtain the labeled training data set for housing prices, make a couple of small modifications, and upload it to OAC.
  2. Use the labeled training data set to train a numeric prediction ML model that is provided with OAC.
  3. Evaluate the trained numeric prediction ML model and analyze the drivers of prediction.
  4. Apply the trained ML model to a housing data set to predict housing prices.
  5. Analyze the predicted housing prices using OAC.

Step 1: Obtain the labeled training data set, make two small modifications, and upload it to OAC

For this exercise, we will use a publicly available data set of Boston housing prices.  This data set is available for download from Kaggle:

https://www.kaggle.com/puxama/bostoncsv/data

The Kaggle site does not include a description of what the columns mean.  You can use the URL below to get a description of what each column in the Boston housing data set means:

http://math.furman.edu/~dcs/courses/math47/R/library/mlbench/html/BostonHousing.html

Before uploading to OAC, we will make the following slight changes in Excel:

  1. We will add the header of ‘House ID’ to column A.

  2. The column titled ‘medv’ (i.e., median value) has been rounded to thousands in the downloadable file.  We will multiply that column by 1000 to remove the rounding.
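For readers who prefer code to Excel, the same two tweaks can be sketched in pandas. The inline CSV below is a tiny made-up stand-in for the downloaded Boston.csv file (an unnamed leading row-ID column, with ‘medv’ expressed in thousands).

```python
# The two Excel tweaks from Step 1, sketched in pandas instead. The inline CSV
# is a tiny made-up stand-in for the Kaggle Boston.csv file.
from io import StringIO

import pandas as pd

raw = StringIO(
    "id,crim,rm,ptratio,lstat,medv\n"
    "1,0.00632,6.575,15.3,4.98,24.0\n"
    "2,0.02731,6.421,17.8,9.14,21.6\n"
)
df = pd.read_csv(raw)
df = df.rename(columns={df.columns[0]: "House ID"})  # 1. add the 'House ID' header
df["medv"] = df["medv"] * 1000                       # 2. undo the 000's rounding
```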

To upload the file to OAC, click on the Create button in the upper right-hand corner, then click on ‘Data Set’.  Find the file on your hard drive and upload it to OAC.

Create Data Set To Upload File

Next, we will change House ID to be treated as an attribute and not a measure so we can report against it.  After initially uploading your file, simply click on the House ID column and then, on the left-hand side where it says ‘Treat As’, change the value to attribute.  Then click ‘Add’ to add the file as a data set.

At this point, the Boston housing data set is available to be used as a training data set for our machine learning predictive model.

Step 2: Train the machine learning predictive model using the Boston housing dataset

To train the ML model in OAC we need to create a data flow.  So click on the ‘Create button’ in the upper right-hand corner and then click ‘Data Flow’.  You will be presented with a screen that asks you to pick a data set to be used by the data flow.  Please pick the Boston housing data set that you just uploaded.  After selecting your uploaded Boston housing data set, you will be presented with a screen like the one below:

Data Flow First Screen For Training Model

Now it is time to select the model that we will train.  Use the scroll bar on the left to scroll down to see the options available for machine learning models.  Click on ‘Train Numeric Prediction’ and drag it next to the blue ‘Boston Housing’ data set symbol.  You will see a green plus sign and then you will get the screen below:

Selecting The Numeric Prediction Model To Use

Let’s select ‘Linear Regression’ for model training.  Click OK.

The next step is important. This is where we pick the target column which is the column we want to predict. Click on ‘Select a column’ next to ‘Target’. Select ‘medv’ from the columns displayed. ‘medv’ stands for ‘Median Value’ and this is the value we want to predict. For the other parameters, you can leave the default values. As you scroll down you will see the value called ‘Train Partition Percent’ has defaulted to 80. This is very common and means that 80% of the data in the Boston housing data set will be used to train the model. The remaining 20% will be used to test the model (i.e., in terms of predicting housing prices).

Selecting The Target Column
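Under the hood, the ‘Train Numeric Prediction’ step with Linear Regression and an 80% train partition is conceptually equivalent to the following scikit-learn sketch. The data is synthetic (only the rm and lstat columns are imitated), since the real file and OAC’s internals are not reproduced here.

```python
# A conceptual equivalent of OAC's 'Train Numeric Prediction' node:
# linear regression with an 80% train partition, on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 100
rm = rng.uniform(4, 9, n)        # average number of rooms
lstat = rng.uniform(2, 30, n)    # % lower-status population
medv = 5000 * rm - 600 * lstat + rng.normal(0, 2000, n)  # made-up target

X = np.column_stack([rm, lstat])
# 'Train Partition Percent' = 80: train on 80% of rows, hold out 20% for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, medv, train_size=0.80, random_state=0
)
model = LinearRegression().fit(X_train, y_train)     # the 'trained' ML model
```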

Next, we will click on the ‘Save Model’ symbol in the data flow.  We will be prompted to give the model a name.  Please call the model whatever you like.  Next, we need to save the data flow before we can run it to train the model.  Click on ‘Save’ in the upper right-hand corner and give the data flow whatever name you like.  Then click on ‘Run Data Flow’ (which is right next to ‘Save’) to train the model and to create a model that can then be applied to other data sets.  To see the model that you just created, click on the ‘hamburger’ in the upper left corner and select Machine Learning from the drop-down.  I called my model ‘Numeric Prediction model based on the Boston housing data set’.

ML Model

Step 3: Review the trained numeric prediction model.  Analyze the key drivers

On the screen above, select your model and go to the right-hand side.  You will see an ‘Actions menu’ appear.  Click on the ‘Actions menu’ and select ‘Inspect’ to evaluate the model.  You will see the following screen:

Model Evaluation Screen

Select ‘Quality’ to analyze the accuracy of the model.  On the screen below, we see that the Coefficient of Determination (R squared) is 70%, which is generally considered good.

Coefficient Of Determination
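For reference, the Coefficient of Determination can be computed by hand. This sketch shows the formula with made-up actual and predicted prices, and checks the result against scikit-learn’s r2_score:

```python
# What the Coefficient of Determination (R squared) measures: the share of
# variance in the target that the model explains. Values below are made up.
import numpy as np
from sklearn.metrics import r2_score

y_actual = np.array([24000, 21600, 34700, 33400, 36200], dtype=float)
y_pred = np.array([25000, 22000, 31000, 30500, 35000], dtype=float)

ss_res = np.sum((y_actual - y_pred) ** 2)           # residual sum of squares
ss_tot = np.sum((y_actual - y_actual.mean()) ** 2)  # total sum of squares
r2_manual = 1 - ss_res / ss_tot                     # R^2 = 1 - SS_res / SS_tot
r2_sklearn = r2_score(y_actual, y_pred)             # same number from sklearn
```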

Now let’s take a look at what the main drivers are for the prediction of housing prices according to this model.  Click on ‘Related’ to get the screen below:

Related Screen

On the screen above, we will click on the first set of generated data, called “Numeric Prediction model based on the Boston housing dataset. Drivers”.  That will bring up the data set generated by the model creation process that outlines the main drivers.  Click on ‘Visualize’ in the top right-hand corner to analyze the data.  Hold down CTRL, select ‘Driver Name’, ‘Coefficient’ and ‘Correlation’, and drag them onto the visualization canvas.  If you select a visualization type of vertical bar chart, you will see the following main drivers (sorted by Coefficient, high to low):

Main Drivers Of Housing Price Prediction

The number of rooms (rm) and whether the house is on the Charles River or not (chas) are strongly positively correlated with the housing price (as these go up so does the price of the house).  On the other side, we can see that the pupil-teacher ratio (ptratio) and % of the population of lower status (lstat) are negatively correlated with housing prices.
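The ‘Drivers’ data set pairs each input column with a coefficient and a correlation. Here is a rough sketch of how such values arise, using synthetic stand-ins for the rm and ptratio columns (the numbers are made up; only the signs mirror the findings above):

```python
# A rough sketch of what the generated 'Drivers' data set holds: for each
# input column, a regression coefficient and its correlation with the target.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
rm = rng.uniform(4, 9, n)          # rooms: should push prices up
ptratio = rng.uniform(12, 22, n)   # pupil-teacher ratio: should push prices down
medv = 6000 * rm - 1000 * ptratio + rng.normal(0, 1500, n)

model = LinearRegression().fit(np.column_stack([rm, ptratio]), medv)
drivers = {
    "rm": {"coefficient": model.coef_[0],
           "correlation": np.corrcoef(rm, medv)[0, 1]},
    "ptratio": {"coefficient": model.coef_[1],
                "correlation": np.corrcoef(ptratio, medv)[0, 1]},
}
```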

Step 4: Apply the machine learning predictive model we just trained to a data set to predict housing prices

Now we are ready to apply our trained ML model to predict housing prices. One thing to keep in mind is that the data set to which the ML model will be applied needs to have the same inputs (i.e., columns) as the trained ML model. To achieve this, we will apply our trained ML model to the original Boston housing data set.

To apply an ML model we need to create a data flow in OAC.  Click on ‘Create’ in the upper right-hand corner and then click on ‘Data Flow’.  Click on your Boston housing data set to add it to the data flow:

Apply Model Add Data Set

Use the scroll bar on the left-hand side to scroll down until you see ‘Apply Model’.  Drag ‘Apply Model’ onto the plus sign to the right of the Boston Housing data set.  Select the machine learning model we just created and trained (in my case, the model called “Numeric Prediction model based on the Boston Housing data set”).  Select OK.

Selecting Model To Apply It

Now we need to give the column that will contain the predicted value a name and also save the data set that will be created when we run the model.  The column name for the predicted value defaults to “PredictedValue” and we will leave it that way.  Use the scroll bar on the left to scroll back up until you see ‘Save Data Set’.  Drag ‘Save Data Set’ next to the ‘Apply Model’.  Give the new data set a name, save it to data set storage and change House ID to be treated as an attribute.

Save The Predictedvalue Data Set

Lastly, we need to save the data flow and then run it to create the new data set with the predicted value.  As we did before, please click on ‘Save’ in the upper right-hand corner and give the data flow whatever name you like. Then click on ‘Run Data Flow’ to apply the model to the Boston housing data set to predict housing prices. This will create a data set that we will analyze in the next step.
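Conceptually, the ‘Apply Model’ node does something like the following pandas/scikit-learn sketch: score every row of the data set with the trained model and store the result in a new ‘PredictedValue’ column. The data is synthetic; only the column names follow the walkthrough.

```python
# A conceptual sketch of the 'Apply Model' node: score each row with the
# trained model and write the result to a 'PredictedValue' column.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "House ID": range(1, 51),
    "rm": rng.uniform(4, 9, 50),
    "lstat": rng.uniform(2, 30, 50),
})
df["medv"] = 5000 * df["rm"] - 600 * df["lstat"] + rng.normal(0, 2000, 50)

features = df[["rm", "lstat"]]
model = LinearRegression().fit(features, df["medv"])  # stands in for the trained model
df["PredictedValue"] = model.predict(features)        # the 'Apply Model' output
```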

Step 5: Analyze the predicted values

Click on the ‘hamburger’ icon in the upper left-hand corner and then click on ‘Data’ to find your data set with the predicted value.  Once you find your data set, click on the ‘Actions Menu’ for your data set and select ‘Create Project’ (my data set is called ‘Housing Prices with Predicted Value’):

Create Project Using Predicted Value Data Set

Select ‘PredictedValue’, ‘medv’ and ‘House ID’ and select ‘Scatter’ as the visualization type.  You will see the predicted value for each House ID compared to the original median value.  Although there are some outliers, the majority of the predictions are close to the original house value:

Data Viz For Predicted Value

I encourage you to explore the machine learning capabilities of Oracle Analytics Cloud.  Have fun!

My Top 5 Favorite Features of Oracle Analytics Cloud
https://blogs.perficient.com/2019/12/30/my-top-5-favorite-features-of-oracle-analytics-cloud-oac/
Tue, 31 Dec 2019 03:38:25 +0000

Oracle Analytics Cloud (OAC) is Oracle’s cloud-based platform for reporting, analytics, data exploration, and data visualization. It encompasses many capabilities and multiple products. As a result, it was hard to limit the post to just 5 features given that there is so much to choose from. That being said, the following are my top 5 favorite features of Oracle Analytics Cloud:

1. Connect to almost anything; Query immediately without modeling

I appreciate the need for enterprise data governance and how OAC enables such governance through its enterprise semantic model. But in today’s fast-paced world, where analytic needs change quickly, it is great to know that OAC can connect to virtually any data source and start querying the data immediately, with no modeling required.  This is great for data exploration and for departmental data marts where the underlying data structure is not too complex.  It is also great for data science teams who need to review data quickly to determine its usefulness.

Please see the following link for a list of supported data sources for OAC:

https://docs.oracle.com/en/cloud/paas/analytics-cloud/acubi/supported-data-sources.html

2. Easy, Advanced Visualizations

With OAC, one click of the mouse is all it takes to select an advanced visualization from among many visualization choices. And if you are not happy with your initial selection, a new visualization type is one click away. I love this feature because using it makes you a better data analyst. Since it is so easy to visualize your data using multiple different visualization types, through trial and error you quickly get very good at determining what is the best visualization type for your data and the message you want to send.

3. Self-Service Data Preparation

This feature really sets OAC apart from other business intelligence tools and analytics platforms. With OAC’s self-service data prep, it is now possible for business analysts to prepare, load, transform, and blend data in ways that would have required IT assistance and weeks or months of work just a few years ago. The data prep module is augmented with machine learning, so it will catch things in the data and suggest improvements.  I worked in Corporate IT for many years, but before that I worked in the business and did a good bit of self-service data wrangling.  I love this feature and wish that it had existed 10-15 years ago!  I just know that it will add efficiency to many departments.

4. Easy Integration with Autonomous Data Warehouse (ADW)

This is another feature that didn’t exist just a few years ago on any vendor’s platform. Now it is possible, with just a few clicks, to establish a connection to ADW directly from OAC – without needing the help of IT or a DBA.  Once the connection to ADW is established, you can load data into ADW directly from OAC – again with no help required from a DBA.  This is hugely empowering for users of all kinds.  In the past, when there was a need for data integration and analysis but no desire or funds for a project with IT, a departmental data mart was often created using MS Access or Excel, and that data would not get integrated with the rest of the corporate data. Now, with OAC and ADW integration, data and analytics needs can be quickly satisfied without necessarily creating additional data silos.

5. Machine Learning (ML)

Machine Learning takes two forms in OAC. First, it is built into the platform and augments the capabilities of the platform behind the scenes without the user knowing that machine learning (ML) is being applied. For instance, OAC will provide recommendations to enrich a data set that has been uploaded. ML generates statistics about data sets.  With a single click, OAC will tell you what values are key drivers of critical attributes in your data sets and will also point out anomalies.

These examples both use ML behind the scenes.  The second form of ML usage in OAC is more explicit and driven by the end-user.  OAC provides approximately 15 ML algorithms that can be used for advanced analytics.  You do not have to be a data scientist to use the algorithms.  A ‘regular business analyst’ can use the algorithms to do things like predictive analytics and customer segmentation.  I love this feature because it makes ‘AI’ accessible to everyone.

Bonus Feature: Natural Language Generation

I know I said just 5, but there is one other feature that I have to mention. With OAC and ML, visualizations can be rendered into plain English. It does a remarkably good job of breaking down even the most complex visualizations into easy-to-understand bullet points. This provides another vehicle for increasing understanding.

What are your favorite features of OAC?

You can read more insights about Oracle here.

Oracle Analytics Cloud and Autonomous Data Warehouse – Better Together
https://blogs.perficient.com/2019/11/08/oracle-analytics-cloud-and-autonomous-data-warehouse-better-together/
Fri, 08 Nov 2019 21:27:44 +0000

Perficient Presents at Oracle OpenWorld 2019.  Oracle Analytics Cloud (OAC) and Oracle Autonomous Data Warehouse (ADW) are setting the standard for cloud-based data warehouse and analytics deployments with respect to speed to value, flexibility, performance, self-service, and advanced capabilities like AI and natural language queries.  If you are thinking about moving all or some of your data and analytics environment to the cloud, you should watch this short video (and btw that should include almost everyone 😉).

Oracle has built tight integration between OAC and ADW.  In this video you will learn about solution patterns for rapid deployment of OAC and ADW including how to load data into Autonomous Data Warehouse using only Oracle Analytics Cloud with no coding or DBA required!  To learn more, watch the video now!

Whatever your data and analytics environment looks like today, you can be sure our seasoned team of analytics professionals has seen it before.  We can help you identify your critical KPIs and metrics, untangle your data integration issues, and develop your future state data and analytics roadmap.

The 90s Called – They Want Their Data Culture Back
https://blogs.perficient.com/2019/10/14/the-90s-called-they-want-their-data-culture-back/
Mon, 14 Oct 2019 17:04:43 +0000

The 1990s

Ah, yes, the 90s.  I can hear the sweet, gentle pinging of my external, dial-up modem as I write this. For those of you who were not in the workforce in the 90s, I will describe them a little bit.  Information was a highly guarded commodity.  Decisions were made at the top of an organization and then slowly communicated downward through hierarchical, command-and-control structures, often via physical memos that were shuffled from physical inbox to physical inbox (yes, really).

Although most companies adopted email on their own networks at some point during the 90s, I knew executives who had their administrative assistants print out their emails and type in their responses.  It was a sign of your power and position to be able to proudly proclaim that you ‘refused to use email’.  So email was a step in the right direction, but it did not change the prevailing mindset. (For the record, I was relatively new in the workforce at the time and I was definitely reading and writing my own emails!)

Yes, data and information meant power. The more data and information you had, the more powerful you were.  If the data was difficult to interpret and you were required to be physically present to explain what it all meant, then that was even better.  More power and control!  It was in the 90s that we began to see the rise of the one-off, ‘server under the desk’, departmental MS Access database holding all the important information for the executive running the department.  These data troves were jealously guarded and were rarely spoken of publicly.  No one in their right mind was freely sharing data and information. Why give away such a valuable commodity for free? Where was the payoff? There was none.

Genome for Today’s Data Culture

What I have just described is the genome for today’s data culture.  Sharing of data and information is simply not in the DNA of today’s data culture. If we ran today’s data culture through Ancestry.com, the one thing we know we won’t find is a picture of a long lost relative getting an award for sharing data with colleagues and enforcing data standards. 23andme, the DNA testing company, would probably find a ‘data hoarding’ gene rather than a ‘data sharing’ gene.  The fact of the matter is that in too many organizations today, sharing of data is not valued and information is still power (and no one freely gives away power). The culture of data and data sharing in companies has simply not kept pace with advancements in data management and analytics technologies.

So why is this important and what does culture have to do with it?

Importance of Data Culture

Culture is what silently influences the behavior of people without their conscious awareness. Culture is ‘how we do things around here’. Culture is ‘how we have always done it’.  Culture reflects the values of an organization. Culture lets you know how you can behave and be sure that no one will call you out over it.  Culture is a belief and behavior system that we all agree to follow either explicitly or implicitly. Not following the culture can lead to being ostracized and for most of human history that meant death (and we as humans are not very big on death).

“Culture eats strategy for breakfast, lunch and dinner” Peter Drucker

This quote is attributed to Peter Drucker and speaks to the fact that no amount of strategic pronouncements or data governance initiatives or cross department big data teams can overcome the baseline culture of an organization.

How to Assess Data Culture

Is your data culture stuck in the 90s? Here are some questions to help assess your data culture:

  1. Is there any institutional incentive to share data, enforce data standards and work to continually improve data quality (e.g., bonuses, public recognition, etc.)?
  2. How are power and responsibility obtained and retained in the organization? Does holding onto data and ignoring data standards help or hurt a person’s ability to advance and gain power?
  3. How are heroes and heroic efforts defined and celebrated internally? If a team or person ‘saves the day’ at the last minute by working the weekend to scrub, normalize and consolidate bad data, are they celebrated as heroes and that’s it, or is there also a recognition that something is fundamentally wrong with the process?
  4. Has the importance of data made its way into the mythology and lore of the organization? Are there any good stories that get told and retold about how attention to data and data sharing had a positive impact on the company (e.g., leapfrogged the competition, radically changed the customer experience, developed new products, saved the company from irrelevance, etc.)?
  5. What type of decision making is respected in the organization – following your gut regardless of the numbers or decisions based on facts and numbers?
  6. What messages are senior executives sending to their teams with their behavior and attitude toward data?
  7. How is loyalty defined in the organization?  Is it a betrayal to share too much data and information with other groups?
  8. Are internal groups competing when really they should be collaborating?
  9. Is there a Chief Data Officer?  To whom does this person report?
  10. Are your executives asking their admins to print out their emails and type in their responses? (OK, just kidding about this last question, but if the answer is yes we really need to stage an intervention for you.)

The business world has now realized the massive, transformative potential that lies in leveraging all of an organization’s data as well as public and competitive data. There are many success stories of brand new business models emerging, customer experiences being transformed, new, more targeted products being developed and quality being improved through the innovative use of data. However, for every success story there are many organizations that are not getting the results they expected from their data initiatives, or that find things are just not moving fast enough. In such cases, data culture could be a big part of the issue.

]]>
Options for Existing Customers of Oracle BI Applications https://blogs.perficient.com/2019/10/03/options-for-existing-customers-of-oracle-bi-applications/ Fri, 04 Oct 2019 04:34:25 +0000

In 2007, Oracle Business Intelligence (BI) Applications became generally available. They were an immediate success and have enjoyed great popularity ever since. Although Oracle BI Applications are now considered ‘content complete’ and Oracle no longer sells new licenses, Oracle remains committed to supporting this large and important customer base. That said, many existing customers are unclear on their options for moving forward: they don’t know the key support dates, and they don’t know whether they should stay on-premises, move to the cloud or pursue a hybrid approach. They don’t know what is involved in an upgrade or how to estimate the effort, and they don’t know what options are available should they choose to replace the BI Apps. To answer these and other questions, we have created an interactive ebook for existing customers of BI Applications.

In this interactive ebook we discuss the following topics:

1. The Background of Oracle BI Applications – when they were developed, what the major changes have been over the years and through the different versions

2. Go-Forward Strategy Considerations – we list a number of important items to take into account when developing a go-forward strategy

3. Go-Forward Options – we list out and describe in detail all the options available to existing customers of BI Applications covering on-premises, cloud and hybrid approaches

4. Perficient’s Prebuilt Solution Templates – we introduce and describe the prebuilt solutions developed by Perficient for key functional areas such as Finance, HR, Sales, Marketing, Service, Supply Chain and Procurement. These prebuilt solutions can be deployed on-premises or in the cloud and support both on-premises and cloud data sources. We discuss how some customers have used these as a replacement for BI Apps.

5. How Perficient Can Help – we outline the services we offer to help customers analyze their situation, pick the best go-forward strategy and implement their desired choice.

Please register below to download the guide and get a complimentary consultation.

]]>