oracle analytics Articles / Blogs / Perficient
https://blogs.perficient.com/tag/oracle-analytics/

AI Assistant Demo & Tips for Enterprise Projects
https://blogs.perficient.com/2025/05/15/ai-assistant-demo-tips-for-enterprise-projects/ (Thu, 15 May 2025)

After highlighting the key benefits of the AI Assistant for enterprise analytics in my previous blog post, I am sharing here a demo of what it looks like to use the AI Assistant. The video below demonstrates how a persona interested in understanding enterprise projects can quickly find answers to typical everyday questions. The information requested includes profitability, project analysis, cost management, and timecard reporting.
A Perficient Demo of AI Assistant for Project Analytics

What to Watch Out For

With the right upfront configuration in place, the AI Assistant, native to Oracle Analytics, can transform how various levels of the workforce find the insights they need to be successful in their tasks. Here are a few things that make a difference when configuring the AI Assistant.

  • Multiple Subject Areas: When enterprise data spans several subject areas, for example Projects, Receivables, Payables, Procurement, etc., performing Q&A with the AI Assistant across multiple subject areas simultaneously is not currently possible. In this situation, the AI Assistant prompts for the subject area to use for the response. That is not an issue when the information requested comes from a single subject area. However, there are situations when we want to gain insights across two or more subject areas at once. This can be handled by preparing a combined subject area that contains the key relevant information from the underlying subject areas. As a result, the AI Assistant interfaces with a single subject area that consists of all the transaction facts and conformed dimensions across the various transactional data sets. With a few semantic model adjustments, this is an achievable solution.
  • Be selective on what is included in AI prompts: Enterprise semantic models typically contain a lot of information that is not relevant for an AI chat interface. Excluding such fields from AI prompts improves performance and accuracy, and can even reduce the processing cost incurred when leveraging external LLMs. Dimension codes, identifiers, keys, and audit columns are some examples of fields to exclude (the sketch after this list illustrates the kinds of fields involved). The Oracle Analytics AI Assistant comes with a fine-grained configuration that enables selecting the fields to include in AI prompts.
  • Metadata Enrichment with Synonyms: Use synonyms on ambiguous fields, for example to clarify what a date field represents (is it the transaction creation date or the date it was invoiced on?). Synonyms are also useful when internal, organization-specific terms need to be interpreted properly. The AI Assistant enables setting up synonyms on individual columns to improve its level of understanding.
  • Indexing Data: For an enhanced user experience, I recommend identifying which data elements are worth indexing. This means the LLM will be made aware of the values stored in the fields you choose while setting up the AI Assistant. This is an upfront, one-time activity. The more information you equip the AI Assistant with, the smarter it gets when interpreting and responding to questions.
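To illustrate the selectivity described in the second point above, here is a minimal Python sketch of a heuristic that flags columns that are usually poor candidates for AI prompts (keys, codes, and audit fields). The column list and patterns are invented for illustration; in practice this selection is made in the AI Assistant configuration screens.

```python
import re

# Hypothetical list of presentation-layer columns from a Projects subject area.
columns = [
    "Project Name", "Project Number", "Project Manager",
    "Project ID", "Task Key", "Created By", "Last Update Date",
    "Invoice Amount", "Burdened Cost", "Ledger Code",
]

# Patterns that typically indicate keys, codes, or audit fields --
# the kinds of columns worth excluding from AI prompts.
EXCLUDE_PATTERNS = re.compile(
    r"(\bID\b|\bKey\b|\bCode\b|Created By|Last Update)", re.IGNORECASE
)

include = [c for c in columns if not EXCLUDE_PATTERNS.search(c)]
exclude = [c for c in columns if EXCLUDE_PATTERNS.search(c)]

print("Candidates to include in AI prompts:", include)
print("Candidates to exclude:", exclude)
```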

For guidance on how to get started with enabling GenAI for your enterprise data analytics, reach out to mazen.manasseh@perficient.com.

A Closer Look at the AI Assistant of Oracle Analytics
https://blogs.perficient.com/2025/05/09/a-closer-look-at-the-ai-assistant-of-oracle-analytics/ (Fri, 09 May 2025)

Asking questions about data has been part of Oracle Analytics through the homepage search bar for several years now. It used Natural Language Processing (NLP) to respond to questions with automatically generated visualizations. What has been introduced since late 2024 is the capability to leverage Large Language Models (LLMs) to respond to user questions and commands from within a Workbook. This brings a much-enhanced experience, thanks to the evolution of language processing from classic NLP models to LLMs. This newer feature is the AI Assistant, and while it was earlier only available to larger OAC deployments, with the May 2025 update it has been made available to all OAC instances!

If you’re considering a solution that leverages GenAI for data analytics, the AI Assistant is a good fit for enterprise-wide deployments. I will explain why.

  • Leverages an enterprise semantic layer: What I like most about how the AI Assistant works is that it reuses the same data model and metadata that are already in place and caters to various types of reporting and analytical needs. The AI Assistant adds another channel for user interaction with data, without the risks of data and metadata redundancy. As a result, whether creating reports manually or leveraging AI, everyone across the organization remains consistent in using the same KPI definitions, the same entity relationships, and the same dimensional rollup structures for reporting.
  • Data Governance: This is along the same lines as my first point, but I want to stress the importance of controls when it comes to bringing the power of LLMs to data. There are many ways of leveraging GenAI with data, and some are native to the data management platforms themselves. However, implementing GenAI data querying solutions directly within the data layer requires a closer look at the security aspects of the implementation. Who will be able to get answers on certain topics? And if a topic is applicable to the one asking, how much information are they allowed to know?

The AI Assistant simply follows the same object-level and row-level security controls that are enforced by the semantic data model.

  • What about agility? Yes, governed analytics is very important. But how can people innovate and explore more effective solutions to business challenges without the ability to interact with the data that comes along with those challenges? The AI Assistant works not only with the common enterprise data model, but with individually prepared data sets as well. As a result, the same AI interface caters to questions asked about enterprise data as well as departmental or individualized data sets.
  • Tunability and Flexibility: Enabling the AI Assistant for organizational data, while a relatively easy task, does allow for a tailored setup. The purpose of tuning the setup is to increase reliability and accuracy. The flexibility comes into play when directing the LLM on what information to take into consideration when generating responses. This is done through a fine-tuning mechanism of designating which data entities, and/or fields within these entities, can be considered.
  • Support for data indexing, in addition to metadata: When tuning the AI Assistant setup, three options are available to pick from, down to the field level: Don’t Index, Index Metadata Only, and Index. With the Index option, we can include information about the actual data in a particular field so the AI Assistant is aware of that information. This can be useful, for example, for a Project Type field so the LLM is informed of the various possible values for Project Type. Consequently, the AI Assistant provides more relevant responses to questions that include specific project types as part of the prompt.
  • Which LLM to use? LLMs continue to evolve, and it seems there will always be a better, more efficient, and more accurate LLM to switch to. Oracle has made the setup for the AI Assistant open, to an extent, in that it can accommodate external LLMs besides the built-in LLM that is deployed and managed by Oracle. At this time, if not using the built-in LLM, we have the option of using an OpenAI model via the OpenAI API. Why might you want to use the built-in LLM versus an OpenAI model?
    • The embedded LLM is focused on the analytical data that is part of your environment, so it is less prone to hallucinations. However, this approach doesn’t provide flexibility in terms of access to external knowledge.
    • External LLMs include public knowledge (depending on what an LLM is trained on) in addition to the analytical data that is specific to your environment. This normally allows the AI Assistant to produce better responses when the questions asked are broad and require public knowledge to tie into the specific data elements housed in one system. Think, for example, about geographical facts, statistics, weather, business corporations’ information, etc. This is public information that can help in responding to analytical questions within the context of an organization’s data.
    • If the intent is to use an LLM but avoid the inclusion of external knowledge when generating responses, there is the option to restrict the LLM so it limits responses to organizational data only. This approach leverages the reasoning capabilities of the model without compromising the source of information for the responses (a minimal sketch of this restriction follows this list).
  • The Human Factor: The AI Assistant factors in the human aspect of leveraging LLMs for analytics. Having a conversation with data through natural language is for the most part straightforward when dealing with less complex data sets, because in that case the responses are more deterministic. As the data model gets more complex, there will be more opportunities for misunderstanding and missed connections between what’s on one’s mind and an AI-generated response, let alone a visual one. This is why the AI Assistant gives end users the capability to adjust responses to better align with their preferences, without reiterating prompts and elongated back-and-forth conversations. These adjustments can be applied with button clicks, for example to change a visual appearance or to change or add a filter or column, all within a chat window. And whatever visualizations the AI Assistant produces can be added to a dashboard for further adjustments and future reference.
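To make the restriction option above concrete, here is a minimal sketch of constraining an external model to organizational data only, using the OpenAI Python SDK. The system prompt, model choice, and context payload are illustrative assumptions; they demonstrate the restriction concept, not how Oracle Analytics wires up the call internally.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Illustrative context: metadata and indexed values the assistant may use.
org_context = """
Subject area: Projects
Columns: Project Name, Project Type, Burdened Cost, Invoice Amount
Known Project Types: Capital, Billable, Internal
"""

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=[
        {
            "role": "system",
            "content": (
                "Answer ONLY from the organizational context provided. "
                "If the answer cannot be derived from it, say you don't know. "
                "Do not use outside knowledge.\n" + org_context
            ),
        },
        {"role": "user", "content": "Which project types can I analyze cost for?"},
    ],
)
print(response.choices[0].message.content)
```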

In the next post, I will mention a few things to watch out for when implementing AI Assistant. I will also demo what it looks like to use AI Assistant for project management.

G6 Hospitality Named as a 2024 Application Modernization Award Finalist
https://blogs.perficient.com/2024/08/01/g6-hospitality-named-as-a-2024-application-modernization-award-finalist/ (Thu, 01 Aug 2024)

Oracle recently announced the finalists and winners of its 2024 Oracle Excellence Awards. We’re delighted to share that G6 Hospitality was recognized as a finalist for the Application Modernization Award.

This honor positions G6 Hospitality among a field of top Oracle customers who have demonstrated excellence in application modernization with Oracle Analytics, Oracle Autonomous Data Warehouse, and Oracle Cloud Infrastructure. This acknowledgment not only reflects our commitment to innovation and excellence but also validates the hard work and dedication of G6 and the Perficient team.

“We are thrilled by this recognition from Oracle as a 2024 Application Modernization Award finalist,” said Brijesh Ravindran, vice president of IT engineering and architecture, G6 Hospitality. With the infrastructure and foundation in place for a robust analytics platform, G6 Hospitality is now well-positioned to tap into the latest advancements in data intelligence solutions, including accelerated adoption of data democratization, improved data visualization, expanded third-party data, and leveraging AI solutions in conjunction with enterprise performance reporting.

“This is a huge acknowledgment for G6 from Oracle, and we are excited to be one of the finalists,” according to Deepa Unni, Director of IT Development, G6 Hospitality. “As a part of G6’s Application Modernization strategy, this initiative of adopting Oracle Cloud allows us to enable, optimize, and enhance new solutions and services to support our business teams.”

Awards were classified into several categories for infrastructure, including Data Center Migration and Transformation, Application Modernization, Data Platform Innovation, AI Innovation, and Cloud Architect with up to five finalists per category. Honorees in the Application Modernization category have successfully moved Oracle and non-Oracle applications. According to Oracle, “…sometimes these deployments minimize rearchitecting applications, while in other cases they involve modernizing the application to new software versions or cloud services. They deliver higher performance at lower cost than alternatives.”

Recognition from Oracle for Application Modernization Revolutionizing the Hospitality Industry

“Perficient’s dedicated best-in-class team worked hand-in-hand with G6 at every step, making this a remarkable accomplishment,” said Deepa Unni.

The Perficient ERP team also helped G6 upgrade EBS Financials from R12.1.3 to R12.2.12 and consolidate several custom back-office applications.

“Perficient brought in their ‘A’ team from their CoE of DBAs during a critical upgrade challenge and resolved the issue timely and efficiently, completing the project under budget and on time, overcoming numerous challenges along the way,” said Brijesh Ravindran. “Perficient’s unwavering commitment to understanding G6’s unique needs allowed us to create tailor-made solutions that optimized their application modernization initiatives, delivered successful outcomes, and drove growth through enhanced end-user experience.”

Data Virtualization with Oracle Enterprise Semantic Models
https://blogs.perficient.com/2024/02/22/data-virtualization-with-oracle-enterprise-semantic-models/ (Thu, 22 Feb 2024)

A common symptom of organizations operating at suboptimal performance is the prevalent challenge of dealing with data fragmentation. The fact that enterprise data is siloed within disparate business and operational systems is not, in itself, the problem to resolve, since there will always be multiple systems. In fact, businesses must adapt to an ever-growing need for additional data sources. With this, however, comes the challenge of mashing up data across systems to provide a holistic view of the business. This is the case, for example, with a customer 360 view that provides insight into all aspects of customer interactions, no matter where that information comes from or whether it is financial, operational, or customer-experience related. In addition, data movements are complex and costly. Organizations need the agility to adapt quickly to additional sources while maintaining a unified business view.

Data Virtualization As a Key Component Of a Data Fabric

That’s where the concept of data virtualization provides an adequate solution. Data stays where it is, but we report on it as if it’s stored together. This concept plays a key role in a data fabric architecture which aims at isolating the complexity of data management and minimizing disruption for data consumers. Besides data-intensive activities such as data storage management and data transformation, a robust data fabric requires a data virtualization layer as a sole interfacing logical layer that integrates all enterprise data across various source applications. While complex data management activities may be decentralized across various cloud and on-premises systems maintained by various teams, the virtual layer provides a centralized metadata layer with well-defined governance and security.

How Does This Relate To a Data Mesh?

What I’m describing here is also compatible with a data mesh approach, whereby a central IT team is supplemented with product owners of diverse data assets that relate to various business domains. It’s referred to as the hub-and-spoke model: business domain owners are the spokes, while the data platforms and standards are maintained by a central IT hub team. Again, the data mesh decentralizes data assets across different subject matter experts but centralizes enterprise analytics standards. Typically, a data mesh is applicable for large-scale enterprises with several teams working on different data assets. In this case, an advanced common enterprise semantic layer is needed to support collaboration among the different teams while maintaining segregated ownership. For example, common dimensions are shared across all product owners, allowing them to report on the company’s master data such as product hierarchies and organization rollups. But the various product owners are responsible for consuming these common dimensions and providing appropriate linkages within their domain-specific data assets, such as financial transactions or customer support requests.
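As a toy illustration of this pattern, the pandas sketch below shares one conformed product dimension across two domain-owned fact sets. All table and column names are invented for illustration.

```python
import pandas as pd

# Conformed dimension maintained by the central hub team.
dim_product = pd.DataFrame({
    "product_key": [1, 2],
    "product_name": ["Widget", "Gadget"],
    "product_line": ["Hardware", "Accessories"],
})

# Domain-owned fact sets (the "spokes") link to the shared dimension.
fact_finance = pd.DataFrame({"product_key": [1, 2], "revenue": [1200.0, 800.0]})
fact_support = pd.DataFrame({"product_key": [1, 1, 2], "tickets": [3, 2, 5]})

# Each domain reports on its own facts, rolled up by the common hierarchy.
revenue_by_line = (fact_finance.merge(dim_product, on="product_key")
                   .groupby("product_line")["revenue"].sum())
tickets_by_line = (fact_support.merge(dim_product, on="product_key")
                   .groupby("product_line")["tickets"].sum())

print(revenue_by_line)
print(tickets_by_line)
```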

Oracle Analytics for Data Virtualization

Data Virtualization is achieved with the Oracle Analytics Enterprise Semantic Model. Both the cloud version, Oracle Analytics Cloud (OAC), and the on-premises version, Oracle Analytics Server (OAS), enable the deployment of the semantic model. The semantic model virtualizes underlying data stores to simplify data access by consumers. In addition, it defines metadata for linkages across the data sources and enterprise standards such as common dimensions, KPIs, and attribute/metric definitions. Below is a schematic of how the Oracle semantic model works with its three layers.

(Figure: Oracle Enterprise Semantic Model)

Outcomes of Implementing the Oracle Semantic Model

Whether you have a focused data intelligence initiative or a wide-scale program covering multi-cloud and on-premises data sources, the common semantic model has benefits in all cases, for both business and IT.

  • Enhanced Business Experience

With Oracle data virtualization, business users tap into a single source of truth for their enterprise data. The information available from the Presentation Layer is trusted and reported on reliably, no matter which front-end reporting tool is used: self-service data visualization, dashboards, MS Excel, machine learning prediction models, generative AI, or MS Power BI.

Another value-add for the business is that they can access new data sources more quickly and in real time, since the semantic layer requires no data movement or replication. IT can leverage the semantic model to provide this access to the business quickly and cost-effectively.

  • Future Proof Investment

The three layers that constitute the Oracle semantic model abstract the source systems from the presentation layer accessed by data consumers. Consequently, as source systems undergo modernization initiatives, such as cloud migrations, upgrades, and even replacement with totally new systems, data-consuming artifacts such as dashboards, alerts, and AI models remain unaffected. This is a great way for IT to ensure that an analytics investment’s lifespan is prolonged beyond any source system.

  • Enterprise Level Standardization

The semantic model enables IT to enforce governance when it comes to enterprise data shared across several departments and entities within an organization. In addition, very fine-grained object-level and data-level security configurations are applied to cater to varying levels of access and different types of analytics personas.

Connect with us for consultation on your data intelligence and business analytics initiatives.

Best Practices for Oracle Fusion HCM Analytics
https://blogs.perficient.com/2024/02/14/best-practices-for-oracle-fusion-hcm-analytics/ (Wed, 14 Feb 2024)

Oracle Fusion HCM Analytics, a part of Oracle Fusion Data Intelligence Platform (DIP) (earlier known as Fusion Analytics Warehouse), equips various management levels with deep insights to effectively manage the workforce across the organization. DIP is for the most part a ready-to-use data and analytics solution that is typically implemented in a matter of weeks. There are some key considerations, though, to ensure a successful rollout, not just for the baseline (out-of-the-box) content, but also for ensuring reliability and extensibility to accommodate ongoing and evolving analytics needs. In this blog I will highlight key points to plan for on your journey to rolling out Oracle Fusion HCM Analytics.

  1. What data to include in DIP? To answer this question, we go through the functional areas that need to be enabled. Identify the HCM modules that are in use and enable corresponding DIP functional areas (such as Workforce Management, Absence Management, Talent Acquisition, etc.). Another aspect of answering this question is deciding how much historical HCM data to include in DIP for analytics. It is also important to take into consideration when Oracle HCM was first implemented and how comprehensive the conversion effort was from legacy HCM systems. If there were acquisitions over the years, it is also good to understand if there have been data conversion anomalies that introduce exceptions with respect to workforce data setups for the rest of the organization. The end goal here is to understand how far back we want to go in terms of loading historical data while maintaining data quality and having enough historical trends to generate realistic forecasts (such as hiring and attrition projections).
  2. Data Corrections in Oracle Fusion HCM: Running the data pipeline from Oracle HCM to the DIP data warehouse will result in rejected records if there are data inconsistencies within Oracle HCM. To minimize the chances of these record rejections, it is recommended to run Oracle HCM diagnostic tests to identify and correct person, manager hierarchy, and legislative information records, prior to loading DIP.
  3. Investigate Rejected Records: The data pipeline from Oracle HCM to the DIP data warehouse is managed by Oracle and therefore requires little intervention. This is a huge differentiator compared to the massive effort of going the route of a home-grown data and analytics solution. These data pipelines enforce data quality checks as they load data in DIP and will therefore reject and flag data issues that need to be addressed. So it’s important to plan to go through the list of rejected records with their reasons, perform any necessary corrections in Oracle HCM and re-run incremental data refresh in DIP.
  4. Run Data Validations: DIP has built-in capability to compare its KPIs to metric calculations sourced directly from native Oracle HCM OTBI reporting subject areas. This helps understand any variances in metric formulas that may need to be adjusted with custom calculations that best fit your organization.
  5. Dashboard Content Organization: You can set up your own company-specific catalog of standard dashboards before rolling them out to various user groups. This can be done by copying content from the Oracle-provided list of workbooks into a new custom folder to validate and publish to various groups. The default Oracle-shipped folder is locked down and can’t be edited, but its content can be copied into a custom folder and edited there.
  6. Plan for Various Scenarios of Implementing DIP Security: A major part of a DIP implementation is having a detailed plan for implementing HCM analytics security to accommodate various roles and responsibilities within HR and across the enterprise. DIP offers various layers of analytics security, so it is crucial to lay out the different types of Application Roles and plan on using them to handle the following security aspects:
    • Who has access to create new analytics content and who has view only access?
    • Which Oracle HCM roles have access to each workbook or group of workbooks? It is recommended to group workbooks with similar access criteria within the same folder, setting application role permissions at the folder level.
    • Which Oracle HCM roles require data security? Map these roles to the corresponding data security role. If out of the box data security roles don’t achieve the required outcome, plan on implementing custom security configurations to achieve your goal.
    • Are there groups of users who should not be able to drill down to detail information (such as individual information) but still be able to report at a summary level? A custom security configuration will be needed in this case.
  7. Data Security for Line Managers: Manager hierarchy security is supported out of the box by leveraging the default Line Manager Data Security Application Role. However, this role can’t be combined with other custom data security roles. Therefore, in this situation it is necessary to configure a custom manager data security application role, which may be combined with other custom data security roles to achieve the desired data security model.
  8. Data Security Assignments: Data security assignments are supported for various dimensions such as Business Unit, Department, Country, Legal Entity, and Self-Record. However, the process of making these assignments is manual in the DIP Security console, where assignments are made to individual users. While there is a file upload process that enables setting up data security assignments in bulk, it requires re-uploading assignments on a regular basis whenever they need to change. This can be a significant maintenance effort, but luckily it is possible to automate the process of performing data security assignments and thereby eliminate security risks.
  9. Leverage Benchmarks: Providing line managers visibility into how their own teams compare to company-wide averages allows them to identify areas for improvement, by seeing their own team’s performance relative to other parts of the company. Some examples include tracking KPIs around diversity, turnover, promotions, and hires, and their corresponding trends over time. Since we typically don’t want to open up access to detail-level information for all line managers, we go through a configuration process to enable benchmark reporting at an aggregate level without jeopardizing access to secured information.
  10. Compensation Reporting with Element Entries: If you are looking to provide a full picture of employee compensation and benefits with information sourced directly from your Oracle HCM Element Entries, it is possible to extend DIP with a custom subject area to do so. You will then be able to pull in all historical information that relates to compensation and benefits, down to the most granular level, and roll it up together with the rest of the salary analysis information.
  11. Support for Custom Fields: Each organization has its own list of descriptive flexfields configured in Oracle Fusion HCM. To allow for a consistent and standardized self-service analytics experience in DIP, these descriptive flexfields need to be enabled in DIP and incorporated into the applicable subject areas.
  12. Report on Dynamic Time Periods: It is straightforward to report on fixed time periods such as month, quarter, and year. However, there is often the ask to enhance dashboards by enabling dynamic filtering on a sliding time window such as year-to-date, quarter-to-date, or even a rolling 12-month window. In-dashboard filtering can be added to enable such variable time windows without generating multiple versions of the same reports, as sketched below.
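As a hedged illustration of the rolling-window idea, the sketch below computes a trailing 12-month filter in pandas. In DIP itself this would be a dashboard filter or calculated column; the field names here are assumptions.

```python
import pandas as pd

# Hypothetical headcount fact with a month-end date column.
df = pd.DataFrame({
    "period_end": pd.to_datetime(
        ["2023-01-31", "2023-06-30", "2024-01-31", "2024-06-30"]
    ),
    "headcount": [980, 1010, 1040, 1065],
})

# Trailing 12-month window anchored on the latest loaded period.
anchor = df["period_end"].max()
window_start = anchor - pd.DateOffset(months=12)
rolling_12m = df[df["period_end"] > window_start]

print(rolling_12m)
```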

For help with implementing Oracle Fusion HCM Analytics or other DIP products, contact Mazen Manasseh at Perficient.

Join Us at Kscope23!
https://blogs.perficient.com/2023/06/12/join-us-at-kscope-23/ (Mon, 12 Jun 2023)

Kscope23, ODTUG’s user conference known for providing a rich opportunity for networking and education on Oracle technologies, is taking place June 25 – 29 in Aurora, CO. ODTUG is famous for great technical content including Oracle symposiums, hands-on labs, and 250+ technical sessions.

Perficient is a proud sponsor of Kscope23. Visit us at booth #519 to meet with subject matter experts and thought leaders and learn how we’ve leveraged our extensive expertise in Enterprise Performance Management (EPM), Enterprise Resource Planning (ERP), supply chain management (SCM), human capital management (HCM), and analytics to drive digital transformation for our customers. Unlike boutique firms that specialize in one or two offerings, our investment in and commitment to our Oracle partnership is extensive, with 15 Oracle specializations, integrated IP assets, and best practices gained from implementing Oracle solutions within our own company. We have delivered strategy and implementation for on-premises, cloud, and hybrid solutions to meet the unique needs of our clients. We also offer a post-implementation managed services offering.

There will be eight speaking sessions at this year’s conference, on topics ranging from enterprise planning to financial close! Among Perficient’s sessions is a case study presentation with Boardwalk Pipelines.

Bring Your Own Machine Learning Model…Why Would Anyone Want to Do That?
Tuesday, June 27th | Track: Enterprise Planning | 8:30 – 9:30

This session will closely examine how Finance can take advantage of Machine Learning (ML) models in Oracle EPM. We will tackle the stigma that ML models are too technical and outside the skillset of Financial Planning and Analysis professionals.

Is that an Essbase Administrator App in Your Pocket?
Tuesday, June 27th | Track: Essbase | 9:45 – 10:45

In this session learn what a no-cost Oracle product called Oracle APEX is all about and how to use it to integrate common information and tasks from your Essbase systems into a secure application that can be used everywhere from your PC to your tablet to your favorite mobile device.

Case Study: Better Performance with DSO in Cloud Consolidations at Boardwalk Pipeline
Tuesday, June 27th | Track: Financial Close | 1:45 – 2:45

Are you loving your Oracle Financial Consolidation and Close application but think the performance could be better? Hear Boardwalk Pipeline’s successful DSO conversion story.

What the Heck is EPM Anyway? Connecting APEX to Another World…
Tuesday, June 27th | Track: APEX | 1:45 – 2:45

This session is meant to be an introduction to the Oracle EPM product suite for the APEX practitioner. We will provide and discuss the use cases for building an APEX application for the EPM Cloud and Essbase communities. Think of it as EPM 101 for APEX.

Are Your Spreadsheets Costing You Time and Money?
Tuesday, June 27th | Track: Strategic EPM | 3:15 – 4:15

It’s been said many times that the number one tool for finance and accounting is Microsoft Excel. But it may prove both time-consuming and error prone. Oracle EPM can integrate disparate data into a single source of truth that provides the data needed to drive successful business outcomes across the organization.

Learn how Allegis Saved Time and Money by Migrating from On Premises to the Cloud for Planning, Consolidation, Master Data and Tax
Tuesday, June 27th | Track: Strategic EPM | 4:30 – 5:30

Allegis Group is a large, multi-national talent management firm with over 19,000 employees and $13.4 billion in revenue. They were operating on-premises versions of Hyperion Planning, HFM, DRM and HTP when they began a migration to move all of these applications to the Oracle Cloud. In this session, we will discuss the approach to move to the cloud as well as the plan to support ongoing operations during the migration.

Use Oracle APEX to Unify Information from Consolidation, Reconciliation, and other EPM Cloud 
Wednesday, June 28th | Track: Financial Close | 2:00 – 3:00

In this session learn what Oracle APEX is all about and how to use it to integrate common information and tasks from your EPM systems into a secure application that can be used everywhere from your PC to your tablet to your favorite mobile device.

Hybrid Essbase Implementation
Wednesday, June 28th | Track: Essbase | 3:15 – 4:15

Have you heard of Hybrid BSO implementation? It is one of those new shiny Essbase functionalities that usually do not go past the sales PowerPoint presentations. Perficient will demonstrate that converting a classic BSO application into a Hybrid BSO application provides real and tangible benefits to you and your company.

If you’re not able to attend Kscope in-person, but would like to learn more about any of our topics, please reach out to us.

Let’s Meet at Ascend Next Week!
https://blogs.perficient.com/2023/06/05/lets-meet-at-ascend-next-week/ (Mon, 05 Jun 2023)

We’re Headed to Ascend

Perficient is headed to sunny Orlando, FL to attend Ascend which is being held June 11-14 at the Caribe Royale Resort. Did you know that Ascend is where the Oracle user community comes together to share and learn through hundreds of education sessions, panels, and networking events?

If you are headed to the show, you’ll have an opportunity to meet with Crystal Fernandez, director of Oracle ERP, and Moises Gonzalez, portfolio specialist, and learn how we’ve leveraged our extensive expertise in ERP, EPM, HCM, SCM, and analytics to drive digital transformation for our customers. Let’s meet to discuss your journey to the cloud or thoughts on how to optimize your on-premises environment.

Meet Crystal and Moises at the Show

As the driving force behind our managed services practice, Crystal said she’s really looking forward to engaging with our clients in person at the show. “There’s nothing quite like sitting down with a customer over a cup of coffee and discussing how things are going, what challenges they are facing, and how Perficient can help.” Crystal says that unexpected turnover, and the burden it puts on staff that already has a full plate, is a common hurdle, but it’s not insurmountable. Her goal is to ensure that all of Perficient’s current and future customers have the support they need so that bumps in the road do not become a hindrance to the business. Connect with Crystal via the show app; she’d be happy to meet up at the Brew Pub Hub. She’s a good listener, and I think you’d find her advice worth your time.

I caught up with Moises and asked him why he’s excited to attend Ascend this year and his answer was short and to the point, “Networking with the broad Oracle user community. Where else can you meet Oracle users from all across the globe that come together in one place to share and learn from each other?” Moises is looking forward to attending sessions, meeting with like-minded colleagues, and exploring the partner pavilion.  Moises shared three of the sessions he plans to attend: Oracle Cloud HCM and ERP – Better Together, What is the Correct Migration Path From Oracle On-Premise ERP to Oracle Cloud ERP?, and Oracle Cloud EPM Strategy and Roadmap. 

Perficient, an Oracle Partner serving clients for 20 years, provides its clients digital experience, business optimization, and industry solutions and support. We’re committed to partnering with our clients to tackle complex business challenges and accelerate transformative growth.

If you are unable to attend but would like to meet after the show, please leave a comment and we’ll reach out. You can also connect with us via LinkedIn.

Why Implement Incorta Analytics for Oracle Fusion Cloud ERP Reporting?
https://blogs.perficient.com/2022/05/10/why-implement-incorta-analytics-for-oracle-fusion-cloud-erp-reporting/ (Tue, 10 May 2022)

Oracle Cloud ERP offers several built-in tools for reporting. While native reporting tools like Oracle Transactional Business Intelligence (OTBI) and Oracle BI Publisher are well-suited for specific types of operational reporting, they do have limitations when it comes to performing complex and enterprise-wide reporting. It is therefore crucial to complement the Oracle Cloud ERP application with an enterprise reporting solution.

A major consideration to keep in mind is that Oracle Cloud ERP is a SaaS application. Unlike Oracle E-Business Suite (EBS), direct access to the Oracle Cloud ERP database (OLTP) is typically restricted. Therefore, traditional approaches to ERP reporting that may have worked well with EBS do not fit very well with Oracle Fusion SaaS applications. For example, you may have done EBS reporting with Noetix for IBM Cognos, OBIEE, Discoverer, or other legacy BI tools. Or you may have several ETL processes that extracted, transformed, and loaded on-premises ERP data into a data warehouse. However, following a similar approach for Cloud ERP reporting is not ideal. The recommendation is to accompany the ERP Cloud implementation with a more innovative reporting methodology that fits the modernity of the Cloud ERP application, scales to perform adequately, and offers rapid time to value when addressing continuously evolving business needs for analytical insights. In this blog, I will describe how Oracle Cloud ERP is supplemented with Incorta, an innovative data and reporting platform that transcends common challenges of the classical data warehouse approach.

What Differentiates Incorta Analytics for Oracle Cloud ERP?

Several factors come into play when deciding which type of reporting solution works best with the applications at hand. Here I am presenting Incorta as a very viable option for its data handling and reporting capabilities. Out of the many reasons why, I am focusing on the three I believe are most relevant to Oracle Cloud ERP.

  1. Expedited Deployment & Enhancements

Deploying Incorta for Oracle Cloud ERP follows a much faster cycle than traditional data warehouse deployments. Even after the initial deployment, rolling out additional reporting enhancements on Incorta delivers faster time to value for several reasons:

    • Direct Data Mapping: While conventional data warehouses require extensive data transformation, Incorta leverages data structures out of Oracle Cloud ERP in their original form. Incorta replaces ETL processing, star schemas, and data transformations with a Direct Data Mapping technology that maintains source application data models largely as they are, with minimal transformation. We therefore end up with a one-to-one mapping to the corresponding data objects and relationships in Oracle Cloud ERP. Traditionally this didn’t work well for reporting because of the significant impact on query performance. However, the innovation introduced with Incorta Direct Data Mapping enables high-performing batch queries on massive amounts of data without requiring extensive ETL transformation, as was previously the case with a data warehouse. Eliminating the overhead of extensive data transformation is at the root of why Incorta offers a more expedited path to initially implementing, and regularly enhancing, Incorta analytics.
    • Oracle Cloud Applications Connector: Unlike on-premises ERP applications, direct database access is not available, in a scalable manner, from Oracle Fusion Applications. Doing a reporting solution on Oracle Cloud ERP involves a major undertaking related to the initial setup, scheduling and ongoing refreshes of data extracts of hundreds of data objects typically used for ERP reporting. You may be thinking that using tools like Oracle BI Publisher or OTBI may be a way to go about getting the Cloud ERP data you need for reporting. While such a technique may get you going initially, it’s not a feasible approach to maintain data extracts out of Cloud ERP because it jeopardizes the performance of the Oracle Cloud ERP application itself, the integrity of the reporting data and its completeness, and the ability to scale to cover more data objects for reporting.

The whole data export process is, however, streamlined and managed from within Incorta. A built-in connector to Oracle Fusion applications allows Incorta to tap into any data object in Oracle Cloud ERP. The connector performs data discovery on Oracle Cloud ERP View Objects (VOs), reads the metadata and data available in both Oracle VOs and custom VOs, and loads the data into Incorta. The connector adheres to Oracle best practices for exporting Oracle Fusion data in bulk. The connectivity happens through the Oracle Fusion Business Intelligence Cloud Connector (BICC). There is no need to develop the BICC data exports from scratch, as the Incorta Blueprint for Oracle Cloud ERP already includes pre-defined BICC offerings for various ERP functional areas (such as AP, AR, GL, Fixed Assets, etc.). These offerings are available to import into BICC, with the option of updating with custom View Objects. Managing the data load from Oracle Cloud ERP into Incorta takes place from the Incorta web UI and therefore requires minimal setup on the Oracle Fusion side (the sketch after the dashboard list below shows the kind of manual plumbing the connector replaces).

We can schedule multiple pre-configured offerings from the Incorta blueprint, depending on which modules are of interest to enable in Incorta for Oracle Cloud ERP reporting. This matrix provides a list of BICC offerings that get scheduled to support different functional areas of interest.

    • Pre-built Dashboards and Data Models for Oracle Cloud ERP: Time to value with Incorta is significantly shorter compared to doing analytics on other platforms because Incorta has a ready-to-use data pipeline, data model and pre-built dashboards specifically for Oracle Cloud ERP. The ready-to-use Cloud ERP blueprint also incorporates business schemas that enable power users to self-serve their needs for creating their own reports. The Incorta Oracle Cloud ERP blueprint includes pre-built dashboards for:
      • Financials: General Ledger, Accounts Payable, Employee Expenses, Accounts Receivable, Fixed Assets, and Projects
      • Supply Chain: Procurement and Spend, Order Management and Inventory
      • Human Capital Management: Workforce, Compensation, Absence and Payroll

In addition, pre-built dashboards include reporting on common business functions such as: Procure to Pay, Order to Cash, Bookings, Billings and Backlog.
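Returning to the connector point above: for a sense of what it automates, here is a minimal sketch of manually pulling BICC extract archives (zipped CSVs) from an OCI Object Storage bucket with the oci Python SDK. The bucket name and file layout are assumptions; with the Incorta connector, none of this plumbing is needed.

```python
import io
import zipfile

import oci

config = oci.config.from_file()  # assumes ~/.oci/config is configured
client = oci.object_storage.ObjectStorageClient(config)
namespace = client.get_namespace().data
bucket = "bicc-extracts"  # hypothetical bucket BICC writes extracts to

# List the extract archives and inspect the CSVs they contain.
for obj in client.list_objects(namespace, bucket).data.objects:
    if not obj.name.endswith(".zip"):
        continue
    payload = client.get_object(namespace, bucket, obj.name).data.content
    with zipfile.ZipFile(io.BytesIO(payload)) as zf:
        for member in zf.namelist():
            print(obj.name, "->", member)  # load each CSV into staging here
```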

  2. High Performing and Scalable to Handle Billions of Rows

If you are familiar with data warehouses and BI solutions, you are probably aware that the performance of a reporting solution is key to its success. Performance here includes both the data layer, where data refreshes must happen in a timely manner, and front-end reporting response times. If the business is unable to get the information required to drive decisions in a timely manner, the reporting platform has failed its purpose. Therefore, laying a solid foundation for an enterprise-wide reporting solution must have performance and scalability as key criteria.

What I like about Incorta is that it is not only a data visualization or reporting platform, but a scalable data storage and optimized data querying engine as well. With Incorta we don’t need to set up a third-party database (data warehouse) to store the data. Incorta handles the storage and retrieval of data using data maps that offer very quick response times. Previously, with a data warehouse, when a table (like GL journals or sales invoices, for example) grew beyond a few million rows, you would need to consider performance optimization through techniques like archiving, partitioning, indexing, and even adding several layers of aggregation to enhance reporting performance. All of these activities are time-consuming and hinder productivity and innovation. These traditional performance optimization concepts are no longer needed, as Incorta can easily handle hundreds of millions and even billions of rows without additional levels of aggregate tables.

  3. Support for Multiple Data Source Applications

It is often the case that analytics encompasses information from multiple applications, not just Oracle Cloud ERP. A couple of things to consider in this regard:

    • Multiple ERP Applications: The migration of an on-premises ERP application to Oracle Cloud ERP is not necessarily a single-phased project. The migration may well consist of multiple sequential phases, based on different application modules (GL, AP, AR, Projects, Procurement, etc.) or on staggered migrations for different entities within the same organization. Consequently, it is often the case that the ERP reporting solution needs to simultaneously support reporting from other ERP applications besides Oracle Cloud ERP. A typical use case is to source GL data from Oracle Cloud ERP while sub-ledger data is sourced from an on-premises application like EBS. Another common use case is to combine data for the same ERP module from both Oracle Cloud ERP and EBS. Incorta allows for multiple schemas to be mapped to and loaded from various applications besides Oracle Cloud ERP. Incorta then handles the union of data sets from multiple schemas so they can be reported against seamlessly within the same report (a toy sketch of such a union follows this list).
    • Cross-functional Reporting: Along the same lines, there is often a need to report on ERP data in conjunction with data external to ERP, such as from Sales, Planning, Marketing, Service, or other applications. With a rich list of supported connectors and accelerator blueprints for various source applications, Incorta can connect to and establish separate schemas for each application of interest. Data objects loaded into Incorta can then be joined across schemas and mapped appropriately to enable reporting on information from various source systems.
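As a toy sketch of the union described above, the snippet below stacks GL balances from two hypothetical source schemas. Incorta performs the equivalent union internally, so the code is purely illustrative.

```python
import pandas as pd

# Hypothetical GL rows landed from each source application's schema.
gl_cloud_erp = pd.DataFrame({
    "ledger": ["US Primary"], "period": ["2022-04"], "balance": [1_500_000.0],
})
gl_ebs = pd.DataFrame({
    "ledger": ["EMEA Primary"], "period": ["2022-04"], "balance": [900_000.0],
})

# Tag each source and union the sets so one report can span both ERPs.
gl_cloud_erp["source"] = "Oracle Cloud ERP"
gl_ebs["source"] = "EBS"
gl_all = pd.concat([gl_cloud_erp, gl_ebs], ignore_index=True)

print(gl_all)
```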

If you’re on your journey to Oracle Cloud ERP and wondering what to do with your legacy data warehouse and reporting platforms, I encourage you to reach out for a consultation on this. The Perficient BI team is highly experienced with ERP projects and has helped many customers with their upgrades and analytics initiatives leveraging a diverse set of technology vendors and platforms.

Save the Date for Kscope22
https://blogs.perficient.com/2022/04/06/save-the-date-for-kscope22/ (Wed, 06 Apr 2022)

Kscope22, ODTUG’s user conference known for providing a rich opportunity for networking and education on Oracle technologies, is taking place June 19 – 23 in Grapevine, TX. Why Attend? If great technical content that ODTUG is famous for including Oracle symposiums, hands-on labs, and 200+ technical sessions isn’t enough, ODTUG has crafted justification letters to help you make your case!

Perficient is proud to be a silver sponsor of Kscope22. Visit us at booth #315 to meet with subject matter experts and thought leaders and learn how we’ve leveraged our extensive expertise in Enterprise Performance Management (EPM), and Data and Analytics, to drive digital transformation for our customers.

For two decades we have provided Oracle solutions expertise in EPM, Enterprise Resource Planning (ERP), supply chain management (SCM), human capital management (HCM), and analytics. Unlike boutique firms that specialize in one or two offerings, our investment in and commitment to our Oracle partnership is extensive, with 15 Oracle specializations, integrated IP assets, and best practices gained from implementing Oracle solutions within our own company. We have delivered strategy and implementation for on-premises, cloud, and hybrid solutions to meet the unique needs of our clients. Perficient owns and operates an education center in Houston and delivers and resells Oracle EPM training in partnership with Keyteach. We also offer a post-implementation managed services offering.

We have three speakers at this year’s conference, on topics ranging from enterprise planning to financial close!

Data Science + Machine Learning Accelerating Financial Planning and Analysis
Tuesday, June 21st | Session ID 5964 | Track: Enterprise Planning | 2:30 – 3:30

Data science? Machine learning? Most people associate those terms with the latest in artificial intelligence, or robotics…but finance?

Oracle EPM Cloud has introduced a new set of breakthrough capabilities named Intelligent Performance Management (IPM) that uses data science, machine learning, and pattern recognition technology to make data analysis faster and more intelligent.

In this session we’ll discuss the primary types of “insights” and how they can be effectively leveraged in financial planning and analysis. We’ll then use this as a setup for a brief demonstration of these features in the application. We’ll walk through a standard budgeting and forecasting cycle and show the attendees where the Insight Discovery process can add tremendous value to your data analysis efforts. Lastly, we’ll wrap up with a few implementation considerations, so you can leverage this exciting new technology in your organization.

Oracle Enterprise Planning Management Case Study for Implementing Enterprise-wide Application for a Multi-pillar Academic Healthcare Organization
Wednesday, June 22nd | Session ID 6083 | Track: Strategic EPM | 10:15 – 11:15

Can Oracle EPM be used to transform a large health organization, providing improved data agility, better reporting and analytics, and more efficient financial planning? Absolutely! Come one, come all and hear how Perficient did just that for a large nonprofit academic medical center focused on healthcare, education, and research with over $13 billion in revenue.

In this session we will cover the project’s phased approach for building an annual plan, strategic long range plan, and annual forecast for different lines of business – practice shield, education and research with multiple EPBCS applications.

We will cover all project strategy, driver-based and non-driver-based planning, out-of-box financials and Workforce modules, flexibility to choose varying levels of planning and multiple applications and inter application integrations.

We will cover varying levels of planning between different business areas of the organization, as well as target setting and allocations of shared services expenses across different areas using PCMCS. We will also discuss future enhancements that we envision for this enterprise-wide planning solution.

How to Minimize Consolidation Times for Large Data Sets in Oracle Cloud Consolidations 
Wednesday, June 22nd | Session ID 6036 | Track: Financial Close | 10:15 – 11:15

Allegis Group is a large, multi-national talent management firm with over 19,000 employees and $13.4 billion in revenue. Allegis had a robust, mature on-premises HFM environment, but wanted to migrate to the Cloud. Allegis migrated from HFM to Oracle Cloud Consolidations (FCC). During the course of the migration, it became apparent that the large Allegis dataset was not consolidating in the cloud as fast as it was in the on-premises application and the consolidation duration exceeded the expectations of the business.

Learn about the steps that were taken to improve consolidation times, including application design considerations, hierarchy considerations, and Oracle application updates. The Allegis consolidation application was called “one of the largest, if not the largest” that some of the Oracle developers had seen. The steps taken for Allegis do scale to other, smaller Oracle Cloud Consolidation applications, and this session would be beneficial to all companies considering or currently implementing this tool. Meet the Allegis Financial Systems Administrator/Solution Architect and the implementation partner project manager, and hear the real-world challenges and solutions implemented during the course of the project.

If you’re not able to attend Kscope in-person, but would like to learn more about any of our topics, please reach out to us.

[Webinar Recording] Power Business Insights With Machine Learning and Oracle Analytics
https://blogs.perficient.com/2021/11/12/webinar-recording-power-business-insights-with-machine-learning-and-oracle-analytics/ (Fri, 12 Nov 2021)

This past week, Mazen Manasseh, business analytics director, Perficient, presented Power Business Insights With Machine Learning and Oracle Analytics.

Forward-thinking organizations are increasingly using machine learning (ML) to increase the level of predictive analytics that they can perform on datasets. The ML capabilities of Oracle Analytics are not only rich, but easy to use with built-in drag-and-drop functions onto visualizations, autonomous prediction execution, and custom-trained ML models.

Click to learn:

  1. How Oracle Analytics can serve as a single tool to connect you to your various data sources
  2. How to apply ML without being statistically savvy
  3. How to easily build your story in presentation format
[Webinar] Power Business Insights With Machine Learning and Oracle Analytics
https://blogs.perficient.com/2021/10/22/webinar-power-business-insights-with-machine-learning-and-oracle-analytics/ (Fri, 22 Oct 2021)

Forward-thinking organizations are increasingly using machine learning (ML) to increase the level of predictive analytics that they can perform on datasets. The ML capabilities of Oracle Analytics are not only rich, but easy to use with built-in drag-and-drop functions onto visualizations, autonomous prediction execution, and custom-trained ML models.

Join us to learn how Oracle Analytics can serve as a single tool to connect you to your various data sources, how to apply ML without being statistically savvy, and how to easily build your story in presentation format.

Come prepared with your questions for Perficient’s team. If you’re unable to attend the live event, all registrants will receive links to the presentation materials and a recording of the on-demand webinar post-event. Register today!

Seamless Integration of Oracle Fusion Apps Data with OAC Data Replication
https://blogs.perficient.com/2021/09/21/seamless-integration-of-oracle-fusion-apps-data-into-a-data-warehouse-with-oac-data-replication/ (Tue, 21 Sep 2021)

One of the easiest and lowest-maintenance approaches to feed Fusion Cloud Apps data into a data warehouse is with Oracle Analytics Cloud (OAC) Fusion Business Intelligence Cloud Connector (BICC) Data Replication. Data Replication for Fusion Apps is a native feature of OAC Enterprise Edition. If you’re migrating your on-premises application, such as E-Business Suite, to Oracle SaaS, you probably already realize that the capability to directly connect to the Oracle SaaS transaction database for data extraction isn’t generally available, except for limited use with BI Publisher-type reporting. However, Fusion BICC offers a robust approach to data extraction from Fusion Apps. BICC extracts data from Fusion App view objects into files stored on Oracle Cloud (OCI). OAC’s data replication from BICC facilitates the process of configuring, scheduling, and monitoring the whole flow of extracting data from BICC into cloud storage and then importing the same data into a data warehouse. While this doesn’t offer ETL-like functionality, it does streamline the end-to-end process of extracting data from Fusion Apps into relational table structures in an Oracle database. These target relational tables can then either be directly reported on or transformed for more complex analytics.

Here are, in my opinion, the key advantages of using OAC Data Replication from Fusion Apps:

  • Oracle Managed: It is a built-in feature of OAC and therefore requires no software install or maintenance. Data Replication jobs are configured, scheduled, and monitored entirely from within an internet browser, within the OAC portal.
  • Supports Extracts of Custom Fusion BICC Offerings and Custom PVOs: If your Oracle SaaS implementation team has set up custom data sets in Fusion, and therefore ended up with custom View Objects, these may also be exported with OAC Data Replication.
  • Filter the Data Extraction: The configuration screens of OAC Data Replication allow for setting up filters that are enforced while the Fusion data is extracted. These filters may be useful to avoid pulling in older data, or to segregate organization-specific information into dedicated target databases.
  • Support for Incremental Loads: Setting up an incremental data extract strategy from all the different Fusion data source views is normally a time-consuming task. However, this is all easy to set up with the configuration screens available in OAC Data Replication.
  • Handles Deletes: BICC doesn’t natively identify which records in a source view were deleted, but it has a mechanism to extract the primary keys of the views in their current state. These keys then have to be compared to the keys in the data warehouse target table in order to identify any deleted records and process the deletes. This whole process is performed automatically by OAC Data Replication by checking a box on the OAC configuration screen for a view object (a sketch of this key comparison follows this list).
  • Track Historical Changes: Some Fusion PVOs keep track of changes as certain attributes are updated over time. OAC Data Replication offers an option to maintain these changes in the data warehouse as well. This option enables linking fact data to dimensions that behave in a similar manner to traditional slowly changing dimensions (SCDs).
  • Scheduling of Data Loads: OAC Data Replication allows for scheduling the data extracts and loads from Fusion Apps at various intervals. This may be once a day, but also multiple times throughout the day. In fact, more essential data extracts can be configured to run as frequently as hourly, to offer near real-time reporting when required on smaller data sets.
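To make the delete-handling mechanics concrete, here is a minimal pandas sketch of the key comparison: primary keys extracted from the source view are compared against keys in the warehouse target, and rows missing from the source are treated as deletes. Table and key names are assumptions; OAC Data Replication performs this reconciliation for you when the option is checked.

```python
import pandas as pd

# Hypothetical key sets: current primary keys from the BICC view extract,
# and keys already present in the warehouse target table.
source_keys = pd.DataFrame({"invoice_id": [101, 102, 104]})
target_keys = pd.DataFrame({"invoice_id": [101, 102, 103, 104]})

# Anti-join: keys in the target but no longer in the source were deleted upstream.
merged = target_keys.merge(source_keys, on="invoice_id", how="left", indicator=True)
deleted = merged.loc[merged["_merge"] == "left_only", ["invoice_id"]]

print("Rows to delete from the warehouse:")
print(deleted)
```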

While OAC Data Replication for Fusion Apps does offer some great functionality, there are restrictions that may render it unsuitable, depending on how you envision the holistic view of your future state data warehouse. Here are some of the reasons why it may not be adequate:

  • Lack of Data Transformation: As the name implies, OAC Data Replication solely offers an easy way to replicate Fusion SaaS data into a data warehouse. It doesn’t allow for data transformation prior to the data load. Think of it this way: the result of the replication is a populated staging area of a data warehouse. If you need to apply transformation, another technique needs to be used after the Fusion data is replicated over. For instance, OAC Data Replication itself won’t be able to merge and transform the Fusion-sourced data based on information sourced from outside Fusion. To do this, first replicate the Fusion data, then integrate it with the non-Fusion data as a separate downstream process (a toy sketch of such a downstream step follows this list).
  • Restriction to Load into Oracle Databases: OAC Data Replication from Fusion Apps only loads data into an Oracle Database or an Oracle Autonomous Database. So if, for example, you need to get Fusion data into Azure or another non-Oracle database, you will have to follow a two-step process: first replicate into an Oracle DB, then move the data to its final destination. As a result, if your main destination is non-Oracle, you may want to consider one of the other approaches to extracting data from Fusion Apps, as described in my other blog post.
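As a toy sketch of such a downstream step, the snippet below enriches a replicated Fusion staging table with non-Fusion data after the load completes. All names and values are invented for illustration.

```python
import pandas as pd

# Hypothetical staging table populated by OAC Data Replication.
fusion_invoices = pd.DataFrame({
    "invoice_id": [101, 102],
    "currency": ["EUR", "GBP"],
    "amount": [500.0, 300.0],
})

# Non-Fusion data arriving from a separate source (e.g., an FX rates feed).
fx_rates = pd.DataFrame({
    "currency": ["EUR", "GBP"],
    "usd_rate": [1.08, 1.27],
})

# Downstream transformation: join and derive after the replication completes.
enriched = fusion_invoices.merge(fx_rates, on="currency", how="left")
enriched["amount_usd"] = enriched["amount"] * enriched["usd_rate"]

print(enriched)
```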