Smart Push is a Forms feature available in Planning, Financial Consolidation and Close, and Tax Reporting that moves data from one cube to another. This movement, or push, occurs instantly during execution (click here for options on configuring when the push occurs). Smart Push uses Data Maps to map source and target dimensions. Oracle provides a really good video explaining the configuration on their YouTube channel. I highly recommend checking it out!
Until now, Smart Push has been limited to the cubes within a single application. But starting this month, customers have the ability to push data across applications!
Setting this up is simple… within the Data Map definition, click “Select Remote Cube” in the Cube Name drop-down:
Please note, only the Planning Modules and FreeForm Planning are available as both a source and a target. The other applications are target-only.
Data Maps now also provide the ability to define much more granular mappings between source and target dimensions, permitting far more flexibility in data movements between cubes. The available types of mappings are as follows:
Additional information on configuring these types of mappings within Data Maps can be found here.
The EPM Task Manager gives customers the ability to monitor and report on the various activities within the planning process. While the Task List is a well-known feature in Planning, the new EPM Task Manager adds the following advanced capabilities:
Additional information on setting up EPM Task Manager and its capabilities can be found here!
The EPM Task Manager can be enabled via the application creation wizard. Please note, applications currently using Task Lists will not be affected by this new feature.
Are you looking to supplement your Oracle EPM applications with enhanced analytics capabilities?
If you find yourself trying to achieve one of the following, this blog post introduces you to a Perficient solution that leverages the Incorta Direct Data Mapping technology.
Perficient has partnered with Incorta to provide integrated dynamic reporting across both Oracle EPM and ERP sourced data sets. The Incorta Oracle EPM Connector enables direct connectivity into Oracle EPM applications. Perficient’s Incorta blueprint enables you to leverage the Incorta EPM Connector to not only import EPM metadata and data into Incorta, but also model it together with ERP sourced detail transactions, such as General Ledger Account Balances, Journal Lines and Sales Invoices. The coexistence of EPM and ERP data sets in Incorta enhances financial reporting by providing richer insights across several financial applications.
How can Incorta Supplement Oracle EPM Reporting?
Following are some of the key benefits available in Perficient’s Incorta blueprint for financial analytics.
Implementing Incorta for Oracle EPM and ERP General Ledger and Sub-ledger reporting delivers several other benefits as well:
To get a closer look at this solution and see a live demo of the Incorta dashboard, you may reach me at mazen.manasseh@perficient.com.
Financial analysts are often looking for a daily tracker of their organization’s sales. Daily sales tracking requires benchmarking against budgets and forecasts. However, plans are often defined at a higher level than actuals. In addition, while actuals are available from ERP, plans are maintained in a separate application for Enterprise Performance Management (EPM).
If you would like to learn more about how to consolidate daily sales tracking to incorporate actuals from ERP and plans from Oracle EPM, join this OATUG Online Forum session scheduled for next Tuesday.
In this session I will talk about how to leverage Oracle Analytics to directly source data from Oracle EPM Cloud or Hyperion. Several sample dashboards will be presented to enable dynamic reporting against ERP and EPM data. I will also cover how to connect to and model data from EPM (such as PBCS, HFM, Hyperion Planning, and Essbase) together with data warehouse actual sales data from ERP or Sales.
Interested in enabling OAC or OAS/OBIEE reporting against EPM?
Check out my two other related blog posts:
See How Adding Oracle Analytics Can Elevate Your EPM Experience
Supplement Oracle EPM with Oracle Analytics and Autonomous Data Warehouse in 10 Weeks
More upcoming OATUG online sessions on Oracle Analytics and EPM are available here.
Are you looking to supplement your Oracle EPM applications with enhanced analytics capabilities?
At Perficient, we have successfully implemented analytics solutions that complement both on-premises Hyperion applications as well as Oracle EPM Cloud business processes: Planning and Budget Cloud Service, Financial Consolidation & Close Service, and Essbase Cloud. Our methodology requires minimal data movement leveraging direct connectivity from Oracle Analytics Cloud (OAC) or Server (on-premises OBIEE).
Following are some of the common solutions we have helped establish for several of our clients:
What to Expect During the 10 Weeks
The Perficient team can implement our solution offering in 10 weeks. Here is what you would expect to happen during that time:
Perficient’s service offering lays the foundation for an analytics road map that combines EPM, OAC and ADW in a complementary and value-add manner.
I will be presenting on a closely related topic: Daily Sales Tracking for Financial Analytics. If you are interested in exploring your options with a similar solution, leave me a message.
You may also want to check out 6 must-attend sessions at Collaborate. Attend sessions, enter to win an Amazon gift card, and stop by booth #641 and drop off your entry form found in the OATUG Coupon Book for your chance to win an Oracle EPM Upgrade vs. Cloud Migration Workshop!
The Oracle EPM suite provides best-in-class business process applications for activities such as financial planning, consolidation, and period-end close. Organizations are often looking for ways to easily incorporate financial information from EPM applications into consolidated reporting with other systems (such as ERP, Sales, HR, and others). There is also great value in enabling navigation from summarized financials in EPM to more detailed transaction-level data in other applications. Together, Oracle Analytics Cloud (OAC) and Autonomous Data Warehouse (ADW) supplement the Oracle EPM suite by providing an integrated view of EPM financials with other applications’ data.
OAC is an enterprise-level analytics platform that is well-known for its highly governable data modeling capability, thus providing reliable and company-wide standardized reporting. At the same time, OAC offers the agility often needed by business analysts to perform self-service data integration, data preparation and reporting. And ADW offers the fastest time to value when it comes to data consolidation across multiple sources for cross-functional analytics.
At Perficient, we have successfully implemented several analytics solutions that seamlessly integrate with Oracle EPM. Our methodology requires minimal data movement, leveraging direct connectivity from OAC to either EPM Cloud suite or on-premises Hyperion. Following are some of the benefits that are key to this solution:
In my next blog, I will present our quick start service offering to supplement Oracle EPM with Oracle Analytics and Autonomous Data Warehouse.
I will be presenting on a closely related topic: Daily Sales Tracking for Financial Analytics. If you are interested in exploring your options with a similar solution, leave me a message.
You may also want to check out 6 must-attend sessions at Collaborate. Attend sessions, enter to win an Amazon gift card, and stop by booth #641 and drop off your entry form found in the OATUG Coupon Book for your chance to win an Oracle EPM Upgrade vs. Cloud Migration Workshop!
Using substitution variables to store folder directories and then referencing the variables in calc scripts/business rules has long been a best practice in the world of on-prem applications. For example, using a variable to store the path to a shared folder for a DATAEXPORT command:
DATAEXPORT "File" "|" "&ExportServer/Sample.TXT" "#mi";
But cloud applications do not have visibility into or access to on-prem servers, folders, shared drives, and so on, so users may encounter the following error when launching a data export rule in PBCS:
Cannot calculate. Essbase Error(1005000): Ascii Backup: Failed to open…
Note: The ExportServer substitution variable is set to a directory on an on-prem server.
It’s worth noting that simply validating the rule will NOT identify the error (see below); the rule has to be launched in order to encounter the problem.
The rule(s) must be updated to place the export file in the Inbox/Outbox Explorer within PBCS. This can be accomplished by replacing the substitution variable with /u03/lcm/.
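For example, the DATAEXPORT command shown earlier would be rewritten as follows (Sample.TXT is just the illustrative file name from above):

DATAEXPORT "File" "|" "/u03/lcm/Sample.TXT" "#mi";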
Users can then download the file from the Inbox/Outbox Explorer and place it in the appropriate location. Or better yet, an EPM Automate script can easily be written to run the rule and download the file.
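A minimal sketch of such a script, assuming a rule named ExportRule writes Sample.TXT to /u03/lcm/ (both names are placeholders):

REM Log in, run the export rule, download the file from the Inbox/Outbox Explorer, log out
CALL epmautomate login %user% %password% %url% %domain%
CALL epmautomate runbusinessrule ExportRule
CALL epmautomate downloadfile Sample.TXT
CALL epmautomate logout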
To read more on PBCS Business Rules, click here.
About EPM Automate
The EPM Automate utility enables Service Administrators to automate many repeatable tasks, including the following:
You can create scripts that are capable of completing a wide array of tasks and automate their execution using a scheduler. For example, you can create a script to download the daily maintenance backup from environments to create local backups of your artifacts and data.
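For instance, a minimal sketch of such a local-backup script might look like the following; the daily maintenance backup is exposed as the “Artifact Snapshot” file, and the login values are placeholders:

REM Log in and pull down the daily maintenance backup for local safekeeping
CALL epmautomate login %user% %password% %url% %domain%
CALL epmautomate downloadfile "Artifact Snapshot"
CALL epmautomate logout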
EPM Automate installation procedure
From the menu, select Tools, then Install EPM Automate (for Windows).
EPM Automate can also be downloaded from the Simplified Interface
From the top-right menu, select Download.
EPM Automate can be installed on Linux and Windows! The step-by-step instructions can be found below.
For the purposes of this demonstration, we will be using the Windows installer.
1. Open your PBCS instance and log in as normal.
2. Access ‘Settings and Actions’ by clicking your user name in the top right-hand corner of the Home page.
3. Click Downloads.
4. In the Downloads page, click Download for Windows/Linux in the EPM Automate section.
5. When prompted, choose to save the file to your computer.
6. Once downloaded, right-click the ‘EPM Automate.exe’ file, and select Run as administrator.
7. Follow on-screen prompts to complete the installation.
By default, the EPM Automate utility is installed in C:\Oracle\EPM Automate.
Reference for this article can be found here.
With the shift to the cloud, on-premises Essbase applications may be migrated to Oracle Analytics Cloud (OAC) using a simple method provided by Oracle. The Essbase Lifecycle Management (LCM) process is a cost-effective and easy-to-use way for an EPM professional or technical lead to migrate applications to the cloud without recreating them. Oracle Analytics Cloud – Essbase provides the following process to migrate Essbase applications to OAC.
Oracle Analytics Cloud – Essbase provides a Lifecycle Management (LCM) command-line utility that can be used to migrate on-premises Essbase applications, folders, and elements using a .zip file for import to the cloud service.
Supported releases:
• 11.1.2.4.0nn
• 11.1.2.4.5nn
• 12.2.1, and later
1. On the application homepage, click UTILITIES and download the LCM utility (EssbaseLCMUtility.zip).
2. In the uncompressed downloaded file, run EssbaseLCM.bat (Windows) or EssbaseLCM.sh (Linux), based on the platform on which you want to run the utility.
Note: You can execute the file from any location against a remote cloud service instance.
1. Manually convert the on-premises Essbase application to Unicode before exporting it to a .zip file.
2. Run the LCM export command to download the on-premises Essbase application and its elements to the specified .zip file.
3. To import the application into the cloud service, use the Oracle Analytics Cloud – Essbase CLI tool to upload the .zip file to a target cloud service application.
4. Log into the cloud service to see the application and cube on the Applications home page.
The LCM import and export commands are described in Migrating Cloud Service Applications.
1. Download the LCM utility from the cloud service and execute it on the system running Essbase 11.1.2.4.nnn.
2. Export the required source application using the utility (see the example command after this list). This includes application-level elements.
3. In the exported .zip file, if a partition exists in the source application, then edit the partition XML to correct any partition settings.
4. Import the .zip file into the cloud service using the CLI tool.
Note: You must move server-level substitution variables, which are used in on-premises applications, to the application level prior to import.
5. After importing using CLI, perform the following in the Oracle Analytics Cloud – Essbase user interface:
• Assign calculation scripts to relevant users
• Assign cloud-based user roles to users
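As an illustration of the export in step 2, an LCM utility invocation might look like the line below. The flag names and values here are assumptions on my part, so verify the exact syntax against the utility’s built-in help for your release:

EssbaseLCM.bat export -server myhost:1423 -user admin -password xxxx -application Sample -zipfile Sample.zip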
In a recent Perficient webinar, How Western Alliance Bank is Innovating with Oracle Cloud, attendees were treated to not one, but two speakers from Western Alliance Bank! Andrew Boucher, vice president of market risk management, and Valentino Hafalia, vice president of FP&A, discussed the challenges with the legacy environment and the benefits realized with consolidated analytics, as well as the ability to trend KPIs. Mitch Campbell, Director of Cloud Analytics at Oracle, wrapped up the presentation by taking an in-depth look at Oracle Analytics Cloud. This information-packed on-demand event is available now for download, no registration required!
In this blog post, I’ll share a few questions asked by attendees and answered by Andrew as well as Mitch.
Which other tools did you consider when evaluating a business analytics platform?
[Andrew Boucher] We went through several months of looking at various vendors prior to picking BICS. Tableau was under serious consideration. You may have heard Val has a little background and is a fan of Tableau. We also had our in-house product SAS. It’s more of a statistical programming tool, good for statistical analysis, but not really for bridging disparate data sources. Tableau was a very good tool to use in that space and came in a very close second. We were actually going to look at Microsoft Power BI, but we were not on a platform that could run that right away. Those were the major tools. There were a few others that were in the box, but we didn’t end up seeing a demo.
We chose Oracle largely because we were already on the cloud with Planning and Budgeting Cloud Service (PBCS) and when we spoke to Oracle about how much we could bridge that with the data we had in a multi-dimensional database and that we could merge with the data sources from the relational databases into the warehouse, BICS seemed to be the best solution for us.
Did you have KPIs documented when you set out to start the engagement?
[Andrew Boucher] With Perficient, yes. We worked with a proof of concept. We wanted to prove to our executives that this is something we should take a hard look at. The initial KPIs were everything that Val showed you, which is basically pricing. When we talk about banking, a lot of it is pricing. The earnings and pricing and yields we have on the assets that we own versus the liabilities that we pay on were key KPIs we wanted to do at the beginning of the engagement. Val was going through the top of the house profitability metric…Where do we earn money? How do we earn money? How efficient are we as far as expenses? That is our asset quality concern, basically the quality of the asset. How much could we actually lose in principal? Those KPIs we defined with Perficient.
Val has taken it to a different level with regard to the efficiency side – we started to bridge after the engagement. For the evaluation phase, one of the more interesting things we did was to present Perficient and a few other vendors with a set of data and graphical representations that we felt very comfortable with. They were very difficult waterfall and double bar line graphs, and we asked the vendors if they could duplicate them in the BI tool.
We’re an existing Oracle Business Intelligence Enterprise Edition (OBIEE) customer, can you describe the process to migrate from on-premises to Oracle Analytics Cloud?
[Mitch Campbell] There’s a phrase that hopefully you have heard before that we tend to discuss fairly frequently, which is simply known as ‘lift and shift.’ With the more recent releases of OBIEE, we started including the ability to ‘lift and shift’ everything from on-premises easily into the cloud with a BI archive file or BAR file. If you have a more recent version of Oracle Business Intelligence (OBI), chances are you are going to be able to take advantage of doing that quickly and easily. It might be as simple as getting a cloud instance operational, whether it’s a trial or you’ve purchased and you’re now provisioned and ready to get started. You can actually use that BAR file and simply import and it will take everything that you’ve designed not just the RPD, not just the on-premises OBIEE content, but also anything that’s in the catalogs, all the visualizations, all the charts and graphs and very quickly and easily move those directly to the cloud. The same can happen with Essbase applications. If you’re an Essbase user on-premises, you can lift and shift with a command line. Simply move your Essbase cubes directly to OAC Essbase Cloud. We’ve spent a lot of time doing that homework behind the scenes to make it an easy process and we’ve helped a lot of customers try that out as they’re starting to evaluate moving to the cloud.
Of course everyone’s data sets are individual and there may be certain things that you want to test and you’d leverage someone like Perficient to make sure you are managing the process of getting everything migrated efficiently and quickly, but we’ve made it easier than it ever has been before. I think that’s an area we feel strongly isn’t the hardest part.
I think people have a lot of heartburn from doing significant upgrades on-premises over the last 10-15 years. There’s usually a big sigh and a sense of dread remembering that it took 4-5 months. It’s simply not that hard in the cloud. Being able to move your content, tie out the numbers, and see them in the cloud is much easier than it ever has been before. As Val mentioned, there are some really nice tools like Data Sync, so if you want to set up new connections or go back to those other sources and maybe make additional data feeds into your system, it’s really easy to do that as well.
This is a continuation of my special blog series. You can read Part 1 here.
The Groovy script is split into four sections. We previously covered a short description of the sections, so here we get directly into the script.
This section imports the Java libraries that help format the payload and read the JSON response from the server.
import org.json.JSONObject
import groovy.json.JsonSlurper
The JSONObject class provides the put method, which adds the parameters to the payload.
JsonSlurper is a class that converts JSON text into Groovy data structures.
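For example, a JSON response body can be parsed and read like this (the JSON string is illustrative, and JsonSlurper comes from the Section 1 import):

// Parse a JSON response string into a Groovy map and read a field
def slurper = new JsonSlurper()
def result = slurper.parseText('{"jobId":123,"status":-1}')
println result.status   // prints -1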
This section sets up the user credentials and other information related to the REST API. Per Oracle, below are the expected formats:
1. API Version – 3
2. username – domain.username:password
3. authentication – HTTP requests require Basic Authentication: “Basic ” + javax.xml.bind.DatatypeConverter.printBase64Binary applied to the credentials (sketched below).
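A minimal Groovy sketch of this setup (the credential string is a placeholder):

// Build the Basic Authentication header from the domain.username:password string
String userCredentials = 'domain.username:password'
String basicAuth = 'Basic ' + javax.xml.bind.DatatypeConverter.printBase64Binary(userCredentials.getBytes())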
This section defines the list of common functions already developed by Oracle. Readers should have a basic understanding of the Java language to follow the Groovy functions. Below is the list of common functions provided by Oracle and used in our example.
fetchResponse – method used to read the HTTP response and convert it into lines of text.
fetchJobStatusFromResponse – method used to read the job status of the submitted business rule.
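Oracle’s actual sample bodies are not reproduced here, but a minimal sketch of what fetchResponse might look like is:

// Read the HTTP response stream into a single string, line by line
def fetchResponse(InputStream is) {
    return is.withReader { reader -> reader.readLines().join('\n') }
}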
Below are the custom methods used specifically to call the business rule; a minimal sketch follows each method’s steps.
executeRequest
1. Set up the connection using the EPBCS URL.
2. Authenticate the user with credentials.
3. Set the request properties and the request type as POST.
4. Invoke the URL and read the status code.
5. The status code is expected to be 200 for GET and 201 for POST. For any other status code, please refer to the Oracle error codes.
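A minimal Groovy sketch of the steps above, assuming basicAuth was built in Section 2 and fetchResponse is the helper sketched earlier (Oracle’s actual sample may differ):

// Open the connection, authenticate, send the payload, and check the status code
def executeRequest(String urlString, String method, String payload, String basicAuth) {
    HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection()
    conn.setRequestMethod(method)
    conn.setRequestProperty('Authorization', basicAuth)
    conn.setRequestProperty('Content-Type', 'application/json')
    if (payload) {
        conn.doOutput = true
        conn.outputStream.withWriter { it.write(payload) }
    }
    int status = conn.responseCode   // expect 200 for GET, 201 for POST
    return fetchResponse(conn.inputStream)
}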
executeBusRule
1. Format the URL string to be passed.
2. Split the parameters if there is more than one parameter.
The program explained in this blog has only one parameter, but I have added code to make sure the method supports no parameters or more than one.
3. Call the executeRequest explained above.
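And a sketch of executeBusRule, assuming serverUrl, apiVersion, and appName hold the Section 2 values; the jobs endpoint shown is the standard Planning REST URL, but treat the details as illustrative:

// Build the jobs payload for the rule and delegate to executeRequest
def executeBusRule(String ruleName, Map params) {
    JSONObject payload = new JSONObject()
    payload.put('jobType', 'RULES')
    payload.put('jobName', ruleName)
    if (params) payload.put('parameters', new JSONObject(params))   // supports no parameters or many
    String url = serverUrl + '/HyperionPlanning/rest/' + apiVersion + '/applications/' + appName + '/jobs'
    return executeRequest(url, 'POST', payload.toString(), basicAuth)
}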
This is the last main section, which actually calls the business rule with parameters. The screenshot below clearly shows that the Push_Data rule is invoked with an Entity parameter.
The code can be reused with a few tweaks in Section 2 and Section 4. Validate and run the program in the Groovy console to invoke the business rule. Additional logic for deriving the parameters can be added in Section 4. The screenshot below shows the email sent by EPBCS on successful completion of the business rule.
Introduction
EPM Automate plays a vital role in automating many admin and user tasks in Oracle Hyperion Cloud. This blog post demonstrates how EPM Automate can help in troubleshooting issues during the support phase. This method is useful when we have more than one instance, such as Development and a SIT/UAT/Prod instance. The following explains the scenario and provides the complete automation script, which can be used directly in similar instances.
Scenario
The following approach is based on the assumption that there is a Development instance and a UAT instance where a user reports issues. Below is the entity dimension:
During UAT/Prod testing, we may have issues reported by various users, whether data issues or business rule issues. Consider a user who supports the America entity and reports an issue with a business rule; the following operations cannot be performed in the instance where the issue was reported:
1. Updating data directly in the forms.
2. Running any business rule as it may update the user data.
3. Taking a level zero backup and updating the Development environment.
To elaborate further on point #3, we cannot just copy the level zero backup data from UAT/Prod to the Development instance, as it may affect other developers’ data.
How do we then resolve this problem in a way that keeps the user comfortable?
We have two prerequisites to implement this solution. They will not be covered in extensive detail, but well enough to convey the idea behind this solution.
Below are the two prerequisites:
Prerequisite-1: Business Rule (User Reported Instance)
Create a business rule that accepts Entity and Month as run-time prompts. The business rule should extract the level zero data from the instance where the user reported the issue. Store the file in the FDMEE inbox folders.
**************************************************************************************************************************
SET DataExportOptions
{
DATAEXPORTCOLFORMAT ON;
DATAEXPORTDIMHEADER ON;
DATAEXPORTDRYRUN OFF;
DataExportRelationalFile OFF;
DataExportNonExistingBlocks OFF;
DataExportLevel ALL;
DATAEXPORTCOLHEADER "Period";
DATAEXPORTOVERWRITEFILE ON;
DataExportDynamicCalc OFF;
DataExportDecimal 2;
};
FIX (@Relative("Account",0),{Entity},{Month},@Relative("Scenario",0),@Relative("Version",0),"FY18",@Relative("Projects",0))
DATAEXPORT "File" "," "/u03/inbox/inbox/EPM/export_entity.csv" "#";
ENDFIX
**************************************************************************************************************************
Prerequisite-2: FDMEE Data Rule (Development Instance)
Create a data rule in FDMEE to load the data extract file to Development instance. Below are the steps to create the data rule:
1. Create an Import format based on the Extract file created in Prerequisite-1.
2. Create data mappings to map the source and target dimensions.
3. Create a data rule to load the data into development instance.
After setting up the prerequisites, it is now time to write the automation batch script. Below are the steps to be written into the batch script:
1. Enter the dimension information to be passed to the Extract business rule.
2. Run the business rule to extract the data for the specific entity.
3. Download the data file from UAT/Prod Instance.
4. Upload the data file to development instance.
5. Run the data rule to load the data.
The following script has been completely tested and works well in our PBCS environment. The steps above are marked in the comments of the EPM Automate script below for better understanding.
EPM Automate Script
**************************************************************************************************************************
REM SET variables for development and UAT Instance.
SET p_url=https://fastforward-perficient.pbcs.us2.oraclecloud.com
SET p_domain=perficient
SET p_user=don.ford@perficient.com
SET p_password=xxxxx
SET d_url=https://fastforward-test-perficient.pbcs.us2.oraclecloud.com
SET d_domain=perficient
SET d_user=don.ford@perficient.com
SET d_password=xxxxxx
CALL epmautomate login %p_user% %p_password% %p_url% %p_domain%
REM Get the parameters for the dimension – Step 1
SET /P Entity=Enter Entity:
SET /P Month=Enter Month:
REM Call the Extract data rule to get the data of the Entity for the specific month. – Step 2
CALL epmautomate runbusinessrule Extract Entity=%Entity% Month=%Month%
REM Download the file – Step 3
CALL epmautomate downloadfile inbox/EPM/export_entity.csv
CALL epmautomate logout
REM Login to Development environment
CALL epmautomate login %d_user% %d_password% %d_url% %d_domain%
REM Upload the file – Step 4
CALL epmautomate uploadfile export_entity.csv inbox/EPM
REM Call the Data load rule to copy the data to Development Environment – Step 5
CALL epmautomate rundatarule Budget_load %Month% %Month% REPLACE REPLACE_DATA export_entity.csv
CALL epmautomate logout
**************************************************************************************************************************
The script above may require a few modifications based on the dimensions and FDMEE data folders. Save the script as a *.bat file on your local machine and run it in Admin mode.
Steps to run the script
1. Save the EPM Automate script *.bat file.
2. Run the script in Admin mode.
3. Enter the Entity America and Month Jan.
4. The script completes successfully.
Data specific to the entity America gets copied to the Development instance. The developer can then start working on the issue the user is facing in the UAT/Prod instance.
Key Benefits
1. Developers can test the business rule/data issue in Development with data similar to that with which the user reported the issue.
2. Users can see the results before the changes are pushed to UAT/Prod.
3. No need to overwrite all level zero data in any instance.
Our client is a leading healthcare provider with 13 clinical departments, each with multiple entities and varying degrees of complexity. They implemented Hyperion Planning in 2012 and have experienced significant growth since then as a result of expansion into the community and on their main campus. There was a growing need to provide the executive office with greater insight into financial performance and more flexibility in budgeting. Too much time was being spent processing data rather than analyzing it.
On-premises challenges included various spreadsheets generated and distributed by corporate, which caused bottlenecks getting data into Hyperion. The provider didn’t have a “real-time” budgeted bottom line. Instead, business rules were bundled into one all-inclusive rule set that was only run twice a day during the budget cycle, as it took 2 to 3 hours to complete. A complicated matrix of forms for user input resulted in inconsistent naming conventions and unused forms and business rules, among other things. In addition, the finance team received little support from IT to keep the environment current.
We implemented Oracle Planning and Budgeting Cloud Service (PBCS), delivering a single integrated platform across corporate and practices to streamline the planning process as well as centralized common reporting and administration. The new planning system improved the user experience, refined labor planning processes, and allowed for a flexible driver-based budget. Read the full story to learn how Oracle PBCS can enhance financial and workforce planning.