EPM Articles / Blogs / Perficient https://blogs.perficient.com/tag/epm/

Monarch Bioenergy Leverages Oracle Cloud to Support Future Growth https://blogs.perficient.com/2024/03/05/monarch-bioenergy-leverages-oracle-cloud-to-support-future-growth/ (Tue, 05 Mar 2024)

Monarch Bioenergy needed to create core Finance, Supply Chain, and Project Management processes that would support alternative energy commodity billing and operations.

Achievements:

  • Implemented Oracle Cloud ERP Financials, Oracle Cloud SCM Procurement and Inventory, Oracle Project Portfolio Management, and Oracle Cloud EPM.
  • The solution went live in 2023 after a seven-month implementation using Perficient’s Agile Based Cloud Implementation Framework.
  • Sunset legacy processes in ViewPoint
  • Transformed and converted Inventory Items, Customers, Suppliers, Fixed Assets, and Projects from ViewPoint to Oracle Fusion
  • Implemented the Oracle Procure-to-Pay solution to support Monarch’s business processes
  • Implemented Oracle Project Portfolio Management to facilitate the creation, control, and tracking of Projects

Key Benefits:

  • Monarch has a clear view into business operations, improved end-of-month revenue processes, and streamlined procurement procedures to scale with future growth
  • Improved visibility into capital projects
  • Streamlined revenue forecasting
  • Allowed Monarch to determine the true cost of executing projects
  • Ability to monitor project performance
  • Captured CIP costs for capital assets
Simplify Your Workflow: Streamlining Business Processes with EPM Pipeline Scheduling https://blogs.perficient.com/2023/10/19/simplify-your-workflow-streamlining-business-processes-with-epm-pipeline-scheduling/ (Thu, 19 Oct 2023)

If you have worked with EPM Pipelines, you know how valuable they are for customers who do not have the resources for data integration automation. Pipelines have been a saving grace for streamlining tasks and data load processes within the EPM application itself. However, one important element that had been missing was a comprehensive scheduling option, and many of us found ourselves eagerly waiting for Oracle to introduce one that could give way to complete automation. In the September 2023 update for FreeForm and the Planning modules, Oracle added a Pipeline job type to the Jobs Scheduler, making data load automation possible.

 

Here’s how to go about it: 

 

After creating a Pipeline process for Data Integration in Planning Data Exchange (more on this in my recent blog), you can schedule it as an Integration Pipeline job. Here’s how I scheduled my Data Load Pipeline process to run daily and perform multiple actions in a sequence.  

 

To schedule an Integration Pipeline job:

  • In your FreeForm, Planning, or Planning Modules application, click ‘Jobs‘ under ‘Application‘.
  • Click ‘Schedule Jobs‘.
  • Select the new ‘Integration Pipeline‘ job type.
  • Define a daily schedule for the job, provide a name, and click ‘Next‘.
  • Select the Integration Pipeline that you want to schedule.
  • Select the parameters required for the Pipeline in this schedule and click ‘Ok‘.
  • Click ‘Next‘.
  • Review the scheduled job and click ‘Finish‘.
  • The scheduled Pipeline is ready to go!

And there you have it. Not only can you create a Pipeline with the multiple types of actions required to maintain your application, but you can also schedule that entire process to run in an automated fashion.
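If you ever need to kick off the same Pipeline from outside the application, for example from an enterprise scheduler you already own, the Data Integration REST API can run a Pipeline by name. The Python sketch below is a minimal, hedged example: the /aif/rest/V1/jobs endpoint, the "pipeline" job type, and the variable names shown are assumptions to verify against the EPM REST API reference for your release.

```python
# Minimal sketch: run an EPM Data Integration Pipeline by name.
# Assumptions to verify against your release's REST API reference:
#   - jobs endpoint: /aif/rest/V1/jobs
#   - jobType "pipeline" with jobName = the Pipeline code
#   - basic authentication is acceptable for your identity setup
import requests

EPM_URL = "https://your-epm-instance.oraclecloud.com"  # hypothetical URL
USER, PASSWORD = "service.account", "********"         # placeholders

payload = {
    "jobType": "pipeline",
    "jobName": "DAILYLOAD",      # Pipeline code defined in Data Exchange
    "variables": {               # runtime values for the Pipeline variables
        "STARTPERIOD": "Jan-24",
        "ENDPERIOD": "Jan-24",
        "IMPORTMODE": "Replace",
        "EXPORTMODE": "Merge",
    },
}

resp = requests.post(f"{EPM_URL}/aif/rest/V1/jobs", json=payload,
                     auth=(USER, PASSWORD))
resp.raise_for_status()
print(resp.json())  # returns a job id and status that you can poll
```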

 

Creating Controls on Process Execution using Pipelines Continue-Stop Options https://blogs.perficient.com/2023/09/28/creating-controls-on-process-execution-using-pipelines-continue-stop-options/ (Thu, 28 Sep 2023)

You can create a series of stages in an EPM Pipeline, where each stage contains one or more jobs that perform actions like data loads, rule executions, file operations, etc. By organizing these jobs and stages in a single Pipeline, EPM customers can streamline their business processes into executable “workflows”.

 

But what if one of the jobs fails, causing the corresponding stage to fail? There could be several reasons for that to happen – a connection issue, invalid credentials, or a data issue. Whatever the reason for a failure in a Pipeline, the whole workflow comes to a stop. Now, this may be exactly what needs to happen; if the first step of the process has an issue, the user may not want the process to proceed. Conversely, the process may still need to proceed despite a failure at any stage in the Pipeline. There may also be a case where the user simply wants to bypass a certain step in the process regardless of whether it succeeds or fails.

 

The good news is that there is a way to design your Pipeline to work for any and all of these scenarios! You can use “Continue” and “Stop” options while creating the stages in your Pipeline to determine how the Pipeline process should progress.

 

As of September 2023, these options are available only in Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Planning Modules, and Tax Reporting.

 

Using Pipeline Execution Options

To create execution controls on the stages of your Pipeline:

  • From the Data Integration home page under Data Exchange, click the icon to the right of the Pipeline you want to edit, and then select “Pipeline Details“.
  • Click the Stage in your Pipeline to which you want to apply execution controls. The Editor box opens on the right. In this example, the first job in the selected Stage is ‘RPT Actuals Load’.
  • You will see two new options: “On Success” and “On Failure”. You can define how the Pipeline should progress when the current Stage succeeds and when it fails. Click “On Success”.
  • In my case, if the ERP Actuals load to the RPT cube of my Planning application succeeds, I can do one of the following: a) continue to the next job in this Stage, i.e., ‘FS Actuals Load’; b) stop the Pipeline execution at this point; c) skip to the next Stage in the Pipeline, i.e., ‘Actuals Copy to Forecast’; or d) skip to the last Stage in the Pipeline, i.e., ‘Calculate Balance Sheet and CF’.
  • Make the required selection. Then click “On Failure”.
  • If the Actuals load to RPT fails, you can do one of the following: a) continue to load Actuals to the FS cube, i.e., ‘FS Actuals Load’; b) stop the Pipeline execution at this point; c) skip the Actuals load Stage altogether and proceed to the data copy Stage, i.e., ‘Actuals Copy to Forecast’; or d) skip to the last Stage in the Pipeline, i.e., ‘Calculate Forecast Balance Sheet and CF’.
  • This way, when you have a failure in any Stage of your Pipeline, you can still choose to perform other actions in your business process that are not impacted by the failed step, or you can redirect the Pipeline execution to a “cleanup” step, such as notifying Administrators of the failure via e-mail.
  • Make the required “On Failure” selection. The Pipeline is auto-saved.

Using these execution controls gives users of EPM Pipelines incredible flexibility in managing their business processes without having to use complex scripting or external tools.
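If it helps to reason about how the branching resolves at run time, here is a purely conceptual Python model of the behavior described above. It is not Oracle’s code; the stage names and the run_stage stub are made up for illustration.

```python
# Conceptual model of Pipeline stage branching (illustration only, not Oracle code).
# Each stage declares what to do on success and on failure:
# "continue" to the next stage, "stop" the run, or jump to a named stage.
CONTINUE, STOP = "continue", "stop"

stages = [
    {"name": "Load Actuals",             "on_success": CONTINUE, "on_failure": "Notify Administrators"},
    {"name": "Copy Actuals to Forecast", "on_success": CONTINUE, "on_failure": STOP},
    {"name": "Calculate BS and CF",      "on_success": STOP,     "on_failure": "Notify Administrators"},
    {"name": "Notify Administrators",    "on_success": STOP,     "on_failure": STOP},
]

def run_pipeline(stages, run_stage):
    i = 0
    while i < len(stages):
        stage = stages[i]
        ok = run_stage(stage["name"])
        outcome = stage["on_success"] if ok else stage["on_failure"]
        if outcome == STOP:
            break
        if outcome == CONTINUE:
            i += 1
        else:  # "skip to stage": jump to the named stage
            i = next(n for n, s in enumerate(stages) if s["name"] == outcome)

def run_stage(name, fail="Load Actuals"):
    ok = name != fail           # pretend the Actuals load fails
    print(f"{name}: {'success' if ok else 'FAILED'}")
    return ok

# A failure in the first stage redirects the run to the cleanup/notification stage.
run_pipeline(stages, run_stage)
```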

Mastering Data Integration: Unveiling EPM Pipeline’s Cutting-Edge Features https://blogs.perficient.com/2023/09/28/mastering-data-integration-unveiling-epm-pipelines-cutting-edge-features/ (Thu, 28 Sep 2023)

EPM Pipelines is quickly becoming a very useful addition to the arsenal of many of our Oracle customers. It is especially important for those users who do not have dedicated personnel or server-related resources to automate their daily business processes. In my recent blog, I detailed how to create a Pipeline to perform data load related activities in a workflow. In this blog, I will discuss a few additional features in Pipelines that will help you enhance your data integration experience.

Clear Cube Job Type

Users can now use a Clear Cube job type to clear all or specific data from a cube. With this job type, you can perform the following:

  • Clear data using member selection
  • Clear data using MDX query
  • Clear supporting details and comments
  • Clear attachments
  • Choose to clear physical or logical data
  • Use run-time prompts to define clear regions

 

Note that this job type is available only in Financial Consolidation and Close, FreeForm, Planning, Planning Modules, and Tax Reporting.

 

To create a Clear Cube job type:

  • Create a new job in the Stage of the Pipeline to which you want to add the Clear Cube job.
  • Select ‘Clear Cube‘ from the Job Type drop-down.
  • Select the cube to clear from the ‘Name’ drop-down.
  • Provide a ‘Title’ for the job.
  • Optionally, add run time labels and values.
  • The Pipeline is auto-saved.
  • Run the updated Pipeline by clicking the play button.
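As a side note, a comparable clear can also be scripted outside a Pipeline: Planning supports saved Clear Cube job definitions, and a saved definition can be executed through the Planning REST API. The sketch below is hedged: the /HyperionPlanning/rest/v3 path, the CLEAR_CUBE job type, and the job name are assumptions to confirm against the REST API documentation for your release.

```python
# Minimal sketch: execute a saved "Clear Cube" job definition through the
# Planning REST API. Path, job type, and names are assumptions to verify.
import requests

EPM_URL = "https://your-epm-instance.oraclecloud.com"  # hypothetical URL
APP = "PLNAPP"                                         # hypothetical app name
USER, PASSWORD = "service.account", "********"         # placeholders

url = f"{EPM_URL}/HyperionPlanning/rest/v3/applications/{APP}/jobs"
payload = {"jobType": "CLEAR_CUBE", "jobName": "ClearRPTActuals"}  # saved job

resp = requests.post(url, json=payload, auth=(USER, PASSWORD))
resp.raise_for_status()
print(resp.json())  # job id and status; poll the same jobs resource to track it
```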

 

File Operations Job Type

Users can use the File Operations job type to run the following operations at runtime:

  • Copy a file – copies the file from a source directory to a target directory and retains the original file in the source directory after the copy operation
  • Move a file – moves the file from a source directory to a target directory and does not retain the original file in the source directory after the move operation
  • Unzip a file – unzips the file in the same folder

 

Note that this job type is available only in Enterprise Profitability and Cost Management, Financial Consolidation and Close, FreeForm, Planning, Planning Modules, and Tax Reporting.

 

To create a Move File Operations job type:

  • Create a new job in the Stage of the Pipeline to which you want to add the Move job.
  • Select a ‘Connection‘ and provide a ‘Title’.
  • Make the File Operation parameter selections:
    • File Operation: Copy, Move, or Unzip.
    • Source Directory: directory from which to copy, move, or unzip the file.
    • Source File Name: name of the file to copy, move, or unzip.
    • Target Directory: directory to which the file is copied or moved. The target directory can be inbox, openbatch, openbatchml, or epminbox.
    • Target File Name: name of the file after it has been copied or moved. A target file name is not required for an “Unzip” file operation.
  • The Pipeline is auto-saved.
  • Run the updated Pipeline by clicking the play button. The file is moved to the target directory.
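To picture what Copy, Move, and Unzip do with the Source Directory, Source File Name, Target Directory, and Target File Name parameters above, here is a local-filesystem analogue in Python. It only illustrates the semantics; the real job operates on the EPM service directories (inbox, openbatch, and so on), not on your own machine.

```python
# Local-filesystem analogue of the Pipeline File Operations job (illustration only).
import shutil
import zipfile
from pathlib import Path

def file_operation(op, source_dir, source_file, target_dir=None, target_file=None):
    src = Path(source_dir) / source_file
    if op == "Copy":
        # Copy keeps the original file in the source directory.
        shutil.copy2(src, Path(target_dir) / (target_file or source_file))
    elif op == "Move":
        # Move does not keep the original file in the source directory.
        shutil.move(str(src), str(Path(target_dir) / (target_file or source_file)))
    elif op == "Unzip":
        # Unzip extracts in the same folder; no target file name is required.
        with zipfile.ZipFile(src) as zf:
            zf.extractall(src.parent)

# Example (hypothetical file names):
# file_operation("Move", "outbox", "actuals_jan.csv", target_dir="inbox")
```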

The Clear Cube and File Operations can be great additions to your EPM Pipelines for maintaining data in the cubes or file-based data integrations.

Stay Agile with EPM Pipeline: Unleash Dynamic Substitution Variables https://blogs.perficient.com/2023/09/28/stay-agile-with-epm-pipeline-unleash-dynamic-substitution-variables/ (Thu, 28 Sep 2023)

Pipeline variables are a way to feed runtime parameters to your Pipeline jobs. They can be selected from a predefined set of out-of-box variables or custom-defined. You can define values for these variables that are then used during run time by individual jobs (Read more about creating Pipeline variables in my recent blog). In this blog, I am going to discuss a neat trick for deriving your pipeline variable values from the application Substitution Variables. This way you can make your run time values dynamic instead of using a hard-coded value.

 

This option is available only in Financial Consolidation and Close, Planning, Planning Modules, and Tax Reporting as of September 2023.

Note that the substitution variable must be defined for all cubes to be used as an input value.

 

To derive a Pipeline variable value from a Substitution Variable:

  • From the Data Integration page under Data Exchange, click the Pipeline to which you want to add a new variable.
  • On the Pipeline page, open the Pipeline definition for editing.
  • Click the Variables tab and add a new Pipeline variable.
  • In the new row added at the end of the variable list, provide a Variable Name (up to 20 characters, no spaces), a Display Name for the prompt, and a Display Sequence.
  • Check Required if a value for the substitution variable is required to execute the Pipeline.
  • From the Validation Type drop-down, select ‘Text‘.
  • For the Default Value, enter &<Substitution Variable Name>, where <Substitution Variable Name> is the name of the Substitution Variable defined in the Planning application.
  • Click ‘Save‘.
  • When you run the Pipeline, the Current Mnth variable value is derived from the CurMonth Substitution Variable.

That is how you can use Substitution Variables throughout your EPM Pipeline definition so that the process runs for the right period, year, scenario, version, etc. in a given Plan cycle.
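Taking the idea one step further, the substitution variable itself can be maintained programmatically, so rolling the cycle forward automatically updates every Pipeline that references it. The Python sketch below assumes the Planning REST API’s substitution variables resource and its payload shape; confirm both against the REST documentation for your release.

```python
# Minimal sketch: roll CurMonth forward so any Pipeline referencing &CurMonth
# picks up the new value on its next run. Endpoint path and payload shape are
# assumptions to verify against your release's REST API reference.
import requests

EPM_URL = "https://your-epm-instance.oraclecloud.com"  # hypothetical URL
APP = "PLNAPP"                                         # hypothetical app name
USER, PASSWORD = "service.account", "********"         # placeholders

url = f"{EPM_URL}/HyperionPlanning/rest/v3/applications/{APP}/substitutionvariables"
payload = {"items": [{"name": "CurMonth", "value": "Feb", "planType": "ALL"}]}

resp = requests.post(url, json=payload, auth=(USER, PASSWORD))
resp.raise_for_status()
print(requests.get(url, auth=(USER, PASSWORD)).json())  # confirm the new value
```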

 

 

 

 

 

OCI Gen 2 Refresh Token Setup – TROUBLESHOOTING https://blogs.perficient.com/2023/06/09/oci-gen-2-refresh-token-setup-troubleshooting/ (Fri, 09 Jun 2023)

In the Oracle Cloud Infrastructure (OCI) / Gen 2 architecture, OAuth 2 access tokens are used to issue REST API calls to EPM Cloud, fulfilling the requirement to avoid the use of passwords in the environment. After the REST client has been registered, the refresh token is obtained using the Identity Cloud Service URL (IDCS URL), scope, and client ID. Details of the process can be found in the Oracle document.

While setting up the refresh token for a client, we worked our way through a few errors and thought we would share the knowledge. Below is an elaboration of the errors that can occur during the process and their resolutions.
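For context, the payoff of this setup is a standard OAuth 2 exchange: the stored refresh token is posted to the IDCS token endpoint and a short-lived access token comes back for use on REST calls. The Python sketch below mirrors the curl steps in the Oracle document and sidesteps the Windows "cmd /c" quoting issues described below; the /oauth2/v1/token path, the scope value, and the client-authentication style are assumptions that depend on how your client was registered, so verify them against the document.

```python
# Minimal sketch: exchange an OCI (Gen 2) refresh token for an access token at
# the IDCS token endpoint. Endpoint path, scope, and client-auth details are
# assumptions; verify them against the Oracle document for your registration.
import requests

IDCS_URL = "https://idcs-xxxxxxxx.identity.oraclecloud.com"  # hypothetical
CLIENT_ID = "your_client_id"
CLIENT_SECRET = "your_client_secret"    # adjust/omit for public clients
REFRESH_TOKEN = "stored_refresh_token"
SCOPE = "urn:opc:serviceInstanceScope"  # placeholder; use your EPM scope

resp = requests.post(
    f"{IDCS_URL}/oauth2/v1/token",
    data={
        "grant_type": "refresh_token",
        "refresh_token": REFRESH_TOKEN,
        "scope": SCOPE,
    },
    auth=(CLIENT_ID, CLIENT_SECRET),
)
resp.raise_for_status()
tokens = resp.json()
access_token = tokens["access_token"]      # send as a Bearer token on REST calls
new_refresh = tokens.get("refresh_token")  # store it; refresh tokens can rotate
```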

ERROR 1:

When logging in with EPM Automate, an error related to the .NET Framework can occur.

Cause: This error occurs because the job tries to access a feature that runs on the .NET 2.0 Framework, which is not available in .NET 4.0.

Resolution: Install the .NET Framework 3.5 feature.

ERROR 2:

An error can occur when issuing the request to the IDCS URL for generating the device code.

Cause: This error occurs because the “cmd /c” syntax was not used before the curl command.

Resolution: Use “cmd /c” before the curl command.

ERROR 3:

A similar error can occur when issuing the request to the IDCS URL for generating the device code.

Cause: This error occurs because “cmd/c” is used instead of “cmd /c” (notice the space after “cmd”).

Resolution: Use the correct “cmd /c” syntax before the curl command.

ERROR 4:

The request to the IDCS URL generates a response that contains a device code, a user code, and a verification URI. Logging into the verification URI prompts for a ‘Device Log In’, where the user code from the response payload should be entered. An error appears when you submit the code.

Cause: This error occurs if you press the ‘ENTER’ key after entering the code.

Resolution: Click the ‘SUBMIT’ button after entering the code. Do not press ‘ENTER’.

 

Oracle EDM and FreeForm Planning https://blogs.perficient.com/2022/09/29/oracle-edm-and-freeform-planning/ (Fri, 30 Sep 2022)

Oracle’s September new features list for EDM (Enterprise Data Management) contains quite a few exciting items. One particular new feature that Oracle EPM customers should be aware of is EDM’s ability to support FreeForm Planning applications. Starting this month, EDM contains a new application type specifically for FreeForm Planning applications!


 

The application registration process for FreeForm applications is very similar to that of regular Planning applications. The wizard directs the user through identifying the cube(s), application settings, and dimensions. Continue reading below as we walk through an example!

 

 

First, the names of each cube that EDM will support need to be defined.

Next, the user configures the application settings.

After application settings, the dimensions which will be managed within EDM need to be added, for example an Account dimension.

Click Next once all dimensions have been added.

Finally, the user is presented with a summary page of the configuration.

 

 

Lastly, customers who previously used the Planning app type with EDM for FreeForm applications can easily convert those to the new FreeForm app type if needed.

 

Key Features Recently Made Available in Oracle Enterprise Data Management https://blogs.perficient.com/2022/08/02/key-features-recently-made-available-in-oracle-enterprise-data-management/ (Wed, 03 Aug 2022)

The most recent release of new features for Oracle Enterprise Data Management contains some great capabilities. Continue reading below to find out more!

 

Viewpoint Querying

The ability to search for nodes within Oracle EDM has been greatly enhanced with the recent addition of Viewpoint Queries. Now, users can easily find nodes within a viewpoint where properties match specified criteria. Below are some important considerations for those users looking to use this feature.

  • All of the nodes in any viewpoint or only those below a selected node in a hierarchy viewpoint can be queried.
  • Filters can be added to narrow the scope and limit the number of query results returned for each query.
  • Queries are limited to the following properties:
    • Properties with defined values which use a Default Type = None
    • All property data types except for the Node data type
    • Certain properties in the Core and CoreStats namespaces which have stored values (e.g., WHO columns, name, description, alternate name)
    • Derived and Inherited properties are not available for selection (bummer! hopefully this is enhanced in later releases)


 

User-Friendly History for Data Integrations

Logging the execution of data integrations is a long-standing best practice for any data management solution. Oracle has taken EDM’s ability to the next level by displaying the historical executions of imports, exports, and extracts directly to the end user.


 

Uniqueness Constraint Expanded to Node Types

Oracle EDM now provides the ability to ensure property values are unique across Node Types (previously, uniqueness constraints were available only at the application and dimension levels). This feature can be used to prevent duplication across descriptions, aliases, or any other defined property.

A few things to note:

  • Planning, Planning Modules, and Financial Consolidation and Close applications already have application-specific validations that check for node name uniqueness across node types at the application level. Customers do not need to create a constraint to enforce this data rule for those application types.
  • Oracle Financials Cloud General Ledger applications already have application-specific validations that check for node name uniqueness across node types at the dimension level. Customers can add a constraint at the application level if desired to apply the node name uniqueness rule across segment value sets.
  • Properties referenced by constraints must be node-level properties of the string data type and cannot have a derived default value or be inherited.

 

Subscription Requests Item Limit Increased

Until now, the item limit within a request, regardless of request type (interactive or subscription-generated), has been 10,000. However, the limit for requests generated from subscriptions is now 12,000. This will help to ensure the synchronization of descendant nodes between source and target viewpoints when parent nodes are moved or inserted.


Enhanced Controls on Email Notifications

There are few things more annoying than being flooded with emails and notifications you care little about. This is especially true in our professional lives. EDM now provides several new system settings to help organizations better manage emails originating from requests in the application.

  • Enable or disable emails entirely.
  • Auto-add a prefix to email subjects to identify the environment from which a notification has been sent.
  • Identify a substitute recipient who should receive all notifications instead of the primary recipient(s). (Really useful for testing workflows within a development environment and helps to avoid confusion for end users).


Important April Updates for Oracle EPM Cloud Customers https://blogs.perficient.com/2022/04/15/important-april-updates-for-oracle-epm-cloud-customers/ (Fri, 15 Apr 2022)

BETTER DATA MOVEMENT ACROSS APPLICATIONS

Smart Push is a Forms feature available in Planning, Financial Consolidation and Close, and Tax Reporting which permits the movement of data from one cube to another. This movement, or push, occurs instantaneously during execution (click here for options on configuring when the push occurs). Smart Push uses Data Maps to facilitate the mapping of source & target dimensions. Oracle provides a really good video explaining the configuration on their YouTube channel. I highly recommend checking it out!

Up until now, Smart Push has been limited to the cubes within a single application. But starting this month, customers have the ability to Push data across applications!

Setting this up is simple: within the Data Map definition, click “Select Remote Cube” in the Cube Name drop-down.

Please note, only the Planning Modules and FreeForm Planning are available as both a source & target. The other applications are target-only.

 

DATA MAPS FEATURE EXPANSION

Data Maps now also provide the ability to define much more granular level mappings between source and target dimensions. This permits much more flexibility in data movements between cubes. The available types of mappings are as follows:

  • Simple Mappings: one-to-one mappings between source and target members
  • Roll-up Mappings: multiple source members mapped to a single target member
  • Multi-Dimension Mappings: mapping two source dimensions to one target dimension, or one source dimension to two target dimensions
  • Substitution Variable Mapping: a Substitution Variable can be referenced during the execution of the data map

Additional information on configuring these types of mappings within Data Maps can be found here.

 

IMPROVING PLANNING TASK MANAGEMENT

The EPM Task Manager provides customers the ability to monitor and report on the various activities within the Planning process. While Task Lists are a well-known feature in Planning, the new EPM Task Manager permits the following advanced capabilities:

  • Define the tasks and schedule to ensure the most efficient task flow
  • Automate the business process, track status, and provide notifications and alerts
  • Notify users by email for delinquencies, due dates, status changes
  • Monitor business process status from a dashboard
  • Act quickly to fix errors and delays
  • Analyze the effectiveness of the business process

Additional information on setting up EPM Task Manager and its capabilities can be found here!

The EPM Task Manager can be enabled via the application creation wizard. Please note, applications currently using Task Lists will not be affected by this new feature.

 

Where to Start for New Oracle Enterprise Data Management Customers https://blogs.perficient.com/2022/02/07/where-to-start-for-new-oracle-enterprise-data-management-customers/ (Tue, 08 Feb 2022)

We’ve all been there: our organization has decided to implement a new technology, and we’ve been tapped to ensure its success. After the feelings of anxious excitement subside, the first question that typically pops into our heads is, “where do I start?” Official (and often expensive) training is typically a person’s first thought, along with reading the available documentation online (if the vendor provides it). While these options are extremely beneficial, it can still be a challenge for users to filter their education down to the basics in order to reap the benefits of the solution from day one.

Luckily for Enterprise Data Management (EDM) customers, Oracle has recently released several checklists to help users of all roles do just that! Whether the person is an Admin, a data owner, or even an Auditor, these checklists provide a path to success for all EDM user types.


 

Clicking on the hyperlinks opens each individual checklist. The checklists have detailed step-by-step explanations of tasks for users, and within each checklist are additional links that redirect users to documentation, videos, etc. One example is the “Administer Enterprise Data Management Cloud” checklist.

 

The checklists even take it to the next step by providing information on where to find more about EDM, such as online user communities and Oracle user groups.


 

These checklists are very informative, so much so that they have something to offer every user, regardless of level of expertise. We highly recommend that experienced customers take advantage of them as well!

3 Significant New Features for Oracle EDM https://blogs.perficient.com/2021/10/13/3-significant-new-features-for-oracle-edm/ (Thu, 14 Oct 2021)

The October update for Oracle EDM (Enterprise Data Management) contains over 15 new features, many of which are intended to enhance the experience for the non-technical user. Continue reading below as we highlight the three most significant features that all EDM customers, and prospective customers, should know about.

 

Subscriptions Actions Improved

Previously, subscriptions were not able to apply property updates in certain circumstances, for example when a node’s parent did not match between the source and target viewpoints. This resulted in users needing to submit separate requests to update the various viewpoints with property updates. Beginning with October’s update, subscriptions will now apply property updates in the following situations:

  • Updates to Node level properties made with another structural action such as an Insert or Move which was not included in the Action filter
  • Updates to Node level properties made to a shared node in a hierarchy location not included in the Top Node filter but has another location which is included
  • Updates to Node level properties made in a list viewpoint or under a source parent which is different than the target parent

 

Extract Incremental Modifications

The ability to export only modifications to data has long been a trademark of a mature master data management application. This ability not only helps users in terms of data analysis, but also provides flexibility in integrations with downstream systems. Oracle EDM now provides this capability by allowing for a comparison between two time periods within an extract. The time periods are specified on the General tab of the Extract definition.

 

Additionally, users can select which actions to include in the extract.

 

Blockout Capabilities

It’s not uncommon for organizations to prevent updates to dimensions, hierarchies, reference lists, etc. during certain time periods. Oracle EDM now provides the ability to enforce this type of requirement systematically. When the Blockout option is enabled, requests are prevented from being Completed. However, requests can still occur and move through the various stages (i.e. create, submit, enrich, approve) – they are just not applied until the Blockout is lifted.

The Blockout is configured by specifying a start and end date and time.

 

Are you interested in learning more about Perficient’s expertise in Oracle EPM offerings, including EDM? Contact us today!

 

3 Key September Updates for Oracle EPM https://blogs.perficient.com/2021/09/27/3-key-september-updates-for-oracle-epm/ (Tue, 28 Sep 2021)

Oracle’s EPM September release contains three key updates. These three updates impact Smart View users, customers using Data Management, and those using journals within Financial Consolidation and Close. Continue reading below for explanations of each!

 

Smart View – Drill to All Levels in Base

A new option named “Drill to All Levels in Base” within Application Settings permits users to drill down beyond shared members; essentially this allows for drilling into the base hierarchy members beneath the shared member. This is a great enhancement for users who frequently need to drill into the details of shared members. This new option is available for Financial Consolidation and Close, Planning, Planning Modules, and Tax Reporting applications.

Please keep in mind that in order to use the new feature, the Smart View Ad Hoc Behavior option must be set to Standard. Additionally, the multi-member zoom feature is not supported when using the new option.

Enabling the option is simple:

  1. Within the application, click Application and then Settings.
  2. In Allow Drill Down on Shared Members in Ad Hoc, select Yes.
  3. Select the Drill to All Levels in Base check box.
  4. Click Save.

 

Data Management – Export & Import Capabilities Expanded

Customers using Data Management within Account Reconciliation, Financial Consolidation and Close, Planning, Planning Modules, Tax Reporting, and Profitability and Cost Management can now easily back up setup and historical data. The new Export/Import options provide the ability to:

  • Backup historical data within the tables
  • Migrate imported data to test a target instance
  • Include the staging data in their daily backup procedures

The export process performs an export of all data tables to a .CSV file. The data is exported by POV and zipped. The export also includes all setup and data artifacts within Data Management:

  • Application
  • Source System
  • Period Mapping
  • Category Mapping
  • Logic Group
  • Check group
  • Integration
    • Location
    • Import Format (Dimension Maps)
    • Data Rule
    • Data Load Mapping (Member Maps)
  • Batch Definition
  • System Setting
  • Application Setting
  • User Setting
  • Security Setting

The export supports four modes: All, Setup Only, All-Incremental, and Incremental. The incremental modes export only new or changed data, based on the POV, since the last snapshot was exported. The output is stored in outbox/<filename>.zip.

The import process resembles a “restore” type of process — it does not perform any merge operation. The existing data is first cleared in the target application, and then data is imported from the files. Please note the process requires the filename with the full path of the Data Management root folder (for example: inbox/<filename>.zip, or inbox/mybackup/<filename>.zip). If no path is specified, the process assumes the file is in the Data Management root folder.

The new processes can be found under System Maintenance Tasks.

More information on the export/import options can be found here.

 

Financial Consolidation and Close Journal Audit Enhancement

Tracking who inputs data into financial applications is a key best practice for all organizations to follow. Oracle’s FCC offering aligns directly with this concept, regardless of changes to the user’s role or username. Creation, submission, approval, and posting are all journal activities that will no longer lose username information when the “Preserve Journal User Names” option is enabled.

More information on how to enable this feature can be found here.
