Developing Custom Report in ARCS

Currently, the Transaction Matching component has no report or view that shows the actual matched transactions within a given match type. The only output users can view is a report showing that a transaction matched and its corresponding Match ID. To provide visibility into which two transactions were matched, we created a custom report that can be exported to Excel. The export shows the matched transactions as well as any unmatched transactions. This view is extremely useful, as users can also use the export to support their reconciliations.

The illustration below is a segment of the report output to Excel. Column AE (highlighted in blue) shows the Match ID, and rows that display a value (e.g., row 23) show the transactions that were matched from the two sources. In the example below, row 23 displays the detail from Source One that matched the detail from Source Two. If no Match ID is displayed, the transaction has not yet been matched.

[Screenshot: report output exported to Excel, with the Match ID in column AE]

The process to create the custom report is two-fold: first a query is created to generate the values, then a report is created that references the query.

To create the query:

  1. From the Home page, navigate to Application > Report Configuration.
  2. Click the + icon to add a new query.
  3. Enter the query Name and an optional Description.
  4. From the Type drop-down list, select Report Query.

 

 

[Screenshot: New Query dialog]

 

5. Click the Generate Query button.

The Generate Query dialog box displays.

6. From the Module drop-down list, select Transaction Matching.

7. From the Query drop-down list, select the applicable Transaction Matching data source to represent source one.

8. The Apply Security check box can be left unchecked. Click the Next icon.

A list of available columns from the selected data source displays.

9. Select the columns to be displayed in the report. HINT: You can select all columns by clicking the Add All icon (>>) or select columns individually and add them to the Selected list by clicking the Add button (>).

10. Click the Next icon to select Filters.

11. If Filters are required, click the Create Condition button and make the appropriate selections. Otherwise, click OK to return to the New Query dialog box.

The query is generated automatically and displays in the Query section.

12. Click the Validate button to ensure the query is valid.

13. Copy the entire SQL query that was generated to Notepad to be used later.

14. Repeat steps 1-13 to create a query for the second data source.

15. In Notepad, paste the rows of the first query below the second query to create a third query. This is the query that will be used to build the report.

16. Make note of the dynamic data source table TM_TRANS_&lt;NUMBER&gt;. In the example below, the data source table for the first query is TM_TRANS_2001 (aliased TM_TRANS_2001EO), and the data source table for the second query is TM_TRANS_2002 (aliased TM_TRANS_2002EO).

17. In the third query, create the outer join to join the two tables in a single query.

Below is pseudo code for the query; the (+) marks an Oracle-style outer join, so transactions from the first source appear even when the second source has no matching row:

SELECT a.*, b.*

FROM TM_TRANS_3000 a,

     TM_TRANS_4000 b

WHERE a.match_id = b.match_id(+)

AND (additional conditions to filter the required data)

 

[Screenshot: third query joining the two data sources]

18. Return to ARCS to build the third query.

19. Click the + icon to add a new query.

20. Enter the query Name and an optional Description.

21. From the Type drop-down list, select Report Query.

22. Do not click Generate Query; instead, paste the third query created in Notepad into the Query section.

23. Click the Validate button to check the query, then click Save and Close.

 

To create the report:

  1. From the Home page, navigate to Application > Report Configuration.
  2. Select the Report tab from the bottom of the screen.
  3. Click the + sign to create a new report.

The New Report dialog box displays.

  4. Enter a Name.
  5. From the Query drop-down list, select the query you created in the previous steps. NOTE: After selecting the query, the Parameters section is populated with the query’s parameters.
  6. Browse for a saved template. NOTE: You will need to create a BI template with the columns from your query to be used for the report. https://www.youtube.com/watch?v=vuLEv0_EL4s
  7. From the Report Group drop-down list, select Transaction Matching. This will save the report to the existing Transaction Matching tab within Reports.
  8. From the Output Format drop-down list, select the applicable output format (e.g., xlsx).
  9. From the Parameters section, make any applicable changes to the Display Name, Parameter Type, etc.

[Screenshot: New Report dialog with query, template, and parameters]

10. Next, grant access to the report. Click the Access tab.

11. Click the + icon to add access by Role.

12. From the Application Module drop-down list, select Transaction Matching.

13. From the Role drop-down list, select the applicable role level to grant access to the report. HINT: You will need to add a row for each Role type that should have access.

[Screenshot: report Access tab with role assignments]

14. Click Save and Close.

15. After the report is saved successfully, navigate to Navigator > Dashboards > Reports.

16. Run the newly created Transaction Matching report.

EPBCS – Troubleshooting Security Issues in a Smarter Way

EPBCS Security

Setting up EPBCS security may look simple at first, but it gets complicated as the number of users and the levels of security among them grow. This blog does not detail the security setup in EPBCS; rather, it offers some tricks to resolve security issues in a faster and simpler way.

Types of Roles:

EPBCS provides four types of functional roles to control access. Service Administrator, the top-level role, has full access to the environment; Viewer, User, and Power User each carry different levels of access.

In general, security groups are created based on the various functional roles, and users are assigned to those groups based on their roles in the organization.

After go-live, users may have access issues, and sometimes it is difficult to diagnose a user's issue in one round of testing. Also, clients often have SSO for users, and getting passwords from users is not acceptable for testing. Developers either have to sit with the users or download the security file to understand the issue. The steps below explain how to download the security file.

  1. Login to EPBCS Cloud environment.
  2. Navigate to Tools > Access Control.
  3. Click Export. This downloads the security file to the local machine.
  4. Open the file in notepad or any text editor.

The downloaded security file lists each security group and the users assigned to it. Below are some of the common issues raised during testing.

  •  Unable to access the form or data in Smartview.
  •  User wants the same security as their co-worker.

These scenarios are the most common issues raised during testing, and to make the testing process quicker we created a VB Script (code not shared). The script reads the security file, copies the security of the user in question, and re-creates the security file for a test user, which can then be uploaded back into EPBCS.
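The VB Script itself is not shared, but the idea is simple enough to sketch. Below is a minimal Groovy equivalent (Groovy is also the language used in my REST API blogs). It assumes, hypothetically, that the Access Control export is a CSV whose first column is the user name; adjust the column index and file names to match your actual export layout.

*********************************************************
// Minimal sketch: clone one user's security rows for a test user.
// Assumption: the export is a CSV whose first column is the user name.
def sourceUser = 'user1'      // user who reported the issue
def targetUser = 'testuser'   // test user who should get the same security

def inFile  = new File('security_export.csv')   // file exported from Access Control
def outFile = new File('security_testuser.csv') // file to import back into EPBCS

outFile.withWriter { w ->
    inFile.eachLine { line ->
        w.writeLine(line)                         // keep every original row
        def cols = line.split(',')
        if (cols[0].trim().equalsIgnoreCase(sourceUser)) {
            cols[0] = targetUser                  // duplicate the row for the test user
            w.writeLine(cols.join(','))
        }
    }
}
*********************************************************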

Scenario 1: Unable to access the form or data in Smartview.

  1. Login to EPBCS Cloud environment.
  2. Navigate to Tools > Access Control.
  3. Click Export. This downloads the security file to the local machine.
  4. Run the VB script to copy the security of user1 (the user having issues) to the test user testuser.
    • Security file before running the script
    • Security file after running the script
  5. Upload the security file back to EPBCS by navigating to Tools > Access Control > Import.
  6. Connect to Smartview using the Test User.
  7. The test user should now reproduce the issue reported by the user.
  8. Update the required changes to fix the issue for Test User.
  9. Download the security file again and run the VB script to copy the updated security of Test User to the user who reported the access issue.
  10. Upload the security file generated in step 9 to EPBCS by navigating to Tools > Access Control > Import.
  11. Request the user to disconnect and re-test.

Scenario 2: User1 wants the same security as another user2

This is a common scenario: users may not be able to explain their security requirement in EPBCS terms, but will request security similar to their co-worker's.

  1. Login to EPBCS Cloud environment.
  2. Navigate to Tools > Access Control.
  3. Click Export. This downloads the security file to the local machine.
  4. Run the VB script to copy the security of User2 to User1.
  5. Upload the security file generated in step 4 to EPBCS by navigating to Tools > Access Control > Import.
  6. Request the user to disconnect and re-connect to Smartview.

 

 

 

EPBCS – Performance Load Testing Using EPM Automate and HAR

EPBCS performance is another key area we focus on in our Planning projects during UAT. Oracle recommends using HAR files to track the steps executed by users in Smart View or business rules executed from Planning data forms, and recommends installing the Fiddler tool to create these HAR files. EPM Automate has a replay command that can be used to replicate the tasks recorded in the HAR files. This blog covers the basic steps to install Fiddler, create HAR files for user tasks, and use the HAR files for performance testing.

Install Fiddler

Fiddler is a web debugging tool used for performance testing, web session manipulation, HTTP/HTTPS activity recording, and many other features. Our purpose in installing Fiddler is to record the HTTP/HTTPS traffic between the Hyperion user's local machine and the EPBCS cloud application hosted by Oracle. In simple terms, Fiddler records the time taken for the Essbase read/write operation when data is entered in a data form or a business rule is invoked by the user. Oracle gives the installation steps in the Oracle doc How to Use Fiddler to Capture HTTP Traffic (Doc ID 1281065.1).

What is HAR File?

HAR file is an abbreviation of HTTP Archive, which keeps track of all the communication between the client (Hyperion user) and the server (Oracle EPBCS). A HAR file uses the JSON format to record the activities carried out between client and server. Hyperion admins can view a HAR file using any JSON editor, or the files can be viewed graphically on the web. One freely available tool is http://www.softwareishard.com/har/viewer/; drag and drop the HAR file to view it graphically.

Let’s not get sidetracked from the main idea of the blog: Oracle recommends using HAR files to record the network traffic, as EPM Automate prefers the JSON format for processing any request.

Creation of HAR Files

Before creating the HAR files, the Hyperion admin needs to list the tasks for which HAR files have to be created. A few tips based on our experience:

  1. Tasks that are executed in parallel.
  2. Business rules that run for a long time.
  3. User tasks that will be invoked while long-running business rules execute in the background.

Steps :

  1. Ensure all the settings are done based on Oracle Doc ID 1281065.1.
  2. Make sure the steps are identified for each task for which a HAR file has to be generated. Unnecessary steps may give wrong results.
  3. Open the Fiddler application.
  4. Clear all the sessions.
  5. Log into the application from Smart View.
  6. Complete the task to be performed in Smart View.
  7. Go back to Fiddler.
  8. Clear all the non-200s.
  9. Select all the sessions with a 200 result.
  10. Go to File > Export Sessions > Selected Sessions.
  11. Save the file as <<task1>>.har.

After saving the HAR file, view it if required using http://www.softwareishard.com/har/viewer/.
Repeat steps 4-11 to create HAR files for the other tasks that have to be executed in parallel.

Create Replay file in CSV

EPM Automate's replay command executes the processes captured in HAR files. Create a CSV file as shown below.
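The original screenshot of the file is not available. As I recall from the EPM Automate documentation, each row of the replay file contains a user name, the password, and the HAR file to replay; verify the expected columns against the current documentation. A hypothetical example:

                          user1@example.com,Password1,openwfpform1.har
                          user1@example.com,Password1,openwfpform2.har
                          user1@example.com,Password1,openwfpform2.har
                          user1@example.com,Password1,openwfpform2.har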

The format is simple and clear. Note that openwfpform2.har is listed multiple times to invoke the same task multiple times in parallel.

 Call Replay Command

Replay command uses the replay csv file to execute the task in the application.

Command format:

                          epmautomate replay <<replay_file_name.csv>> [duration=N] [trace=true]

                          N – (Optional) Specifies the number of minutes for which the replay command runs.

                          trace – (Optional) Set to true to create trace files in XML format.
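For example, to replay the recorded tasks for 60 minutes with tracing enabled (the file name is a placeholder):

                          epmautomate replay replayfile.csv duration=60 trace=true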

Replay command can be directly executed from command prompt by following the below steps:

  1. Open a command prompt.
  2. Move to the directory <<EPM_Automation_Installed_Folder>>\bin.
  3. Execute the command:
  4. epmautomate replay <<replay_file_name.csv>> [duration=N] [trace=true]
  5. The replay output lists the HAR files executed in parallel and the actions carried out in Smart View.
  6. Execute the replay command multiple times to make sure the execution time is consistent.
  7. In our testing, the same form invoked 5 times in parallel had different execution times.

 

Replay commands mostly help to analyze the load performance on the data forms while business rules are running in the background. The execution times recorded can later be converted to a graph of users vs. time taken for a task, which helps users understand the performance of the tasks in the new cloud environment.

 

Integrate Oracle Database with EPBCS Using REST API

EPBCS uses a lighter version of FDMEE, Data Management, to load external data into the application. Most EPBCS implementations use the file method to load data: the Hyperion admin requests the data in a delimited file with fixed columns. Sometimes the turnaround time for getting the file is longer than expected, or the admin may have to hold the execution of the EPBCS business rules while waiting for these source files. In this blog I use the REST API plus the Groovy language to generate the data file and load the data into the EPBCS application. This method has two benefits:

  • No more waiting time to get the source data file.
  • Changing the format of the file based on the EPBCS import format in Data Management is quicker.

This blog is based on a scenario where GL balances are stored in a source system running on an Oracle database. GL Actuals data needs to be extracted from the source system in a format aligned with the import format defined in FDMEE, and then loaded into EPBCS using FDMEE data load rules.

Groovy

Groovy is an object-oriented programming language that makes it easy to use the REST API Oracle provides for EPBCS. We also have EPM Automate, a standalone set of batch commands that can handle many EPBCS tasks. In this case, however, we need to interact with an external data source, so we have to rely on a scripting language that supports the EPBCS REST API. Below are the high-level steps:

  1. Create a read-only user for the source Oracle database.
  2. Write a Groovy script that:
    • Retrieves data from the source database.
    • Generates the source file.
    • Uploads the file to the Data Management folder.
    • Runs the data rule to load the data.

Readers can go through some of my earlier blogs, which give detailed information about Groovy and its use in invoking the REST API.

REST API – Part 1

REST API – Part 2

Most of the code used in my earlier blogs is reused here. Let's get into it; the Groovy script can be divided into four sections.

Section 1: Import Libraries

  • Import all the required libraries.
  • java.sql.* and groovy.sql.Sql are used to connect to the Oracle database.
  • java.io.* is used to create a file and write the database content to it.
  • org.json.JSONObject and groovy.json.JsonSlurper are used to invoke the EPBCS REST API.

Section 2: Generate the data file

  • Create a database connection to the Oracle server using Sql.newInstance. Groovy supports multiple databases such as MySQL, Oracle, SQL Server, and HSQLDB.
  • CRUD operations on the source database are possible using the java.sql libraries.
  • Make sure the read-only DB user is used for this Groovy script.
  • Create a file using the Java File class; the File object (File1 in the original script) is used to write the data retrieved from Oracle.
  • Retrieve the data and redirect the output to the file. A minimal sketch of this section follows.
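The original code screenshots are not available, so here is a minimal Groovy sketch of this section. The JDBC URL, credentials, table, and column names are hypothetical; substitute your own, keep using the read-only user, and make sure the Oracle JDBC driver jar is on the classpath.

*********************************************************
import groovy.sql.Sql

// Hypothetical read-only connection to the source Oracle database.
def sql = Sql.newInstance(
    'jdbc:oracle:thin:@sourcehost:1521/ORCLPDB',   // assumed JDBC URL
    'ro_user', 'password',
    'oracle.jdbc.OracleDriver')

// File1: the source file to be uploaded to Data Management.
def file1 = new File('export_entity.csv')
file1.text = ''                                    // overwrite any previous extract

// Hypothetical GL balances query; keep the columns aligned with the FDMEE import format.
sql.eachRow('SELECT account, entity, period, amount FROM gl_balances') { row ->
    file1 << "${row.account},${row.entity},${row.period},${row.amount}\n"
}
sql.close()
*********************************************************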

Section 3: Upload the file

  • The upload REST API is used to load the file, but the API has to be called multiple times.
  • The data file has to be divided into chunks and loaded chunk by chunk.
  • The chunk size should not be greater than 50 MB.
  • The upload API is used to upload snapshots and files to the Data Management folders.
  • Oracle has provided some sample scripts that can be used to develop the uploadFile method.
  • extDirPath is the parameter used specifically for uploading files to Data Management folders.

Section 4: Invoke Data Rule

  • Invoking a data rule is similar to invoking a business rule.
  • Start Period and End Period are mandatory parameters that give the period range.
  • Import Mode – mandatory parameter specifying how the data is imported into Data Management:
    • APPEND to add to the existing POV data in Data Management.
    • REPLACE to delete the POV data and replace it with the data from the file.
    • RECALCULATE to skip importing the data, but re-process the data with updated mappings and logic accounts.
    • NONE to skip data import into the Data Management staging table.
  • Export Mode – mandatory parameter specifying how the data is sent from Data Management to EPBCS:
    • STORE_DATA to merge the data in the Data Management staging table with the existing Planning data.
    • ADD_DATA to add the data in the Data Management staging table to Planning.
    • SUBTRACT_DATA to subtract the data in the Data Management staging table from existing Planning data.
    • REPLACE_DATA to clear the POV data and replace it with data in the Data Management staging table. The data is cleared for Scenario, Version, Year, Period, and Entity.
    • NONE to skip data export from Data Management to Planning.
  • Invoking a business rule is covered in my earlier blogs listed above; a minimal sketch of the data rule call follows this list.
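For reference, a minimal sketch of the data rule call, using the Data Management jobs endpoint as described in Oracle's REST API documentation (verify the endpoint and API version against the current docs; the URL, credentials, rule name, periods, and file name below are placeholders):

*********************************************************
import org.json.JSONObject
import javax.xml.bind.DatatypeConverter

// Placeholder connection details.
def serverUrl = 'https://example-test-domain.pbcs.us2.oraclecloud.com'
def auth      = 'Basic ' + DatatypeConverter.printBase64Binary(
                    'domain.user@example.com:password'.bytes)

// Payload for the data load rule (jobType DATARULE).
def payload = new JSONObject()
    .put('jobType', 'DATARULE')
    .put('jobName', 'GL_Actuals_Load')             // hypothetical data rule name
    .put('startPeriod', 'Jan-18')
    .put('endPeriod', 'Jan-18')
    .put('importMode', 'REPLACE')
    .put('exportMode', 'STORE_DATA')
    .put('fileName', 'inbox/EPM/export_entity.csv')
    .toString()

def conn = new URL("${serverUrl}/aif/rest/V1/jobs").openConnection()
conn.requestMethod = 'POST'
conn.doOutput = true
conn.setRequestProperty('Authorization', auth)
conn.setRequestProperty('Content-Type', 'application/json')
conn.outputStream.withWriter { it << payload }

println "HTTP status: ${conn.responseCode}"        // 201 expected on submission
println conn.inputStream.text                      // JSON response with the job status
*********************************************************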

Readers need good knowledge of Groovy and the EPBCS REST API documentation provided by Oracle to apply the method explained in this blog.

Automation of FDMEE Jobs Using EPM Automate in EPBCS

Introduction:

Oracle EPM Cloud has a set of EPM Automate commands used to automate administration processes for EPBCS. Admins can write batch scripts in MS-DOS, PowerShell, or another batch scripting program to call the EPM Automate commands, and the batch scripts can be scheduled using Task Scheduler in Windows. This blog is for admins who have a good understanding of scripting and EPM Automate commands.

Note :

Oracle introduced a new EPM Automate command in the March release to upgrade the existing epmautomate commands. Please make sure EPM Automate is upgraded using the command below.

command: epmautomate upgrade

One of the tasks for EPBCS admins is to load data from the legacy system to Oracle EPBCS Cloud. FDMEE jobs are defined to load the data files into the EPBCS application. FDMEE is very fast at loading data files into Oracle Cloud, but if the job has to be run multiple times a day, this becomes a big pain for admins. This blog shows how to automate the process using EPM Automate commands in EPBCS: launching a rule and generating error records in a readable format for admins.

Readers require good batch scripting knowledge, as the EPM Automate log file needs to be scanned with scripts to derive the process_ID. The process_ID is the unique identifier of the data rule instance submitted in FDMEE, and the log file name is postfixed with the process_ID. Below is the logical flow.

Logical Flow :

  • Set the variables for the inbox and error directories.
  • Log in to the EPBCS application using EPM Automate.
  • Read the data file from the source directory.
  • Upload the data file to the Data Management file folders using the EPM Automate command.
  • Rename the file to ensure the file has been uploaded.
  • Submit a report job using the command runDMReport. Steps to create a batch file that runs the report and derives the process_ID can be found in one of my earlier blogs.

  • Listfiles derives the list of files in the Data Management folders.
  • Derive the last report name from the outbox/reports Data Management folder.

  • Derive the process_ID of the submitted report from the variable %lastline%.
  • Add 1 to the process_ID of the report job to get the process_ID of the data rule that will be submitted to load the GL data.
  • Submit the data rule that loads the file uploaded in the step above.
  • After the job completes, download the process log based on the process_ID derived in the earlier step.
  • Scan the log file to check for user-specific errors. A common FDMEE error is 3303, which occurs when there are missing members in the EPBCS application.
  • Push the file that captures the error details to the error log directory.
  • Delete the old error files so that admins always see the latest error file.
  • The kick-out file flags invalid members; in our example, Account member A_10101 is not a valid member.

FDMEE errors related to 3303 are the most common, but there may be additional errors such as a missing input data file or folder. The batch program has to make sure all errors are captured properly, since the expectation is that admins refer to the error file instead of the log file. Log files are big and may not be easy for EPBCS admins to read. The error file can be opened in Excel for easy analysis.

After the script is tested successfully, Task Scheduler can be used to schedule the batch file based on the admin's requirements. There are many blogs that help with setting up a task in Task Scheduler; one with clear steps can be viewed here.

 

 

EPBCS – Rules Usage Report and Adding Data to Suppressed Rows

Oracle is doing a great job of continuously upgrading PBCS/EPBCS-based applications. As an EPBCS consultant, I always follow the latest updates to make sure we use them in our implementation or support process. This blog covers two recent updates that are really useful.

Below is the link to the latest Oracle EPBCS update release:

http://www.oracle.com/webfolder/technetwork/tutorials/tutorial/cloud/pbcs/2018-pbcs-wn.htm#February_2018_Update

Update 1: Running the Rules Usage report

Most of the time we have trouble finding rules and their associations to artifacts like forms, menus, rulesets, and task lists. The latest Oracle EPBCS update provides an easy solution: running a system Rules Usage report.

How to run?

This blog uses the Simplified User Interface, as the workspace is no longer supported by Oracle.

  1. Navigate to Navigator > Monitor and Explore > System Reports.
  2. Select the Rules Details tab.
  3. Select the Rule Type.
  4. Rule names can be filtered in the Name Filter text box.
  5. Select the Format for the report.
  6. Click on Create Report.

How to read?

The report displays the details of the rules selected before clicking Create Report.

The report can be split into three sections:

Rule Details

  • The first three columns show the rule name, rule type, and cube.

Primary Association

  • Shows the direct association of the rule with Form, Task, Menu Item and Ruleset.

Secondary Association

  • Shows the associations of the artifacts listed under Primary Association, which will be either a task list or a menu.

Update 2: Adding data to suppressed rows in the form

Users can now add data for members that are suppressed in rows due to missing data. Rows and columns are suppressed in data forms to keep the form small so that it loads faster in web browsers and Smart View. In earlier Planning versions, users were given a menu attached to the form that internally invoked a business rule to create a row in the form. With this release, users can add new rows for suppressed members directly. Below are the steps to use this functionality in data forms.

Steps to be followed:

  1. Navigate to Create and Manage > Forms.
  2. Select the form and open it in Edit mode.
  3. Click on Layout.
  4. Make sure the rows are suppressed in the Grid Properties.
  5. Click on the Rows and check the dimension for which the drop-down has to be enabled. All the dimensions selected in the row will be displayed.
  6. Save the form.

Steps to Enter Data

  1. Navigate to the Data form.
  2. Click on the drop-down arrow on the row members.
  3. Select the suppressed member and enter the data.
  4. Submit the data.

Note: The form must have some unsuppressed data to use this functionality.

 

 

 

 

Invoke Business Rule Using EPBCS REST API in Groovy – Part 2

This is a continuation of my special blog series. You can read Part 1 here.

The Groovy script is split into four sections. Part 1 covered a short description of the sections, so here we get directly into the script.

Code Section 1

This section imports the Java libraries that help in formatting the payload and reading the JSON response from the server.

import org.json.JSONObject
import groovy.json.JsonSlurper

The JSONObject class provides the put method, which pushes the parameters into the payload.
JsonSlurper is a class that converts JSON data into Groovy data structures.

Code Section 2

This section sets up the user credentials and other information related to the REST API. Per Oracle, below are the expected formats (a short snippet follows the list):

1. API Version – 3
2. username – domain.username:password
3. authentication – HTTP requests require Basic Authentication: "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary
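Concretely, Section 2 boils down to a few assignments (the domain, user name, and password below are placeholders):

*********************************************************
def apiVersion      = '3'
def userCredentials = 'domain.username:password'   // placeholder credentials
def basicAuth       = 'Basic ' +
    javax.xml.bind.DatatypeConverter.printBase64Binary(userCredentials.bytes)
*********************************************************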

Code Section 3

This section defines the list of common functions already developed by Oracle. Readers should have a basic understanding of Java to follow the Groovy functions. Below are the common functions provided by Oracle and used in our example.

fetchResponse – reads the response and converts it into lines.

fetchJobStatusFromResponse – reads the job status of the submitted business rule.

Below are the custom methods used specifically to call the business rule.

executeRequest
1. Set up the connection using the EPBCS URL.
2. Authenticate the user with credentials.
3. Set the request properties and the request type as POST.
4. Invoke the URL and read the status code.
5. The status code is expected to be 200 for GET and 201 for POST. For any other status code, please refer to the Oracle error codes. A minimal sketch of the method follows.
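The original code screenshots are not available, so here is a minimal sketch of an executeRequest method along those lines (the parameter names are assumptions):

*********************************************************
// Minimal sketch: POST a JSON payload to the given URL with Basic authentication.
def executeRequest(String urlString, String payload, String basicAuth) {
    def conn = new URL(urlString).openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Authorization', basicAuth)
    conn.setRequestProperty('Content-Type', 'application/json')
    conn.outputStream.withWriter { it << payload }

    int status = conn.responseCode
    if (status != 200 && status != 201) {          // 200 for GET, 201 for POST
        println "Unexpected status code: ${status} -- refer to the Oracle error codes"
    }
    return conn.inputStream.text                   // JSON response body
}
*********************************************************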

executeBusRule
1. Format the URL string to be passed.
2. Split the parameters if there is more than one. The program explained in this blog has only one parameter, but the code supports no parameters or more than one.
3. Call executeRequest, explained above.

Code Section 4

This is the last main section, which actually calls the business rule with its parameters; in our example, the Push_Data rule is invoked with an Entity parameter.
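As an illustration, assuming the Planning jobs endpoint from Oracle's REST API documentation and treating the server URL, application name, and helper names from the earlier sections as placeholders, Section 4 boils down to something like:

*********************************************************
import org.json.JSONObject

// serverUrl and basicAuth come from Section 2; executeRequest from Section 3.
def appName = 'PlanApp'                            // placeholder application name
def payload = new JSONObject()
    .put('jobType', 'RULES')
    .put('jobName', 'Push_Data')
    .put('parameters', new JSONObject().put('Entity', 'America'))
    .toString()

def url = "${serverUrl}/HyperionPlanning/rest/v3/applications/${appName}/jobs"
println executeRequest(url, payload, basicAuth)    // submit the rule and print the response
*********************************************************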

 

The code can be reused with a few tweaks in Sections 2 and 4. Validate and run the program in the Groovy console to invoke the business rule. Additional logic for deriving the parameters can be added in Section 4. On successful completion of the business rule, EPBCS sends a notification email.

Invoke Business Rule Using EPBCS REST API in Groovy – Part 1

EPBCS REST API

REST (Representational State Transfer) API is a method of interacting with a server using HTTP requests. The user sends an HTTP request and the server responds in JSON format. Before REST APIs, servers used to send an XML response that was processed on the client side; JSON is very lightweight compared to SOAP.

Step 1
Send the HTTP request to EPBCS with a method (GET, POST, PUT, or DELETE) and parameters.
For instance, if we want to execute a business rule, the method will be POST and the parameters will be the name of the business rule plus its parameters.
Note:
GET and POST are the common methods used by most REST APIs.
PUT is used for setting the daily maintenance window.
DELETE is specific to deleting an application snapshot.

Step 2
Receive a response in JSON format.
Note: Blog readers are expected to have basic knowledge of Java programming and an understanding of REST APIs.

Oracle has provided a REST API that can be used directly to do administrative tasks in Hyperion Cloud. EPM Automate is a good example: it internally calls the REST API to do the administrative tasks. EPM Automate is very easy to use but has limited functionality when it comes to automation scripts. A few REST APIs are specific to FCCS, ARCS, or other EPM Cloud products, but there are a lot of common REST APIs that help in automation. REST API clients can be written in many programming languages, but this blog uses the Groovy language.

Groovy
Groovy is a programming language that supports object-oriented concepts and runs on the Java platform. Groovy is more flexible (in my opinion) for writing automation scripts than batch scripts. For instance, if we need to push metadata from an Oracle database to the cloud, Groovy is the right fit for the job, as it can interact with external data sources. There are a lot of tutorials and blogs on Groovy that can help readers understand the language; I would suggest having a glance at a Groovy tutorial.

Download and Install
1. Let's download and install Groovy to start our first sample program.
2. Download the java-json.jar file (available on the internet). This jar has the libraries for parsing the data sent by the server in JSON.

My First Program
After the successful installation of Groovy, open the Groovy console to execute our first program.
1. Go to Program Files and click on GroovyConsole.

2. I am not a big fan of Hello World or Hello Groovy programs, so please copy the program below into the editor window.
*********************************************************
import org.json.JSONObject
import groovy.json.JsonSlurper

// Build a JSON payload string -- this is how parameters are passed to the REST API.
String myString = new JSONObject().put("PARAM_NAME", "Value to be passed").toString()
println myString

*********************************************************

3. Click on the Execute Button


The above program shows how we will pass parameters to the REST API. After successfully installing Groovy and executing the sample program, we can start building our Groovy program to invoke a business rule with parameters.

For better understanding, I have split the script into four sections. This also helps developers use the first three sections of the code without any change (except the credentials) and update Section 4 based on client requirements. Most of the code explained in Section 3 is taken from Oracle's Common Helper Functions for Groovy.

1. Java classes to be used to read JSON data.
2. Setting variables for user credentials and authentication.
3. Common helper functions for Groovy provided by Oracle.
4. Main script to set the parameters and call the Business Rule.

I know it's a lot of information with some homework to do. My next blog is a continuation of this one, where I explain the four sections in detail and demo how to call the REST API to invoke a business rule.

Updating Substitution Variables Using EPM Automate

EPM Automate

EPM Automate is always one of my favourite topics when it comes to blogging. I don't see EPM Automate as just a bunch of scripts that reduce manual tasks for users and admins; it also helps in troubleshooting, running reports, and much more. This is my fourth blog on EPM Automate, and I still feel we can achieve a lot with this simple and innovative Oracle Hyperion tool. I have also shared some of my other EPM Automate blogs below.

This blog focuses on updating substitution variables using the setsubstvars command. One of the responsibilities of EPBCS/PBCS admins is to update the substitution variables when the forecast cycle begins. We had instances with a lot of sub variables supporting three plan types, and admins had difficulty understanding the sub variables introduced by developers. The scenario below is explained for a PBCS application, but the same applies to any Oracle Hyperion Cloud instance.

Scenario

The PBCS instance we are discussing had three plan types: Finance, Workforce, and Capex. All three plan types had common sub variables and some specific to each plan type. The admin has to update the sub variables before the forecast cycle begins; if any sub variable is given a wrong value, it causes issues, since the business rules and forms are based on the sub variables. The admin may not notice the issue, but users may later find wrong data showing up on the data forms. Once a user finds an issue related to sub variables, it means redoing all the tasks again, and that may add waiting time before users can start the forecast.

Considering the above scenario, we provided a solution using EPM Automate with the setsubstvars command. We created four CSV files containing the list of all the substitution variables for the upcoming four forecast cycles.

Approach

1. Prepare a CSV file for each forecast.
2. The admin can view the CSV file in Excel and update it if needed before running the EPM Automate script.
3. The script creates/updates all the required sub variables for the forecast.

Benefits of this approach

1. The admin need not worry about manually setting up the variables for every forecast in the application.
2. No human error.
3. New variables can be introduced in the forecast files.

Detailed Steps

This blog shows how to create one file, but the same method can be repeated for the other forecast files.

Step 1. Open a workbook in Excel and update the file as described below.

Step 2. The sheet should have 3 columns (see the sample file after these steps):
Col1 – Application or plan type name
Col2 – Sub variable name
Col3 – Value of the variable
Step 3. After updating all the sub variables, save the file in CSV format.
Step 4. Create a EPM Automate script to read the csv file and update the Sub Vars.
a) Enter the login and URL Details.

b) Set the variable for input file location.
c) Login to the application

d) Read the csv file to create/update the sub vars.
e) Call the setsubstvars command.

delims specifies the delimiter
tokens specifies the column numbers

f) logout from the application.
Step 5. Run the script to update the sub variables.

Step 6. Validate the sub variables in Application.
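For reference, here is a hypothetical CSV along the lines described in Step 2 (plan type, variable name, value; the names are made up):

Finance,CurrMonth,Jan
Finance,CurrFcstVer,Fcst1
Workforce,CurrMonth,Jan
Capex,CurrYear,FY18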

Below is the complete script, which can be modified based on the client admin's requirements.

*******************EPM Automate Script to update Sub Var *************************************

REM PBCS Url Details

SET url=https://fastforward-test-perficient.pbcs.us2.oraclecloud.com
SET domain=perficient
SET user=xxx.yyyy@perficient.com
SET password=xxxxx

REM Input file location.

SET Inputfile=C:\Projects\Blog\setvar.csv

REM Login to PBCS Cloud

CALL epmautomate login %user% %password% %url% %domain%

for /f "delims=, tokens=1,2,3" %%A in ('type "%Inputfile%"') do (

epmautomate setsubstvars %%A %%B=%%C
)

CALL epmautomate logout

*******************************************************************************************

Below are some of my other blogs where I have shared some sparkling ideas around EPM Automate.

Other Blogs on EPM Automate

How to Download FDMEE Report Using runDMReport in EPMAutomate
How EPM Automate Can Help in Troubleshooting Issues
Integrate EPM Automate with Hyperion Smartview in Cloud

Dynamic Creation and Deletion of Members in PBCS/EPBCS

Introduction

PBCS/EPBCS has the option of deleting and creating a member in a dimension without doing any refresh of the database. This feature was introduced in later on-premises Planning releases, and the same is available in PBCS/EPBCS. We had instances where users had to wait for sparse dimension members to be added by the admin before they could start entering data; with this functionality, users need not wait for the admin to complete the activity. This blog is based on a PBCS environment and should be the same for other environments, too.

Scenario

The users had a requirement to add members dynamically, as they didn't want to wait for the admin to add the members. This blog explains the roles of Planning users and Planning administrators in adding and deleting the members.

2-Step Process for User
1. Dynamically add the members for Data Entry
2. Submit Data in the form.

4-Step Process for Admin
1. Create a new member under the appropriate parent.
2. Refresh the cube.
3. Move the data from the member created by the user to the new member.
4. Delete the member created by the user.

This blog is based on a PBCS demo application where we have a sparse dimension Project, and the user will create new project members dynamically from Planning forms. We need to do a few setups before enabling this option for the user.

Assumptions
Blog readers are expected to have basic knowledge of creating data forms, Calculation Manager, and menus. The blog focuses on the dynamic creation and deletion of Project members.

Setup

Setup 1: Create a new parent Unassigned in the Project dimension with Enable for Dynamic Children checked.

Setup 2: Set the Number of Possible Dynamic Children to 10.
Setup 3: Refresh the database.

Note
A refresh is required to make sure PBCS creates the required placeholders for the dynamically created members. In PBCS we don't have access to the EAS console, but if you apply similar settings on-premises, you can see the members created in the outline, although they will not be shown in EPMA.

Business Rules
Rule 1 : AddNewProject
This rule is used by the user to dynamically create a new project under the parent Unassigned.
1. Navigate > Administer > Calculation Manager.
2. Go to the Variable Designer.
3. Create a variable {NewProject}.

Properties
Dimension – Projects
Limits – @Relative(“Unassigned”,0)
RTP Text – New Project
Dynamic Parent – “Unassigned”

4. Create a new business rule. Select the Begin node of the rule and check Create Dynamic Members.
5. Click on Global Range tab. Select variable {NewProject} created in step 3.

6. Validate and save the Rule.

Rule 2: Move Project Data
This rule will be used by the admin to move the project data entered for the dynamically created member under the parent Unassigned to a more appropriate parent.

1. Navigate > Administer > Calculation Manager.
2. Goto Variable Designer
3. Create a variable {TargetProject}.
4. Create a new Business Rule.
5. Add a Copy Data node to copy the data from dynamically created member under Unassigned Parent to {TargetProject} member.

6. Validate and save the Rule.

Rule 3: RemoveProject
This rule will be used by the admin to delete the member created by the user under the Unassigned parent.

1. Navigate > Administer > Calculation Manager.
2. Go to the Variable Designer.
3. Create a variable {RemoveProject}.

Properties
Dimension – Projects
Limits – @Relative(“Unassigned”,0)
RTP Text – New Project
Dynamic Parent – “Unassigned”

4. Create a new business rule. Select the Begin node of the rule and check Delete Dynamic Members.
5. Click on Global Range tab. Select variable {RemoveProject} created in step 3.

6. Validate and save the Rule.

Rule-set – Delete Project

Create a rule set combining Rule 2 and Rule 3 so that both rules can be executed serially from the data form.

Data Forms
Form 1: Add new project – User
Form created for the user to create the new project dynamically and enter the data. Add a menu that invokes the business rule AddNewProject.

Form 2: Delete Project – Admin
Form created for admins to push the project data from the Unassigned parent to the new parent and delete the member created by the user. Add a menu that invokes the business rule-set Delete Project.

After the above steps are completed and tested in the test instance, the admin needs to assign proper security to the forms and business rules.

Now, coming back to the objective of the blog: the user will create a new project member Project1_Direct under the parent Unassigned.

1. Open the data form Add new project – User
2. Call the business rule AddNewProject from menu.

3. New project member Project1_Direct gets created under Unassigned.

4. User can start entering the data in the form.

After the user has entered the data, the admin performs the following tasks:

1. Create the new project member Project1 under parent Direct.
2. Refresh the cube.
3. Open the data form Delete Project – Admin.
4. Execute the business rule-set Delete Project.

5. Project1_Direct gets deleted and its data moves to the Project1 member.

The naming convention for the dynamic member was MemberName_ParentName, used on purpose as it helps admins know where the new member Project1 needs to be placed.

Some of the basic steps, like assigning menus to the form and creating the rule-set, were intentionally not explained in detail, as the objective of this blog is the dynamic creation and deletion of members in PBCS/EPBCS.

We also have an Enable Notification option in the business rules, which can be checked to send mail to the user or admin after the business rule completes. The user/admin needs to make sure an email address is specified in the user preferences.

How to Download FDMEE Report Using runDMReport in EPM Automate

Introduction
PBCS/EPBCS has some built-in FDMEE reports that can be executed by the user. Among the most common are the Process Monitor reports, which can be executed on demand from Data Management > Other > Report Execution.

The FDMEE report output can be PDF, Excel (XLSX), or HTML, and the format is chosen before running the report from Data Management.

The FDMEE report output run on demand from Data Management is not stored on the server. The EPM Automate utility provides a command that generates the FDMEE report and stores it in /Outbox/reports/Process_Id.PDF.

EPM Automate has commands to download files from /Outbox/reports, but the issue is that the file number is generated dynamically and the file name is process_id.PDF.

Consider a scenario in which the user wants to run a data rule and needs to know the status of the load without logging into PBCS/EPBCS. EPM Automate has commands to run the data rule and download the status report, but downloading the PDF file without knowing the latest file name makes EPM Automate a little tricky to use.

Can we automate the manual task of downloading the file, considering the file name is generated dynamically? Yes!

Below is the list of EPM Automate commands to get the required task done:

1. rundatarule – runs the FDMEE data rule
2. ListFiles – lists the files in the FDMEE folders
3. Downloadfile – downloads a file from the FDMEE folders
4. runDMreport – runs the FDMEE reports

Data Management provides an easy option to generate the script for runDMReport.

  1. Navigate to Data Management > Workflow > Report Execution.
  2. Select the request group Process Monitor Reports.
  3. Select the report and click on Create Report Script.
  4. Enter the Category, Accounting Period, and Location.
  5. Click OK.

The EPM Automate script shown below can be used directly with a couple of tweaks to the login credentials and FDMEE folders. The script can be split into four steps:

1. Run the data load rule
2. Run the FDMEE Data report
3. Generate the list of files and get the latest file.
4. Download the file.

***********************************************************************************************************************

SET url=https://fastforward-test-perficient.pbcs.us2.oraclecloud.com
SET domain=perficient
SET user=don.ford@perficient.com
SET password=xxxxxxxxxxxxx

REM Log in to the test URL

CALL epmautomate login %user% %password% %url% %domain%

REM Step 1: Execute the data load rule

CALL epmautomate rundatarule Budget_load Jan-18 Jan-18 REPLACE REPLACE_DATA export_entity.csv

REM Step 2: Execute the FDMEE Process Monitor report

CALL epmautomate runDMreport "Process Monitor (Cat, Per)" "Category=Budget" "Accounting Period=Jan-18" "Location=EPM" "Report Output Format=PDF"

REM Delete old files

del C:\Projects\Blog\ListFiles.csv
del C:\Projects\Blog\l1.csv

REM Step 3: List out the files

CALL epmautomate listfiles > "C:\Projects\Blog\ListFiles.csv"

REM Keep only the report rows from the csv file

findstr /i "outbox/reports/" "C:\Projects\Blog\ListFiles.csv" > "C:\Projects\Blog\l1.csv"

REM Get the last file generated by runDMReport

for /f "delims==" %%a in (C:\Projects\Blog\l1.csv) do set lastline=%%a

REM Step 4: Download the file

CALL epmautomate downloadfile %lastline%

CALL epmautomate logout

***********************************************************************************************************************

Run the batch script in Admin mode to run the data rule and download the Process Monitor report.

After successful completion of the batch script, go to Data Management > Process Details to make sure the data rule and FDMEE report runs were successful.

The PDF file is downloaded to your local machine and can be opened using Adobe or any PDF reader.

How EPM Automate Can Help in Troubleshooting Issues

Introduction

EPM Automate plays a vital role in automating many admin and user tasks in Oracle Hyperion Cloud. This blog post demonstrates how EPM Automate can help in troubleshooting issues during the support phase. This method is useful when we have more than one instance, such as Development and SIT/UAT/Prod. The following explains the scenario and provides the complete automation script, which can be used directly in similar instances.

Scenario

The following approach is based on the assumption that there is a development instance and a UAT instance where a user reports issues. The scenario below uses the America entity from the entity dimension.

During UAT/Prod testing we may have issues reported by various users, which may be data issues or business rule issues. Consider a user who supports the America entity and has an issue with a business rule; the following operations cannot be performed in the instance where the user reported the issue:

1. Updating data directly in the forms.
2. Running any business rule, as it may update the user's data.
3. Taking a level-zero backup and updating the development environment.

To elaborate further on point #3: we cannot just copy the level-zero backup data from UAT/Prod to the development instance, as it may affect other developers' data.

How do we then resolve this problem in a way that keeps the user comfortable?

We have two prerequisites to implement this solution. They will not be covered in extensive detail, but well enough to convey the idea behind the solution.

Below are the two prerequisites:

Prerequisite-1: Business Rule (User Reported Instance)

Create a business rule that accepts Entity and Month as run-time prompts. The business rule extracts the level-zero data from the instance where the user reported the issue and stores the file in the inbox FDMEE folders.

**************************************************************************************************************************

SET DataExportOptions
{
DATAEXPORTCOLFORMAT ON;
DATAEXPORTDIMHEADER ON;
DATAEXPORTDRYRUN OFF;
DataExportRelationalFile OFF;
DataExportNonExistingBlocks OFF;
DataExportLevel ALL;
DATAEXPORTCOLHEADER "Period";
DATAEXPORTOVERWRITEFILE ON;
DataExportDynamicCalc OFF;
DataExportDecimal 2;
};

FIX (@Relative("Account",0),{Entity},{Month},@Relative("Scenario",0),@Relative("Version",0),"FY18",@Relative("Projects",0))
DATAEXPORT "File" "," "/u03/inbox/inbox/EPM/export_entity.csv" "#";
ENDFIX

**************************************************************************************************************************
Prerequisite-2: FDMEE Data Rule (Development Instance)

Create a data rule in FDMEE to load the extract file into the development instance. Below are the steps to create the data rule:

1. Create an Import format based on the Extract file created in Prerequisite-1.

2. Create data mappings to map the source and target dimensions.

3. Create a data rule to load the data into the development instance.

After setting up the prerequisites, it's time to write the automation batch script. Below are the steps to be written in the batch script:

1. Enter the dimension information to be passed to the Extract business rule.
2. Run the business rule to extract the data for the specific entity.
3. Download the data file from the UAT/Prod instance.
4. Upload the data file to the development instance.
5. Run the data rule to load the data.

The following script is completely tested and works well in our PBCS environment. The steps above are marked in the comment sections of the EPM Automate script below for better understanding.

EPM Automate Script
**************************************************************************************************************************
REM SET variables for development and UAT Instance.

SET p_url=https://fastforward-perficient.pbcs.us2.oraclecloud.com
SET p_domain=perficient
SET p_user=don.ford@perficient.com
SET p_password=xxxxx

SET d_url=https://fastforward-test-perficient.pbcs.us2.oraclecloud.com
SET d_domain=perficient
SET d_user=don.ford@perficient.com
SET d_password=xxxxxx

CALL epmautomate login %p_user% %p_password% %p_url% %p_domain%

REM Get the parameters for the dimension – Step 1

SET /P Entity=Enter Entity:

SET /P Month=Enter Month:

REM Call the Extract data rule to get the data of the Entity for the specific month. – Step 2

CALL epmautomate runbusinessrule Extract Entity=%Entity% Month=%Month%

REM Download the file – Step 3

CALL epmautomate downloadfile inbox/EPM/export_entity.csv

CALL epmautomate logout

REM Login to Development environment

CALL epmautomate login %d_user% %d_password% %d_url% %d_domain%

REM Upload the file – Step 4

CALL epmautomate uploadfile export_entity.csv inbox/EPM

REM Call the Data load rule to copy the data to Development Environment – Step 5

CALL epmautomate rundatarule Budget_load %Month% %Month% REPLACE REPLACE_DATA export_entity.csv

CALL epmautomate logout
**************************************************************************************************************************

The above script may require a few modifications based on the dimensions and FDMEE data folders. Save the script as a *.bat file on your local machine and run it in Admin mode.

Steps to run the script

1. Save the EPM Automate script *.bat file.
2. Run the script in Admin mode.

3. Enter the Entity America and Month Jan.

4. The script completes successfully.

Data specific to the entity America gets copied to the development instance. The developer can start working on the issue the user is facing in the UAT/Prod instance.

Key Benefits

1. Developers can test the business rule or data issue in development with the same data with which the user reported the issue.
2. The user can see the results before the changes are pushed to UAT/Prod.
3. No need to overwrite all level-zero data in any instance.
