Guided Workflow for Complex Planning and Approval Processes at #SplashNOLA (April 29, 2019)

OneStream Splash is just around the corner! We are looking forward to four days of best practices, product updates, networking and hands-on workshops with an estimated 1,200 finance and industry experts during May 14-17 in New Orleans.

I am proud to present at this year’s conference alongside our customer, Herbalife. Don’t miss our breakout session, “Guided Workflow for Complex Planning and Approval Processes,” where we will discuss how Guided Workflow can make your work life easier.

In this session you will learn:

  • Best practices for setting up workflow security with multiple layers of the approval process
  • Best practices for setting up complex workflows using different workflow profile types and workflow names
  • How to design a Guided Workflow process for what-if analysis
  • How to design a Guided Workflow process for top-side adjustments

Come see a firsthand demonstration of the Herbalife Hyperion workflow process overview, along with lessons learned, next steps, and more. Add our session to your agenda today!

Guided Workflow for Complex Planning and Approval Processes
Thursday, May 16 | 4:10 PM – 5:00 PM
Presenters: Neal Yeung & Celeste Qian
Topic: Workflow

Perficient is proud to be a gold sponsor this year at OneStream’s annual User Conference and Partner Summit. Visit us at booth #303 to meet with subject matter experts and thought leaders and learn how we’ve leveraged our extensive expertise in Corporate Performance Management to drive digital transformation for our customers.

Learn more about the other Perficient sessions at this year’s conference. 

 

Working with OneStream XF Workflow Engine (September 24, 2018)

The OneStream XF Workflow Engine is one of my favorite design components in the platform. It gives users a centralized working panel and shields them from the system’s complex technical concepts throughout the entire work cycle.

First of all, workflow tasks are not limited to having users input and review data or execute business rules. The workflow engine also coordinates other tasks such as data governance and data integration, so users don’t need to navigate back and forth between different modules or platforms to perform various tasks. From the OnePlace tab, users can accomplish most of their work.

OneStream XF workflows are configured from workflow profiles, which are the basic building blocks of the workflow structure. For example, consider a workflow with three profiles: the user performs the data import task defined by the system admin to import, validate, and load the Jan-2014 actual data for Accessories, then reviews or inputs data through data forms, and finally creates journal entries.

The required steps for each type of task can be customized as the following examples show:

Import can be configured in the following ways:

  • Import -> Validate -> Load as the above example shows.
  • Import -> Validate -> Load -> Certify
  • Import -> Validate -> Process & Certify
  • Import -> Validate -> Process -> Confirm -> Certify

Forms can be configured in the following ways:

  • Form Input as the above example shows.
  • Form Input -> Certify
  • Form Input -> Process -> Certify
  • Form Input -> Process -> Confirm -> Certify

Journals can be configured in the following ways:

  • Journal Input
  • Journal Input -> Certify
  • Journal Input -> Process & Certify
  • Journal Input -> Process -> Confirm -> Certify

For more details on each of these task types, see the OnePlace Workflow section of OneStream’s XF Reference Guide.

Second, if you need to create many workflow profiles, basic input workflow profiles can be created from workflow templates. This is quick and easy and can save a lot of time from both implementation and maintenance perspectives. All of the data import configurations, such as data source, transformation business rules, forms, and certifications, are applied to the profiles created from the template. Later changes to the template are not applied automatically to existing base input workflow profiles built from it, but you can push the changes out to those workflows.

Last but not least, workflows are associated with scenarios and entities, which makes auditing easier for system administrators and navigation more straightforward for users who have multiple workflows assigned.

To audit workflow status, right-click the workflow and select “Status & Assign Entities.” Workflow status statistics such as last activity time, assigned entities, dependents, percent complete, and percent of tasks not started can all be viewed there, which helps system administrators manage and track workflow tasks and multi-period processes.

 

More demos and advanced content can be found on the OneStream Academy site, which provides up-to-date learning materials and instructions for both OneStream XF and the XF MarketPlace Solutions.

In my next blog, I will introduce OneStream deployment options and XF MarketPlace Solutions.

 

 

 

Comparing Data Integration between OneStream and Oracle EPM Cloud (September 18, 2018)

Everyone in the industry is talking about the differences between Oracle Enterprise Performance Management Cloud and OneStream XF – the latter created by the inventors of Oracle Hyperion Financial Management and Oracle Hyperion Financial Data Quality Management (formerly Upstream Software). This blog post will address the data integration options of each solution.

Oracle EPM Cloud Data Management

Oracle EPM Cloud Data Management currently supports direct connections with Oracle HCM Cloud, Oracle Financials Cloud, and NetSuite. Any other external source system can be integrated via flat files, with drill-back to the source system through a defined URL.

It also supports data integration between the Oracle EPM Cloud products, such as Oracle Enterprise Planning and Budgeting Cloud Service (EPBCS), Oracle Financial Consolidation and Close Cloud Service (FCCS), and Oracle Account Reconciliation Cloud Service (ARCS).

The on-premises solution, Financial Data Quality Management Enterprise Edition (FDMEE), supports more external source system types, including Oracle E-Business Suite, PeopleSoft, Fusion Applications, SAP, and JD Edwards.

For customers on Oracle EPM Cloud that need a direct connection to external data sources not supported by Data Management, implementing on-premises FDMEE could be an option.

 

OneStream XF

OneStream XF supports four kinds of data sources: Fixed Width File, Delimited File, Connector, and Data Management Export Sequence.

A Fixed Width File could be a fixed-format financial report spooled out to a text file; the data does not need to be arranged in proper columns the way a delimited data file is.

A Data Management Export Sequence exports data from one data unit/cube to another through workflow within the OneStream XF application.

A Connector can be configured to connect directly to external source systems through common database methods or through web services. A connector business rule, written in VB.Net, establishes the connection to the source system, builds the data result sets, and drills back to the external source system. It can be configured with a Database Provider (e.g., SQL Server, Oracle DB), Web Services (REST, SOAP), or OneStream Data Blending.

Customers can implement OneStream XF either cloud-based or on-premises and configure direct connections to other cloud or on-premises systems such as cloud ERP, subsystems, data warehouses, and HR systems. For instance, if you would like to see SAP sub-ledger data, you can build a connector business rule that establishes a direct connection with drill-back to the transactions in SAP under either deployment method.

Image source: https://videos.onestreamsoftware.com/comparing-onestream-xf-to-oracle-planning-and

From a navigation and workflow perspective, OneStream XF is different from Oracle EPM, though the data load steps are similar: import and transform, validate, export/load.

The data load process in OneStream XF is embedded in the workflow.

 

 

 

The data load process in Oracle EPM Cloud, by contrast, is performed through Navigator -> Data Management.

In my next blog, I will compare the workflow between OneStream XF and Oracle EPM Cloud.

 

Using EPM File Transfer Utility and Automate Utility (May 6, 2015)


One of the most common questions we get from customers considering a move to PBCS is how interfaces will work. For example:

  • Can the Cloud Service be deployed with another on-premises application?
  • How can the Oracle PBCS interfaces be automated?

The answers to these questions lie within the EPM File Transfer Utility and the EPM Automate Utility.

The EPM File Transfer Utility and the EPM Automate Utility use secure HTTP connections to communicate with the Oracle Planning and Budgeting Cloud Service instance. The EPM File Transfer Utility is an efficient tool for migrating artifacts between the on-premises environment and the Cloud Service environment without using the browser. The EPM Automate Utility enables Service Administrators to remotely perform administrative tasks; administrators can create scripts that complete a wide array of tasks and automate their execution using Windows Scheduler.
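For example, a Windows batch file along the following lines could be registered with Windows Scheduler to refresh the Planning database every night. This is a minimal sketch: the user name, password, URL, and identity domain are placeholders, and the exact command syntax can vary by EPM Automate release.

    rem nightly_refresh.bat -- assumes the EPM Automate Utility is installed and on the PATH
    call epmautomate login serviceadmin MyPassword https://test-cloud.pbcs.us1.oraclecloud.com testdomain
    call epmautomate refreshcube
    call epmautomate logout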

In this blog, I will walk through the basic features and usage of the EPM File Transfer Utility and the EPM Automate Utility.

EPM File Transfer Utility:

  • Install the File Transfer Utility:

Log into PBCS Workspace and navigate to Tools, then Install, and select File Transfer Utility.


 

Click Save File to save the EPMCopy.exe file locally.


 

  • The File Transfer Utility parameters are:

USERNAME: An Oracle PBCS user name provisioned with the Service Administrator role.

IDENTITY_DOMAIN: The identity domain of the service administrator.

FROM: A location within Oracle PBCS or a local/network drive.

Examples:

To migrate files from Application Management (LCM):
    https://test-cloud.pbcs.us1.oraclecloud.com/files/lcm/filename

To migrate files from Data Management (FDMEE):
    https://test-cloud.pbcs.us1.oraclecloud.com/files/inbox/filename

TO: A location within Oracle PBCS or a local/network drive.

LOADDATA: This parameter instructs the Data Management (FDMEE) to execute an existing data load rule.

  • Rulename: Name of the data load rule to be executed.
  • Startperiod: Start period name.
  • Endperiod: End period name.
  • Importmode: Two import modes are supported: APPEND and REPLACE. With APPEND mode, data will be added to the target period(s). With REPLACE mode, data will be replaced in the target period(s).
  • Exportmode: Four export modes are supported: STORE_DATA, ADD_DATA, SUBTRACT_DATA, and OVERRIDE_ALL_DATA.

 

 

  • Prerequisites and Notes for Using the LOADDATA Parameter
    • The data load rule and the start-period and end-period mappings must already exist in Data Management.
    • The data load rule should not be associated with a data file in Data Management; otherwise, the uploaded file will only be placed in the Inbox folder and the previously associated data file will be exported to the Planning application.
    • The File Transfer Utility does not use translated values. Parameters and their values must be entered in ASCII characters.
    • Importmode and Exportmode values must be specified in uppercase letters.
    • Values of FROM and TO should not contain spaces, although the names of data load rules may contain spaces.

 

  • File Transfer Example: The following example uploads the data file 201502.txt into the Data Management Inbox folder and exports the data into the Planning application.
    • Open a command prompt and navigate to the directory of the EPM File Transfer Utility.
    • Enter the values of USERNAME, PASSWORD, IDENTITY_DOMAIN, FROM, TO, and LOADDATA (a command sketch is shown after this list).
    • The data load status can be checked in Data Management (FDMEE).
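An illustrative invocation of the File Transfer Utility for this example is shown below, broken onto several lines for readability (it is entered as a single command). The credentials, identity domain, local file path, and data load rule name are placeholders, and the exact argument format – in particular how the LOADDATA sub-parameters are delimited – should be verified against the utility's documentation for your release.

    EPMCopy.exe USERNAME=serviceadmin PASSWORD=MyPassword IDENTITY_DOMAIN=testdomain
        FROM=C:\PBCS\201502.txt
        TO=https://test-cloud.pbcs.us1.oraclecloud.com/files/inbox/201502.txt
        LOADDATA="Rulename=ActualsLoad;Startperiod=Feb-15;Endperiod=Feb-15;Importmode=REPLACE;Exportmode=STORE_DATA"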



 

 

EPM Automate Utility

The EPM Automate Utility enables administrators to complete all administrative tasks except executing a data load rule (the File Transfer Utility's LOADDATA option) and uploading data files into the Data Management Inbox folder; for those two tasks, use the EPM File Transfer Utility.

  • Prerequisites:
    • Planning jobs are required for most EPM Automate Utility commands. Administrators must create the appropriate jobs before performing the following operations:
      • Import/export data into/from the Planning application
      • Migrate data between a block storage database and an aggregate storage database
    • Business rules to be executed must exist in the Planning application.

 

  • Supported Administrative Tasks:
    • Import and export metadata
    • Import and export Planning application data
    • Refresh the Database
    • Launch Business Rule(s)
    • Copy data from one database to another database
    • Upload files into the Oracle PBCS repository
    • Download/Delete files from Oracle PBCS repository
    • Export and import application and artifact snapshots using Application Management
    • List the files in the Oracle PBCS repository

 

  • Example 1: The following example shows the steps for exporting metadata from a Planning application.

Create a job to export the Account dimension:

Log into the PBCS Simplified Interface and navigate to Console and click Export.


Click Create to create a new job.


Select the target dimension and choose the Planning Outbox option, then click Save as Job. Enter the Job name (e.g. “ExportAccountDim”).


 


Install the EPM Automate Utility.


Click Start, then All Programs, then EPM Automate. Optionally, navigate to a specific directory.

Sign in to the EPM Automate Utility.


Export the metadata of the Account dimension as Account.zip using job ExportAccountDim.

Use the "listfiles" and "downloadfile" commands to list the contents of the PBCS repository and download the export file to a local machine. A command sketch for these steps is shown below.

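A sketch of this command sequence looks like the following; the user name, password, URL, and identity domain are placeholders, and the exact syntax may vary by EPM Automate release.

    epmautomate login serviceadmin MyPassword https://test-cloud.pbcs.us1.oraclecloud.com testdomain
    epmautomate exportmetadata ExportAccountDim Account.zip
    epmautomate listfiles
    epmautomate downloadfile Account.zip
    epmautomate logout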

 

  • Example 2: The following example shows the steps for importing a metadata file into the Planning application. It assumes that a CSV file labeled Account.csv has already been created.

Create a job to import the Account dimension.

Log into the PBCS Simplified Interface and navigate to Console and click Import.


 

Click Create to create a new job.


Enter the CSV file name, select the Planning Inbox folder option, and then click Save as Job. Enter the job name "ImportAccountDim".

 

Upload the metadata file Account.csv into the PBCS repository and load it into the Planning application using the ImportAccountDim job; this updates the layout of the Account dimension. A command sketch is shown below.

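Assuming you are already signed in as in Example 1, the upload and import steps look roughly like this (the local file path is a placeholder, and the exact syntax may vary by EPM Automate release):

    epmautomate uploadfile C:\PBCS\Account.csv
    epmautomate importmetadata ImportAccountDim Account.csv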

The job status can be checked in the Job Console in PBCS.

Migrating On-Premises Planning Application to PBCS (February 19, 2015)


 

Are you currently thinking about migrating your existing on-premises Hyperion Planning application to the cloud? Are you having reservations about what this process may entail? Well, put your reservations aside! Oracle Planning and Budgeting Cloud Service (PBCS) makes the migration easy, as it essentially has the same code base as on-premises Hyperion Planning release 11.1.2.3. In this blog, I will give a brief introduction to the basic steps of using PBCS Application Management (LCM) to migrate an on-premises Planning application to the Oracle Planning and Budgeting Cloud Service.

Prerequisites:

  1. The first release of PBCS supports an upgrade path from R 11.1.2.1 directly into a cloud-based application. The immediate roadmap includes adding support for upgrades from R 11.1.2.2 and then R 11.1.2.3 to PBCS.
  2. Ensure that the current on-premises Planning application is stable and without any cube refresh errors or invalid rules.
  3. The Service Administrator role is required in PBCS in order to perform the migration.
  4. The LCM Administrator role and Administrator role are required in the on-premises EPM system.
  5. The target application name in PBCS should be the same as the on-premises application name.
  6. Ensure that no application currently exists in PBCS.

Artifacts Not Supported:

  1. Shared Service custom roles
  2. Reporting and Analysis Annotation and Batch Jobs
  3. Essbase global substitution variables. Global substitution variables have to be converted into application-specific variables before migration
  4. Workspace Pages and Personal Pages
  5. Essbase report scripts and rules (.rul) files

 

Migration Steps:

  • Migrate the Security Model:
    • Identify on-premises EPM System users and groups. Generate a provisioning report that lists the users and groups provisioned for the on-premises Planning application being migrated.
    • Use the provisioning report to identify users who should be allowed access to the service. Use a text editor to create a comma-separated user upload file, users.csv, in the upload format required by Oracle Cloud My Services.
    • Log into the on-premises Planning application Shared Services as the Administrator and export the Native Directory as groups.csv.
    • Use a text editor to open the groups.csv file and delete information for groups that are not used to control access to Foundation Services, Planning, Enterprise Resource Planning Integrator, and Reporting and Analysis artifacts.
    • Add information pertaining to the external groups that are used to grant access to Oracle Hyperion Foundation Services, Planning, Enterprise Resource Planning Integrator, and Reporting and Analysis artifacts.
    • Use a text editor to create comma-separated files for those users being assigned to each PBCS role: viewers.csv, planners.csv, powerusers.csv, admins.csv.
    • Log into Oracle Cloud My Services and import the following files: users.csv, viewers.csv, planners.csv, powerusers.csv and admins.csv.
    • Log into Oracle Planning and Budgeting Cloud Service Application Management to import the groups.csv file.
  • Export Artifacts from the On-Premises Deployment
    • Launch the Shared Service Console in the on-premises deployment and export the following artifacts:
      • Foundation – Calculation Manager
      • Planning – Planning application except Global Artifacts – Report Mappings
      • Reporting and Analysis – Repository Objects:
        • All Financial Reporting objects associated with the Planning application (Snapshot Report and Snapshot Book do not need to be associated with an application)
        • Any third-party content
        • HRInternal – DataSources
        • UserPOVs for the users that were migrated as part of the security model migration in HRInternal – UserPOV
      • Security
    • Define the migration and specify the folders for the data set. Clear the Export with Job Output export option and execute the migration. NOTE: The PBCS File Transfer Utility supports migrating artifacts between an on-premises environment and a PBCS environment without using the browser (see the command sketch after these steps).
  • Export Data (Optional)
    • Export data from the EAS console as needed (R 11.1.2.1.x). Use administration services to perform the export.
  • Zip the Exported Artifacts and Upload the Zip Files to the Planning and Budgeting Cloud Service Workspace
    • Navigate to the middleware_home/user_projects/epmsystem1/import_export/admin@native directory on the Foundation Services machine in the on-premises deployment.
    • Right-click the export folder, choose your zip utility, and then select Add to Archive.
    • In the Add to Archive dialog box, right-click the selected folders and set the following information:
      • Change the name of the archive to OnPremisesApplications.
      • In the Archive Format field, select Zip.
      • In the Parameters field, enter cu=on.
    • Log into PBCS and navigate to Administer, and then Application Management. Right-click Application Snapshots and select Upload. Browse to the folder containing the zip file and click Finish.
  • Import Artifacts to Oracle Planning and Budgeting Cloud Service:
    • In PBCS, navigate to Administer, then Application and expand Application Snapshots.
    • Import all products and artifacts in the following order: Reporting and Analysis, Planning, Calculation Manager.
    • If the Migration Status Report shows an error that the Exchange Rate artifacts failed to import, re-import Global Artifacts – Exchange Rate.
  • Manually Migrate Essbase Data to Oracle Planning and Budgeting Cloud Service (R 11.1.2.1.x)
  • Manually Define Business Rules Security if the Calculation Manager Module was used in release 11.1.2.1.x:
    • Start the Administration Service Console and expand Business Rules, then Repository View, and then Rules.
    • Map the access privilege for each business rule to the equivalent assignment in PBCS.
    • Log into PBCS and navigate to Administration, then Manage, and then Business Rule Security.
    • Select each business rule that was migrated and manually assign its user/group privileges.
  • Check the Migration Status Reports and Validate the Application.
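As noted in the export step above, the PBCS File Transfer Utility can be used instead of the browser to move the zipped export to the service. An illustrative upload is shown below, broken onto several lines for readability (it is entered as a single command). The credentials, identity domain, and local path are placeholders, and the exact argument format and target URL should be verified against the File Transfer Utility documentation for your release.

    EPMCopy.exe USERNAME=serviceadmin PASSWORD=MyPassword IDENTITY_DOMAIN=testdomain
        FROM=C:\PBCS\OnPremisesApplications.zip
        TO=https://test-cloud.pbcs.us1.oraclecloud.com/files/lcm/OnPremisesApplications.zip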
What’s New for Oracle Hyperion DRM 11.1.2.4 – PBCS Integration (February 12, 2015)

With the recent release of EPM suite 11.1.2.4, it’s no big surprise that Data Relationship Management (DRM) becomes an integration tool for Planning and Budgeting Cloud Service (PBCS) as Oracle continues to push cloud computing. Hierarchies, nodes, and properties can now be shared between DRM and PBCS using either the web interface or the EPM Automate Utility, which enables users to perform import/export tasks remotely.

Some of my clients who have deployed their planning systems on PBCS use the EPM Automate Utility to automate common administrative tasks. From a master data management perspective, I am excited to see a DRM release that lets both on-premises and cloud-based applications share the dimensionality maintained in DRM.

To automate the DRM-to-PBCS integration, Planning jobs must be created before using the EPM Automate Utility commands; this is a one-time activity performed by administrators. Jobs are actions such as importing/exporting metadata, importing/exporting data, launching business rules, and refreshing the database. Jobs can be started immediately or scheduled to run periodically.

The following example shows the steps required to import metadata from DRM into PBCS. It assumes that a file labeled Account.zip has already been exported from DRM in the required import format.

NOTE: The Account.zip file must be located in the directory from which the upload command is run.

When populating DRM from a PBCS application, hierarchies, nodes, and properties are imported/exported in .CSV format. The upload command places the file in the inbox/outbox before the job is executed; a command sketch of the import example is shown below.

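A sketch of the import sequence looks like the following; the credentials, URL, identity domain, and the job name ImportFromDRM are placeholders, and the exact syntax may vary by EPM Automate release.

    epmautomate login serviceadmin MyPassword https://test-cloud.pbcs.us1.oraclecloud.com testdomain
    epmautomate uploadfile Account.zip
    epmautomate importmetadata ImportFromDRM Account.zip
    epmautomate logout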

NOTE:

  • Job names are case-sensitive.
  • Double quotes are required around the file name if it contains spaces.

The PBCS Simplified Interface offers an easy way to perform these DRM-to-PBCS tasks through the web interface, as well as to create the administrative jobs.

Metadata import and export jobs can also be created from the centralized administration console.


Jobs can be run immediately or scheduled periodically from the administration console. For more details, please head over to Using the PBCS Simplified Interface.


In my next blog, I will discuss the enhancements to the Data Relationship Governance (DRG) module in the 11.1.2.4 release. Thank you!
