Both OneStream and Oracle EPM Cloud provide batch functionality for loading data files. The common ground is that both require a specific file-name format; the rest of the process differs.
For OneStream, here are the steps:
- Create a batch processing Extender Business Rule. Batch file processing is executed by creating an Extender Business Rule that calls the OneStream API function Utilities.ExecuteFileHarvestBatch. This function also accepts switches that control the level of Workflow processing execution.
- Create a Data Management sequence. Batch file processing is executed by creating a Business Rule Data Management Step that calls the Extender Business Rule created in step one.
- Create the data file with this name format: ID-ProfileName-ScenarioName-TimeName-LoadMethod.txt
- Load the files to the Batch\Harvest folder. The access path is: System Tab|File Explorer|File Share|Applications|Application Name|Batch|Harvest
- Execute the batch. Once the files have been copied to the Harvest folder, execute the Data Management Sequence created in step two and the files will be processed.
- Schedule batch processing. Batch file processing can be run with the Windows Task Scheduler or any other scheduling tool an organization may use. This is accomplished by creating a PowerShell script that executes a batch processing Data Management Sequence when called from the chosen scheduling tool.
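As a rough sketch of the file-name convention above, the snippet below assembles a harvest file name and its target path. All values here (the GolfStream application name, the Houston profile, the scenario and time names) are hypothetical examples, not values from a real application:

```python
import os

def harvest_file_name(batch_id, profile, scenario, time, load_method):
    """Build a OneStream batch file name in the
    ID-ProfileName-ScenarioName-TimeName-LoadMethod.txt format."""
    return "-".join([str(batch_id), profile, scenario, time, load_method]) + ".txt"

# Hypothetical example values; substitute your own workflow profile,
# scenario, and time names.
name = harvest_file_name(1, "Houston", "Actual", "2024M3", "Replace")

# The Harvest folder lives under the application's file share, e.g.
# ...\Applications\<ApplicationName>\Batch\Harvest
harvest_folder = os.path.join("Applications", "GolfStream", "Batch", "Harvest")
target_path = os.path.join(harvest_folder, name)
print(name)  # 1-Houston-Actual-2024M3-Replace.txt
```

Once files with names like this land in the Harvest folder, running the Data Management sequence picks them up and processes them through the matching workflow profiles.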
For Data Management in Oracle EPM Cloud, the steps are:
- Create the data file with a name in this format: ID-Location-DLR-ScenarioName-TimeName-LoadMethod.txt
- Load file to the folder: Inbox\batches\OpenBatch
- Create an open batch
- Execute the batch
- Schedule the batch either with the Data Management scheduling functionality or with any other scheduling tool driving a script
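The Oracle side can be scripted end to end with EPM Automate (login, uploadfile, runbatch are real EPM Automate commands). The sketch below only assembles the command sequence and prints it rather than invoking epmautomate; the URL, user, password, batch name, and file-name parts are all placeholder values:

```python
def openbatch_file_name(file_id, location, dlr, scenario, period, load_method):
    """Build an open batch file name in the
    ID-Location-DLR-ScenarioName-TimeName-LoadMethod.txt format."""
    return "-".join([str(file_id), location, dlr, scenario, period, load_method]) + ".txt"

# Hypothetical location, data load rule, scenario, and period names.
file_name = openbatch_file_name(1, "TexasLoc", "TexasDLR", "Actual", "Mar2024", "RR")

# EPM Automate command sequence (placeholders throughout); in a real
# scheduled job these lines would be run by a shell or PowerShell script.
commands = [
    "epmautomate login user@example.com MyPassword https://example.oraclecloud.com",
    f"epmautomate uploadfile {file_name} inbox/batches/openbatch",
    "epmautomate runbatch MyOpenBatch",
    "epmautomate logout",
]
for cmd in commands:
    print(cmd)
```

Wrapping these commands in a script is what makes the batch schedulable from an external tool; the built-in DM scheduler needs no script at all.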