OneStream Articles / Blogs / Perficient
https://blogs.perficient.com/category/partners/onestream/ (last updated Mon, 22 Apr 2024)

OneStream Splash 2024 Las Vegas – Let’s Meet
https://blogs.perficient.com/2024/04/08/onestream-splash-2024-las-vegas-lets-meet/ (Mon, 08 Apr 2024)

Splash, OneStream’s annual Global User Conference and Partner Summit, takes place May 20-23 in Las Vegas! OneStream is expecting 2,300+ attendees to converge on Las Vegas for four days of best practices, product updates, networking, and hands-on workshops with finance and industry experts from around the globe. Perficient is proud to be a Splash sponsor, and we’re looking forward to meeting you in person at Caesars Forum Convention Center.

We are excited to announce we’ve been selected to present two great customer success stories at this year’s conference. Saju Philips, CPM Director at Perficient, will share the stage with Jennifer Blankenship, Sr. Manager of Consolidations at Hussmann, while Charles Ramirez, CPM Director at Perficient, will co-present with Kevin Hill, Director of Finance at RaceTrac.

The official schedule is now available! Learn more about Perficient sessions:

    • From Concept to Reality: CPM Blueprint’s Journey to Go Live at Hussmann 
      Wednesday, May 22, 11:30 AM – 12:15 PM
      Hussmann Corporation began its implementation with the CPM Blueprint. This concept helped the client redesign its chart of accounts (COA) and prepare the project team for implementation. The client successfully replaced HFM and its customized FDM processes, ingesting data from 14+ ERPs and improving its monthly close.

 

    • Fueling Change: RaceTrac’s Drive to CPM Transformation
      Thursday, May 23, 10:15 AM – 11:15 AM
      RaceTrac Inc. is an American corporation that operates a chain of service stations across the Southern United States and is listed among the largest private companies in the United States, with annual revenue of $16.2 billion in 2022. RaceTrac embarked on an initiative to modernize its Corporate Performance Management systems by implementing a scalable OneStream Planning and Forecasting solution to replace the legacy Excel-based system in place today.

      The OneStream solution provides the ability to plan at the store level, covering 500+ stores plus Corporate. Store-level planning consists of Inside and Outside (Fuel by Grade) margin, with the ability to forecast sales by product category, manufacturer, and brand. Historical trends allow forecasting based on RaceTrac’s historical sales, with the ability to adjust at a detailed level for changes in the marketplace. Corporate costs are also forecasted and automatically allocated back to individual stores.

      The OneStream solution gave RaceTrac a standardized, centralized platform that reduced the time and resources required to complete the forecasting process. The result is more comprehensive and efficient reporting, which in turn supports better decision-making and allows RaceTrac to thrive in today’s changing market.

“Every year, Splash brings together finance professionals from across the globe looking to modernize with OneStream,” said Kyla Faust, Alliance Manager. “OneStream customers and prospects will learn how partners like Perficient leverage the power of the platform to bring transformational change. We are truly honored to have been selected to share the amazing Hussmann and RaceTrac success stories with fellow OneStreamers.”

Meet with subject matter experts from Perficient and learn how we’ve leveraged our extensive expertise in Corporate Performance Management to drive digital transformation for our customers. Our CPM practice is part of Perficient, a leading digital consultancy serving customers throughout North America with domain expertise in a wide array of technology platforms. If you have needs beyond CPM, we can help.

As a Diamond-level partner, the highest level in the OneStream partner ecosystem, Perficient can help you maximize your return on investment in OneStream. In addition to having attained Diamond status, Perficient, in partnership with Keyteach, is one of only two OneStream Authorized Training partners in the United States. We deliver instructor-led training from our state-of-the-art training facilities in Houston, TX.

If you’re not able to attend the event, but would like to learn more about any of the topics listed above or more about our OneStream practice, please reach out to us.

Identifying & Deletion of Orphan Members in OneStream via simple Excel hacks
https://blogs.perficient.com/2024/03/08/identifying-deletion-of-orphan-members-in-onestream-via-simple-excel-hacks/ (Fri, 08 Mar 2024)

Background

Orphan members in OneStream are members with no parent. Because of this, they are difficult to locate with the Search Hierarchy feature, since technically they do not sit anywhere in the hierarchy; they are not captured in the grid view either. Sometimes an organization may want to delete them because they are no longer required, or realign them to an appropriate location in the hierarchy. This blog focuses on simple Excel- and Notepad++-based techniques to build the list of orphan members and delete them (if required).

 

Tools Required

The technique shared in this blog requires two pieces of software:

  1. Microsoft Excel
  2. Notepad++

Notepad++ is open-source software available free of cost. The Microsoft 365 version of Excel is required, as this technique uses the TEXTSPLIT() function, which was rolled out for Excel 365. Alternatively, Excel for the web, which is always up to date and available with a free Microsoft account, can be used.

 

Case Study Showcase

Below is a screenshot of the UD1 hierarchy for products, organized by brand.

Ud1 Hierarchy Showcase

100+ members have become orphan nodes, as their relationships were removed to park them outside the hierarchy.

Ud1 Orphans

These members were created to incorporate the entire catalog but were never purchased or sold. The organization wants to delete them permanently to keep the data lightweight for better performance. Deleting 100+ members one by one would be a herculean task, wasting hours of effort. Let’s look at some simple hacks to build the list of orphan members. Once the list is populated, those members can be deleted or realigned as desired using the Load/Extract feature.

 

Population of Orphan Member list

Follow these steps to derive and populate the orphan member list:

  1. Go to Application > Load/Extract > Extract. In the dropdown, select Metadata. In the Metadata hierarchy, select the desired dimension. Click the Extract button, which opens a Save As dialog for saving the XML file.
    Ud1 Extract Xml
  2. Open the file in Notepad++ (right-click the XML file and choose Edit with Notepad++).
  3. Next, we will select the value of the name attribute for every member present in the dimension (which includes orphans too). In Notepad++, go to Search > Mark. Under Search Mode, select Regular expression. In the Find what textbox, enter the pattern member name="[\w\s_-]+" and click Mark All. This marks each member tag together with the value of its name attribute.
    (text matching the pattern is highlighted / marked with a red background color)
    Pattern reference: \w = any letter, digit, or underscore; \s = any whitespace; _ = an underscore; - = a hyphen; + = one or more occurrences of these characters
    Npp Mark Regex
  4. Click the Copy Marked Text button to copy all text matching the pattern.
  5. Open Microsoft Excel and create two worksheets named member and relationship. Paste the copied text into the member worksheet (as demonstrated below, starting in cell A2).
    Excel Member Xml Copypaste
  6. In cell B2, enter the formula =TEXTSPLIT(A2, """") [refer to the screenshot below], which splits the text on double quotes. The split spans three cells, since the member name value is enclosed in double quotes. Copy the formula down for all cells (use the fill handle). This produces a list of all members (including orphans) in column C.
    Excel Member Textsplit Formula
  7. Go back to Notepad++ and click the Clear all marks button. Place the cursor on the first line.
  8. Invoke the Mark dialog again as illustrated in step 3 and perform the same actions, this time with the expression parent="[\w\s_-]+" child="[\w\s_-]+", which selects the parent/child text from the relationship section of the XML (scroll to the end of the document and compare against the screenshot below to verify the selection).
    Npp Mark Regex Relationship
  9. Repeat steps 4 to 6, this time copy-pasting the text into the relationship worksheet as demonstrated in the screenshot below.
    Excel Relationship Textsplit Formula
  10. Column E contains the child member and column C the parent name, representing the hierarchy. Technically, every member (excluding the root) from the member worksheet should appear in column E of this list. A member mapped into an alternate hierarchy appears multiple times. An orphan member, however, never appears here, since it is missing from the relationships entirely. So we set up a VLOOKUP to carve out such instances, as illustrated below in cell F2 with the formula =VLOOKUP(C2,relationship!E:E,1,0), copied down to the last row.
    Excel Relationship Vlookup
  11. Apply a filter via Data > Filter and filter for #N/A values (i.e., values not found in the relationship worksheet), which denote orphan members.
    Excel Relationship Filter Na
  12. Below is the resulting list of orphan members (refer to the screenshot below).
    Orphan Member Excel List
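For readers comfortable with a script, the Excel/Notepad++ workflow above can be sketched in a few lines of Python: extract the member names and the parent/child pairs with the same regular expressions used in the Mark step, then take a set difference (the scripted equivalent of the VLOOKUP/#N/A filter). This is a hedged illustration, not an official OneStream utility; the XML snippet is an invented miniature of a metadata extract.

```python
import re

# Miniature stand-in for a OneStream metadata extract (hypothetical content).
xml = '''
<member name="TotalProducts" />
<member name="Brand_A" />
<member name="Brand_B" />
<member name="Obsolete_SKU" />
<relationship parent="TotalProducts" child="Brand_A" />
<relationship parent="TotalProducts" child="Brand_B" />
'''

# Same patterns as the Notepad++ Mark step: member names and parent/child pairs.
members = set(re.findall(r'member name="([\w\s_-]+)"', xml))
children = set(re.findall(r'child="([\w\s_-]+)"', xml))
parents = set(re.findall(r'parent="([\w\s_-]+)"', xml))

# An orphan is a member that never appears as a child. Subtracting parents also
# drops the root member, matching the "excluding root" caveat in step 10.
orphans = members - children - parents
print(sorted(orphans))  # ['Obsolete_SKU']
```

The set difference replaces the VLOOKUP: any member name absent from the relationship side is an orphan candidate.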

 

Deletion of Members

The orphan members identified in the steps above can be deleted (if required) with a simple Excel-based hack. Below are the steps:

  1. Extend the following column with a formula, as shown in the screenshot below, to generate member-deletion XML. Copy the formula down to the last row. Grab the formula text from below to avoid typing mistakes.
    ="<member name="""&C691&""" displayMemberGroup=""Everyone"" action=""delete""></member>"
    Excel Member Deletion Xml
  2. Create a new file in Notepad++ and copy the XML header and footer from the exported XML file, as shown in the screenshot below; this serves as the skeleton for the deletion lines.
    Xml Bare Bones
  3. Copy-paste the deletion XML lines generated via the Excel formula for all orphan members into this XML file, between the <members></members> tags, as shown in the sample screen clipping below for 4-5 members.
    Xml Deletion Lines
  4. Save the file in Notepad++.
  5. Import the file into OneStream via the Load/Extract menu, which deletes those orphan members.
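The Excel formula in step 1 is plain string concatenation, so the same deletion lines can also be generated with a short script. A hedged sketch: the member names below are illustrative placeholders, and the displayMemberGroup and action attributes simply mirror the formula shown above.

```python
# Generate <member ... action="delete"> lines, mirroring the Excel formula:
# ="<member name="""&C691&""" displayMemberGroup=""Everyone"" action=""delete""></member>"
orphans = ["Obsolete_SKU", "Retired_Brand"]  # hypothetical orphan list

lines = [
    f'<member name="{m}" displayMemberGroup="Everyone" action="delete"></member>'
    for m in orphans
]
print("\n".join(lines))
```

The printed lines are pasted between the <members></members> tags of the skeleton file exactly as in step 3.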

 

Precautions

  • Always back up the entire dimension hierarchy via the Load/Extract menu by exporting it to an XML file. This backup XML can be used to revert the deletion, provided no further changes were made to the hierarchy after the deletion.
  • Plenty of online backup storage options are available at a low price. It is safe to upload multiple versions of the hierarchy backup XML file every time such modifications are made; they might come to the rescue in the future.
  • It is wise to double-check the member list being deleted. OneStream does not provide an Undo button like Excel or Word.
  • Do not forget to test things in the Development application first and then deploy to Production.

 

Additional Notes

  • This approach assumes that member names consist of letters, numbers, underscores, and hyphens. Any other character used in a name must be added to the regular expression search pattern.
  • Orphan member deletion will fail if any data is found loaded against the member. Ensure this is not the case before running the deletion XML.
  • The deletion trick is generic and works for any member (whether it is an orphan or not).
OneStream Security: How to Check Security Groups Assigned/Used or Not Used
https://blogs.perficient.com/2024/02/29/onestream-security-groups-users-load-extract-option-xml-find-and-search/ (Thu, 29 Feb 2024)

During the implementation of a OneStream application, implementation consultants and administrators create different security groups based on the client’s security access requirements. However, as the project progresses from one phase to another, some of these security groups may become obsolete or redundant.

Security groups are assigned to different objects in the application to enable users to access certain sets of data or reports, or to execute tasks, depending on the data access their work requires.

This article aims to help you identify which security groups are assigned and used, and which are not, within the OneStream application.

    • Log in to the OneStream application as an Administrator.
    • Navigate to the System tab.
    • Click Security – Find a Group/User.
    • Click “Show All Groups in the Selected Group” – when you select a group, this shows which groups are assigned to which child/parent group.
    • Or click “Show All Parent Groups for Selected User” – this shows which child/parent groups a user is assigned to.

If a security group is not assigned to a user or a parent group, it denotes that the group is not being utilized in the application.

Security groups are usually assigned to various objects within the OneStream Application, as listed below:

    1. Security Roles (Application/Security Roles)
    2. Dimensions (Application/Dimension Page – All Dimensions – Entity, Account, Scenario, Flow and User Defined (1-8)) 
    3. Cube Properties (Application/Cube/Cube Properties)
    4. Cube Data Access (Application/Cube/Cube Data Access)
    5. Workflow Profiles (Application/Workflow Profiles)
    6. Confirmation Rules 
    7. Certification Questions
    8. Data Sources
    9. Transformation Rules
    10. Form Templates
    11. Journal Templates
    12. Cube Views
    13. Dashboards

Security can be defined using several methods, such as Security Roles, Entity Security, Cube Security, and Workflow Security. However, to run a report, a security group assignment must be applied to Cube View Profiles and Dashboard Profiles. Confirming a security group assignment requires the Administrator to check all of the above, and it can be time-consuming to find where a group is assigned. To simplify this search, the Administrator can use the following workaround:

    1. Log in to the application.
    2. Click the Application tab.
    3. Click the Tools section.
    4. Click the Load/Extract option.
    5. Click Extract and select an option from the drop-down list.
    6. Click Dimension and select a specific dimension (e.g., Entity, or any other metadata object such as Cube, Account, or UD dimensions).
    7. Click the Extract option on the header bar and save the file to your computer.
    8. Open the saved XML file in an editor (Notepad or the Notepad++ utility). Tip: uncheck the Word Wrap option.
    9. Simply search for the group in question.
    10. If you find the group used in the XML file as described below, the security group is being utilized. Verify the assignment and ensure that it is correctly assigned to users and controls data access.
      • AccessGroup="Everyone" (used in Cube View Profiles, Dashboard Profiles, Data Sources, Transformation Rule Profiles)
      • maintenanceGroup="Everyone" (used in Cube View Profiles, Dashboard Profiles, Data Sources, Transformation Rule Profiles)
      • displayMemberGroup="Everyone" (Entity, Account, Flow and UD Dimensions)
      • readDataGroup="Everyone" (used in Entity)
      • readDataGroup2="Nobody" (used in Entity)
      • readWriteDataGroup="Everyone" (used in Entity)
      • readWriteDataGroup2="Nobody" (used in Entity)
      • ManageDataGroup="Administrators" (used in Scenario)
      • CalculateFromGridsGroup value="Everyone" (used in Scenario)

Each assigned security group will appear in the group assignments above. If a group does not exist in any of the thirteen application setups listed above, one can safely assume that the security group in question is redundant and not being used. Before deactivating a group, make sure it is unassigned from users, and then take the proper action to deactivate it.
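The manual search in steps 8-10 can also be approximated with a small script that scans the extracted XML for security-group attributes. This is a sketch under assumptions: the attribute naming pattern (ending in "Group", optionally followed by a digit) is taken from the list above, and the XML fragment is invented for illustration.

```python
import re

# Hypothetical fragment of an extracted metadata XML file.
xml = '''
<entity name="E100" readDataGroup="FIN_Readers" readWriteDataGroup="FIN_Writers" />
<entity name="E200" readDataGroup="Everyone" readWriteDataGroup="FIN_Writers" />
'''

def find_group_usage(xml_text, group):
    """Return (attribute, value) pairs where the given security group is assigned."""
    hits = re.findall(r'(\w+Group\d?)="([^"]+)"', xml_text)
    return [(attr, val) for attr, val in hits if val == group]

usage = find_group_usage(xml, "FIN_Writers")
print(usage)
# A group with no hits across all extracted objects is a candidate for cleanup.
```

Running this for each candidate group across every extracted dimension, cube, and profile file gives a quick used/unused verdict before any deactivation.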

Note:

Please follow these guidelines when defining security groups:

  1. Use a standard naming convention that includes a prefix to identify groups specific to your client’s needs. For example, use WFE and WFC to denote Workflow Execution profiles and Certification profiles, respectively.
  2. Add your client’s company name (e.g., a three-letter abbreviation) as a prefix to ensure that the security group name is unique.
  3. Exercise caution when using the Extract feature to store data on your computer.
  4. Do not click the Extract and Edit button while extracting data, as this may result in saving an XML file with unintended changes.
How to Post a Multiperiod Journal in OneStream
https://blogs.perficient.com/2023/12/12/how-to-post-a-multiperiod-journal-in-onestream/ (Tue, 12 Dec 2023)

One of our client’s requirements was to post a multiperiod journal. In the Consolidation process this kind of request is uncommon; however, this request came from a Planning perspective. In the following post, I describe the steps to create a multiperiod journal with OneStream XF.

Create a Journal Template

Navigate to and select Application -> Data Collection -> Journal Template and create a journal template.

Picture1

The first step is creating a Journal Group to better organize the journals. Click the two blue dots to create a group.

Picture2

Add a name and description, define the security (Access Group and Maintenance Group), and save.

Picture3

Then, we need to create the journal template. Click on the “Create Journal template” icon.

Picture4

Add a name and description, and fill out the journal template depending on the requirements. Choose the journal template type. “A Journal template can either be a Standard or Auto Approved. If it is auto-approved, a user can create a Journal from the template with limited editing ability, meaning permission to change name, description, and so forth.”1

Picture5

For illustration purposes, we left the journal template Type as “Auto Approved.”

To add a dimension, go to the Point of View section. Select the dimension you want to add, and click on the ellipsis.

Picture6

Select (Default) member and click “OK,” and repeat for each dimension.

Picture7

After the selections, you can see the dimensions displayed in each journal line.

Picture8

Picture9

Once we have created the Journal template, we must create a Journal Profile. For that, click on the three dots icon.

Picture10

Add a name and description, define the security (Access Group and Maintenance Group), and save.

Picture11

Once we have created the Journal Template profile, we need to add the journal template to the Journal profile. Select the Profile and click the “Manage Profile Members” icon to do so.

Picture12

In the profile builder, select the group you want to add to the Profile and click “Add item.”

Picture13

Once you have added the group, click “OK.”

Picture14

Now, you will see the journal template in the Profile.

Picture15

Navigate to and select Application -> Workflow -> Workflow profile and select the Workflow “WF” where you want to create the journal template profile.

Picture16

Once you select the WF, activate the step by selecting TRUE in “Profile Active.”  In the Setting section of the WF, assign the Journal Template Profile Name to the WF on the “Journal Template Profile Name” line in the Journal Settings section.

Picture17

Create a Journal CSV Template

Open an MS Excel file. In the first two rows of column A, create two Row Type parameters specifying the Header and the Detail: !RowType (H=Header) and !RowType (D=Detail). These tag the corresponding rows with H or D, identifying the header and detail information in the CSV file.

Picture18

Then we need to enter the journal column headers. The required headers are the following (make sure to put each title in its own cell):

JournalName, OriginatingTemplateName, JournalDescription, JournalType, JournalBalanceType, IsSingleEntity, EntityMemberFilter, ConsName, WFProfileName, WFScenarioName, WFTimeName, and CubeTimeName.

Picture19

Add the detail in row two, and enter the Journal Detail Headers. The required Journal Detail Headers are:

JournalName, CubeName, EntityName, ParentName, AccountName, FlowName, ICName, all UDNames, DebitAmount, CreditAmount, and LineDescription.

Picture20

The file will identify the rows depending on the first column tags. H = Header or D = Detail.

Picture21

NOTE: You must add an H Row (Header) for each journal you want to post.

In this case, I want to affect two periods (2021M6 and 2021M7).

Picture22

Once you have created the MS Excel file, save it as CSV (Comma Delimited).

Picture23
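Since the result is just a comma-delimited text file, it can also be generated programmatically. Below is a hedged sketch of the multiperiod layout described above: two H rows (one per period, as the NOTE requires), each followed by its D rows. The column set is deliberately trimmed for readability, and the journal, scenario, and account values are invented placeholders; a real file needs the full header and detail columns listed earlier.

```python
import csv
import io

# Trimmed column set for illustration; a real file needs the full header and
# detail columns listed above (JournalName, CubeName, EntityName, and so on).
rows = [
    ["!RowType(H)", "JournalName", "WFScenarioName", "WFTimeName"],
    ["!RowType(D)", "JournalName", "AccountName", "DebitAmount", "CreditAmount"],
    # One H row per journal/period, each followed by its D rows.
    ["H", "Jrnl_Jun", "Budget2021", "2021M6"],
    ["D", "Jrnl_Jun", "60100", "1000.00", ""],
    ["D", "Jrnl_Jun", "30000", "", "1000.00"],
    ["H", "Jrnl_Jul", "Budget2021", "2021M7"],
    ["D", "Jrnl_Jul", "60100", "1000.00", ""],
    ["D", "Jrnl_Jul", "30000", "", "1000.00"],
]

# Write to an in-memory buffer; swap in open("journal.csv", "w", newline="")
# to produce the actual file for loading.
buf = io.StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
```

Each H row becomes a separate journal on load, which is how a single file posts to both 2021M6 and 2021M7.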

Load the Journal via WF

Go to OnePlace and select the WFProfile, Scenario, and Time member for which to load the journal. In this case: FIN_Admin, Budget2021, 2021.

Picture24

Create the journal using the .csv file.

Picture25

Select the journal CSV file to load (make sure the file-type filter is set to “All Files”) and click Open.

Picture26

Once you load the .csv file in the journal step, OneStream creates two journals.

Picture27

Click the check box to select all journals, and click Post.  Picture28

You will be able to post all the journals with a single post.

Note: Depending on security, you could create, upload, and post the journal yourself, or different people can manage these steps separately, e.g., one person uploads and another posts.

Picture29

Click “OK” to confirm the posting.

Picture30

Create a Cube View (“CV”) or Quick View to validate the data. In this case, I used a CV.

Picture31

Reference

  1. Design and Reference > Data Collection > Journal Templates > Journal Template Properties
Preparing for Direct Connection to OneStream
https://blogs.perficient.com/2023/11/03/preparing-for-direct-connection-to-onestream/ (Fri, 03 Nov 2023)

One of the most common, if not the most common, requests when setting up OneStream is a direct connection from an ERP, such as Oracle, PeopleSoft, or NetSuite, to OneStream so that data can be uploaded automatically on a regular basis. Below is a link from one of my fellow Perficient consultants that describes the steps to set up a direct connection to OneStream:

Oracle EBS Direct Connect Configuration in OneStream / Blogs / Perficient

This blog focuses on the steps that can, and should, be done before setting up the direct connection to OneStream and creating the connector business rule. It is primarily focused on an ODBC/SQL connection. If you follow these steps, the direct connection can be set up in the shortest time possible, with the fewest consulting hours and dollars spent:

  1. Create a query from your ERP using Microsoft SQL Server or another program to get a copy of the data that will go into OneStream.
  2. Check that the data ties to the numbers you are expecting in the ERP.
  3. Set up the direct connection to OneStream.
  4. Create and run a SQL data adapter in OneStream and tie the data to the ERP.

Steps 1 and 2 do not require a OneStream consultant and should be done by someone who is an expert with that ERP. Once these steps are done, you are ready to set up the connection to OneStream with a connector business rule, transformation rules, and a workflow, and then load the data (see steps 4 to 8 of the blog linked above).

Create a query from ERP

If the ERP database supports SQL, then the query would start with a SELECT statement that picks the fields needed and possibly some JOIN statements FROM a table or multiple tables with a WHERE clause.

For example, if all of the data is in 1 table the query might look like this:

SELECT
  Entity,
  Account,
  Period,
  Department,
  Project,
  Amount
FROM
  Gl_table
WHERE
  Period = 'Jan 2023'

The results of a query like this may produce a table similar to this:

Entity   Account   Period     Department   Project   Amount
111      10000     Jan 2023   200          155        1,000.00
111      30000     Jan 2023   200          NA        -1,000.00

Most likely your ERP has multiple tables, in which case the tables must be joined. The queries I have seen used multiple inner, outer, and left joins, which is why it is so important to have someone who knows your ERP’s tables create the query.
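To make the join idea concrete, here is a small self-contained sketch using an in-memory SQLite database in place of a real ERP. The table and column names are invented for illustration; an actual ERP schema will be far more involved, which is exactly why the query belongs with someone who knows that schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two hypothetical ERP tables: GL balances and an entity lookup.
cur.execute("CREATE TABLE gl_balances (entity_id INTEGER, account TEXT, period TEXT, amount REAL)")
cur.execute("CREATE TABLE entities (entity_id INTEGER, entity_code TEXT)")
cur.executemany("INSERT INTO gl_balances VALUES (?, ?, ?, ?)", [
    (1, "10000", "Jan 2023", 1000.0),
    (1, "30000", "Jan 2023", -1000.0),
])
cur.execute("INSERT INTO entities VALUES (1, '111')")

# The kind of joined query a connector might run against the ERP.
rows = cur.execute("""
    SELECT e.entity_code, g.account, g.period, g.amount
    FROM gl_balances g
    INNER JOIN entities e ON e.entity_id = g.entity_id
    WHERE g.period = 'Jan 2023'
    ORDER BY g.account
""").fetchall()
print(rows)  # [('111', '10000', 'Jan 2023', 1000.0), ('111', '30000', 'Jan 2023', -1000.0)]
```

The INNER JOIN resolves the internal entity_id to the reporting code (111) seen in the sample output table above; outer and left joins follow the same pattern when some rows lack a match.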

Have someone create the necessary query from your ERP using Microsoft SQL Server or another program that can query the ERP. This should be done either by someone in your IT department who is very familiar with your ERP’s tables or by a consultant who knows them. Someone in your IT department is not only the least expensive option but is also often the best qualified to create the query.

 

Check data

If the data file is small enough, have it sent as a comma-delimited file and compare it to the trial balance or reports currently used in your ERP. The data needs to tie exactly to your reports so you can be sure the data going to OneStream is correct. Then, if there is ever a difference between OneStream and your ERP, you know it is not the query but something in the OneStream setup. If you skip this step, how can you tell whether differences come from the query or from the OneStream setup?

 

Set up Direct Connection

There are several ways to set up the direct connection:

  1. Named connection: If your OneStream application is on-premises, your setup would be similar to the one in the blog referenced above. If OneStream is in the cloud, OneStream Support has to set up the connection, since they are the ones with access to the server. Send the driver for your ERP and the setup information to OneStream Support.
  2. Smart Integration Connector (SIC): (available only for version 7.3 and higher) This option requires OneStream software to be installed on a separate virtual machine of your own to communicate and transfer data between your ERP and OneStream. The advantage of the Smart Integration Connector is that you control the login and password for the connection to OneStream.
  3. REST API connection: A Representational State Transfer (REST) API is the preferred connection method for some clients. This requires someone to write custom business rules for the connection to be set up successfully.

Create SQL data adapter

Setting up the SQL data adapter takes very little time, and the adapter can even be placed in a dashboard so the data can be downloaded. Here are the steps in OneStream:

Create a blank dashboard maintenance unit:

Application > Dashboards > Dashboard Maintenance Units

I created a new dashboard maintenance unit, called it “Test_Data_adapter”, and it automatically added all the object types needed for a dashboard:

Test Dashboard Maintenance Unit

Add a Data adapter for your connection (this example has a named connection called “Netsuite”):

Data Adapter Object

Data Adapter Icon

Sql Data Adapter

Copy your SQL into the data adapter:

Sql Query Copy

Run the SQL:

Run Sql

Create a Grid View component:

Dashboard Component

Dashboard Component Button

Grid View Component

Add the SQL data adapter:

Add Data Adapter

Create a new dashboard and add the Grid View component:

Create Dashboard

Add Grid View

View dashboard:

View Dashboard

Export the data so that it can be tied:

Gl Transactions Sample

This setup should take less than half an hour. Once this data has been checked, you’re ready to create the connector business rule and data source.

Key Takeaways from Splash 2023 – Part 2
https://blogs.perficient.com/2023/05/10/key-takeaways-from-splash-2023-part-2/ (Wed, 10 May 2023)

Last week we published the first blog post in this two-part series sharing highlights of Splash, OneStream’s Global User Conference and Partner Summit! We enjoyed four days of best practices, product updates, networking, and hands-on workshops with finance and industry experts!

We’ll pick up where we left off with comments from the Perficient team about highlights from our time at this inspirational event.

Hector M.

Attending the OneStream Splash conference was an exciting, valuable, and informative experience. OneStream’s unified platform streamlines financial consolidation, reporting, and planning processes while providing valuable insights. The conference provided a roadmap for success by showcasing successful OneStream implementations and highlighting how to avoid common pitfalls. It was also an opportunity to connect with like-minded professionals as well as OneStream and industry experts. Finally, the conference offered a glimpse into the future of OneStream’s platform.

At this conference, there were several technical sessions for those looking to improve their OneStream application. One of the highlights of the conference was learning about how OneStream can be implemented for those looking to expand their ESG reporting capabilities. As ESG reporting becomes increasingly important for businesses, many organizations are seeking to effectively manage the process. In a presentation titled How to implement ESG and Align with Financial, attendees were able to learn how OneStream Software can help conquer ESG reporting. Here are the three key takeaways from this informative presentation:

OneStream Software simplifies ESG reporting

ESG reporting can be a complex and time-consuming process. The platform is designed to help organizations manage ESG data and streamline the reporting process. OneStream Software allows businesses to collect data from various sources and auto-populate the necessary reports. This can save businesses time and resources, while ensuring accuracy in reporting.

OneStream Software provides transparency

Transparency is a key aspect of ESG reporting, as stakeholders expect businesses to provide clear and accurate information about their environmental, social, and governance practices. OneStream Software provides transparency by allowing organizations to track and report on ESG metrics in real-time.

OneStream Software improves decision-making

It also provides businesses with the tools and insights they need to make informed decisions about their operations. The platform allows organizations to track and analyze data over time, identifying trends and patterns that can inform strategic decision-making.

Overall, attending Splash was a valuable experience that provided knowledge, insights, and networking opportunities. Attendees connected with like-minded professionals and OneStream experts, expanding their networks and learning from leaders across the industry.

Jason G.

It was a great week at the OneStream Splash 2023 Conference at National Harbor, just outside Washington, DC! After a three-year hiatus, having the Splash Conference back in full swing was very exciting. My Perficient colleagues and I enjoyed meeting each other in person for the first time and conversing with potential and current customers. The annual user conference brings together OneStream users, partners, and experts from around the world for three full days of learning, networking, and fun. The most valuable aspects of OneStream Splash are the opportunities to connect with other OneStream users and experts, share experiences, learn from each other, and build lasting relationships with individuals in the CPM realm.

My favorite breakout session was Analytic Blend – Simplifying your OneStream Application while making the granular Details Available When Needed. OneStream Analytic Blend is a powerful tool that allows an organization to combine data from multiple sources, including OneStream cube data and external sources such as ERP systems, data warehouses, and other systems. The session covered the importance of using Analytic Blend/Services to home in on granular-level detail, such as product/SKU data, and dove into some best practices for building Analytic Blend services.

I want to highlight a few of the key benefits of using the analytical services that OneStream provides:

First, it enables organizations to perform complex analytics on blended data, such as multidimensional analysis, ad-hoc reporting, and visual data exploration, all while maintaining a single source of truth system. This enables organizations to uncover insights and identify opportunities for improvement.

Second, unlike some ERP and CPM solutions, OneStream supports advanced data modeling capabilities, including creating alternate hierarchies, mappings, and calculations. This makes it easier to manage complex data structures and perform calculations on blended data.

With the ability to gain insight at this granular level of detail and report on KPIs that would otherwise be nearly impossible to produce in a timely manner, decision-makers can view data in real time and make more informed decisions, which can lead to improved performance and increased profitability.

Lastly, as AI and machine learning expand across companies, the analytical blend services that OneStream provides serve as a foundation for further enhancements.

Overall, OneStream Splash is a must-attend event for anyone who uses OneStream or is interested in learning more about CPM. The conference offers a unique combination of education, networking, and fun, all in a beautiful location. If you want to take your skills and knowledge to the next level, consider attending OneStream Splash 2024 in Las Vegas. We hope to see you there!

]]>
https://blogs.perficient.com/2023/05/10/key-takeaways-from-splash-2023-part-2/feed/ 0 335158
Key Takeaways from Splash 2023 – Part 1 https://blogs.perficient.com/2023/05/03/key-takeaways-from-splash-2023-part-1/ https://blogs.perficient.com/2023/05/03/key-takeaways-from-splash-2023-part-1/#respond Wed, 03 May 2023 20:57:35 +0000 https://blogs.perficient.com/?p=334451

So much time is spent planning for Splash, OneStream’s Global User Conference and Partner Summit and then it feels like it’s gone in the blink of an eye! Four days of best practices, product updates, networking, and hands-on workshops with finance and industry experts – what a fantastic event!

Onestream Splash


The Perficient team was in full force at Splash. We had two speaking sessions, one virtual and one in-person. This year we introduced OneStreamers to our partnership with RioBotz and their BattleBots competitor, Minotaur. Show attendees were able to engage in battle with Minotaur and its rival, and seven lucky winners of Hexbug Rivals Kits were able to take the battle home! As we look back on our time at Splash, the team wanted to share their thoughts with our readers. Whether or not you had a chance to attend, I’m sure you’ll enjoy these highlights from the team.

 

Racheal C.

I really loved the venue for Splash 2023!  The Gaylord Hotel and Convention Center at National Harbor was positioned in a lovely location with views of our nation’s capital.  Food, entertainment, and additional accommodations were within walking distance.

Sharing the HNI success story at Splash with two finance professionals from HNI, Terra Simpson and Jennifer Curry, was a highlight. More than 100 attendees packed the room to learn how HNI used the OneStream RCM Account Reconciliations solution, along with an enhancement developed by Perficient, to identify Cash Flow adjustments using Reconciling Item Types and automate the posting of these adjustments to Actuals.

My favorite takeaways from the week came from Designing for Performance and System Diagnostics. In Designing for Performance, the team laid out why the initial dimension design is so crucial. Potential Data Unit size is determined by your dimensions and their members. They made a strong case for the use of Extensibility to optimize performance. At the other end of the spectrum, System Diagnostics highlighted ways to identify opportunities to improve application performance. The new System Diagnostics MarketPlace Solution deployed in early 2023 highlights the Top Application Metrics to focus on for improved performance. Guess which was number 1? Data Unit Size! I could talk about this for days… Actually, I am talking about this for 2 days in May and would love to see you in the upcoming OneStream Level 2 Financial Model Rules course on May 23 and 24.

Steven D.

As a first-time conference attendee, I found it extremely informative and an overall great experience. I would recommend attending at least once, whether from a developer or customer standpoint. There is a variety of sessions and topics that will satisfy everyone from beginners to advanced users. The opening keynote was a highlight for me and a great way to kick off the week, as I was able to reminisce about the growth of OneStream and be inspired by future plans.

A few memorable takeaways come to mind as I think about Splash:

  • While demo sessions sounded like they would be great for technical partners, I found that attending customer sessions provided better value because they presented realistic challenges, and some discussed the thought process behind designing their applications the way they did.
  • Attending simple sessions, like one on Hybrid/Entity Aggregation, a feature that has been available for a while now, gave me ideas on how to design future applications to drive performance and efficiency. With a simple Scenario setting configuration, you’re able to set up a comparison report that can copy and aggregate data for selected accounts without having to write any business rules.
  • The Omni Hotels use case emphasized how important it is to stress/performance test the system before going live, especially when the user base will be 100+. Performing unit or UAT testing with a handful of concurrent users and then opening up the application to 20 concurrent users will yield very different results.

Ryan M.

First off, the venue was great. Everything was in one place and very convenient. The evening event Perficient held at Harbor Social was a big success and a lot of fun. What I liked most about Splash was the combination of seeing old colleagues, meeting new customers, and bonding with the Perficient team. Working in a remote environment, you lose that face-to-face contact, and this helped to reinforce relationships.

Regarding Splash content, I really liked the idea of incorporating AI and machine learning into the OneStream ecosystem. I see it not only from an end-product perspective, i.e., more dynamic and relevant forecasting, but also from the perspective of how it can be used to make our jobs more efficient. I also learned a lot about ESG reporting. I can see how this will generate a lot of opportunities for us and will continue to advance over time as this reporting becomes more defined and incorporated into corporate reporting. They noted how companies will begin to report and forecast ESG along with their quarterly financials, eventually blending the two together so they can glean opportunities to improve their ESG ratings over time.

Lastly, I found the discussion of the future of the OneStream ecosystem very informative. Moving to an open marketplace where partners can share and provide application enhancements presents an opportunity for Perficient to differentiate ourselves. All in all, I had a great time at Splash, learned a lot, and look forward to going again.

]]>
https://blogs.perficient.com/2023/05/03/key-takeaways-from-splash-2023-part-1/feed/ 0 334451
Uploading Flat Files to a Custom Table in OneStream https://blogs.perficient.com/2023/04/25/uploading-flat-files-to-a-custom-table-in-onestream/ https://blogs.perficient.com/2023/04/25/uploading-flat-files-to-a-custom-table-in-onestream/#respond Tue, 25 Apr 2023 16:08:35 +0000 https://blogs.perficient.com/?p=333542

Table Data Manager:

OneStream has a Marketplace Solution called Table Data Manager that allows users to create custom tables where data can be loaded and updated.  See also this Perficient blog post on how to get started with Table Data Manager:

OneStream – Table Data Manager / Blogs / Perficient

Table Data Manager has a function/button to import data into custom tables. However, the import function only accepts files in XML format. In this blog post I will show you how to load data to a custom table using a flat file, such as a comma delimited (.csv) file, instead of having to use XML format.

Create a Table:

Using the Table Data Manager instructions from the blog linked above I created this XFC_Test table:

Xfc Test Table

Flat File loaded to OneStream:

I created this flat file and saved as a comma delimited file:

Flat File

I then loaded the file to File Explorer under File Share/Applications/NHLBI_Blog_copy/Batch/Harvest:

File Explorer Test File

Note: NHLBI_Blog_copy was the name of my application.  Everything else would stay the same.  The file was loaded here so that we would not have to worry about whether OneStream has access to the folder.

Business rule to upload data:

I created this business rule to upload the comma delimited file:

Business Rule For Uploading

 

Then I set it up as an Extensibility rule with an unknown type so it could be run from the Business Rules page:

Business Rule Type And Unknown Case

I developed this code using the Custom Table Load (Delim) Snippet in OneStream:

Business Rule Snippet For Comma Delimited File

To get the file path (line 34) I used the GetFileShareFolder snippet:

Getfilesharefolder Snippet
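The business rule itself appears only as screenshots above. As a language-neutral illustration of the same pattern (the table name, columns, and inline file are hypothetical stand-ins), here is a sketch in Python that reads a comma delimited file and replaces the contents of a table, mirroring the snippet’s replace-style load:

```python
import csv
import io
import sqlite3

# Hypothetical stand-ins: the custom table XFC_Test and a small inline file
# in place of the .csv sitting in the File Share Harvest folder.
flat_file = "Department,Account,Amount\nSales,4000,100.50\nIT,5000,250.00\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE XFC_Test (Department TEXT, Account TEXT, Amount REAL)")

reader = csv.reader(io.StringIO(flat_file))
next(reader)  # skip the header row
rows = [(dept, acct, float(amt)) for dept, acct, amt in reader]

# Load method "replace": clear the existing rows, then insert the file contents.
conn.execute("DELETE FROM XFC_Test")
conn.executemany(
    "INSERT INTO XFC_Test (Department, Account, Amount) VALUES (?, ?, ?)", rows
)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM XFC_Test").fetchone()[0])  # → 2
```

The parameterized INSERT also sidesteps any quoting issues in the file’s values, which is the same reason the OneStream snippet maps file columns to table columns explicitly.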

Uploading Data:

From the business rule click on Run:

Execute Business Rule

 

The business rule uploads the data and you will get a pop-up message that the business rule ran successfully:

Load Table Data Executed

Go back to Table Data Manager and View the data:

View Data

Tips and Tricks:

The column names in the business rule must match the column names in the custom table exactly.  Lines 39, 40 and 41 had the exact same column names in brackets [ ] as the column names in the custom table:

Column Names In Business Rule

The order of the columns in the business rule must match the order of the columns in the flat file.  For example, the first column referred to in the business rule was Department and the 1st column in the flat file was also Department:

Flat File Column Order

The business rule will replace all the data since the load method is set to replace:

Business Rule Load Method

 

 

]]>
https://blogs.perficient.com/2023/04/25/uploading-flat-files-to-a-custom-table-in-onestream/feed/ 0 333542
Linked Cube Views in OneStream and Drilling Down to Source Data https://blogs.perficient.com/2023/03/21/linked-cube-views-in-onestream-and-drilling-down-to-source-data/ https://blogs.perficient.com/2023/03/21/linked-cube-views-in-onestream-and-drilling-down-to-source-data/#respond Tue, 21 Mar 2023 16:05:28 +0000 https://blogs.perficient.com/?p=328300

Perficient was engaged with a client that wanted to be able to drill down from a summary report into the details in their Thing Planning database.  This was done in OneStream using linked cube views and bound parameters.  The project team created 2 separate drill downs:

  1. From summary information to detail within the cube.
  2. From detail within the cube to individual transactions within the Thing Planning database.

The project team set this up using:

  1. Bound parameters
  2. Linking cube views
  3. Creating a dashboard with both cube views
  4. Adding another dashboard with the detail from Thing Planning

Bound Parameters:

Both drill downs were done using Bound Parameters in the cube view. Bound Parameters are set up in the cube view under General Settings/Navigation Links:

Parent Cube View Nav Links

Bound Parameter names are entered under each dimension that needs to be drilled down on.  Any combination of numbers, letters and underscores (_) can be entered as a bound parameter name (No special characters or spaces).   For example, the project team came up with “ParentClick_UD1” for the UD1 dimension:

Parent Cube View Ud1 Parameter

Linking Cube Views:

Select the cube view to link to in the Linked Cube Views section:

Parent Cube View Linked Cube View

The Bound Parameter is used as a parameter in the linked cube view by enclosing the name with pipes and exclamation points (|!Name!|). The project team put the parameter on the Point of View (POV) in the linked cube view:

Detail Cube View Pov

We used the variable |CVUD1| in the total and detail rows:

Detail Cube View Total Row

Detail Cube View Rows

This allows the UD1 member to be updated in 1 place (point of view) instead of on each row.
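Conceptually, OneStream resolves each |!Name!| token in the linked cube view to the member captured by the bound parameter when the user clicks a cell. A minimal Python sketch of that substitution step (the POV string and member name are purely illustrative):

```python
import re

def resolve(text, params):
    # Replace each |!Name!| token with the value bound to that parameter name.
    return re.sub(r"\|!(\w+)!\|", lambda m: params[m.group(1)], text)

# Illustrative POV definition from the linked cube view.
pov = "U1#|!ParentClick_UD1!|"

# Value captured from the cell the user right-clicked in the parent cube view.
bound_params = {"ParentClick_UD1": "Product_A"}

print(resolve(pov, bound_params))  # → U1#Product_A
```

Because the token sits on the POV rather than on each row, a single substitution updates every row of the detail cube view at once, which is exactly the benefit described above.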

Run the Summary Cube View and it looks like this:

Summary Cube View Report

Right-click on a cell and Navigate to the linked cube view (Detail_Cube_View):

Summary Cube View Report Right Click

To see the detail in the cube:

Detail Cube View Report

Parent to Detail in a dashboard:

Attach the cube views in Cube View dashboard components:

Parent Cv Component

Detail Cv Component

Put the cube view components into separate dashboards:

Parent Cv Dashboard

Detail Cv Dashboard

Set the detail cube view to refresh and redraw the detail dashboard:

Refresh Detail Cube View

Combine the 2 dashboards into 1 dashboard:

Combined Parent Detail Dashboards

Run the combined dashboard:

Parent And Detail Dashboard

The detail cube view will now update based on the number that the user clicks on in the parent cube view/ dashboard:

Combined Parent Detail Dashboards Refresh

Adding detail from Thing Planning:

Set up the bound parameters for Scenario, Time, and UD1 on the Detail Cube View:

Detail Cube View Bound Parameters
The project team copied the SQL Table Editor Register component from Thing Planning to another dashboard:

Sql Table Editor Register

Sql Table Editor Register Copy

Sql Table Editor Dashboard

The Bound Parameters from the Detail Cube View were used in the Where clause of the SQL Table Editor:

Where Clause

Set the Thing Planning dashboard to refresh when the Detail Cube View component is clicked on:

Detail Cube View Refresh

Note: 3_SourceDrill contains the 3b_Source_Grants dashboard and they will both be refreshed.

When clicking on a number in the detail cube view the detail appears in the table below it:

Drilldown Grants

]]>
https://blogs.perficient.com/2023/03/21/linked-cube-views-in-onestream-and-drilling-down-to-source-data/feed/ 0 328300
Pivot and Submit Data through a Spreadsheet using Table Viewer in OneStream https://blogs.perficient.com/2023/03/14/pivot-and-submit-data-through-a-spreadsheet-using-table-viewer-in-onestream/ https://blogs.perficient.com/2023/03/14/pivot-and-submit-data-through-a-spreadsheet-using-table-viewer-in-onestream/#respond Tue, 14 Mar 2023 17:50:00 +0000 https://blogs.perficient.com/?p=327912

Thing Planning Solution

Perficient was recently engaged with a client that was implementing a Thing Planning solution to manage thousands of line items for their planning process.  The client wanted to be able to look at a small set of data in a form and submit changes to just those rows without having to go through the entire workflow process.  Their items also had 7 years of data, with a row for each year, and the years needed to be pivoted into columns.

Here’s a sample of some dummy data in Thing Planning:

Sample Thing Planning Data

Spreadsheet Business Rule

Table Viewer in OneStream allows you to bring this data into a Spreadsheet and change the data.  There are 3 main functions of the Spreadsheet business rule:

  1. Set your variables  (Case Is = SpreadsheetFunctionType.GetCustomSubstVarsInUse)
  2. Get the Table View (Case Is = SpreadsheetFunctionType.GetTableView)
  3. Save/ update the Table View (Case Is = SpreadsheetFunctionType.SaveTableView)

The project team set up these Private functions under each main function:

  1. “Filters” for setting the variables
  2. “GetProjectPlanningDetails” for getting the table view
  3. “UpdateProjectPlanning” for saving the table view

Spreadsheet Functions Snip

The client needed 3 variables to fill in for the form: Category, SubCategory, and CouncilRound. These were set up in a list when setting the variables:

Setting Variables

In the Get Table View function, the project team pivoted the data to move the years from rows to columns using SQL on the Thing Planning table:

Gettableview Function
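The business rule did this pivot in SQL, shown only as a screenshot above. Assuming rows of (item, year, amount) like the dummy data earlier in the post (the item names and values here are hypothetical), the same reshaping can be pictured in a few lines of Python:

```python
from collections import defaultdict

# Illustrative per-year rows, one row per (item, year), as stored in the table.
rows = [
    ("Project_A", 2023, 100.0),
    ("Project_A", 2024, 110.0),
    ("Project_B", 2023, 50.0),
    ("Project_B", 2024, 55.0),
]

# Pivot: build one record per item, with a column (dict key) per year.
pivoted = defaultdict(dict)
for item, year, amount in rows:
    pivoted[item][year] = amount

for item in sorted(pivoted):
    print(item, [pivoted[item][y] for y in sorted(pivoted[item])])
```

Each printed line corresponds to one spreadsheet row after the pivot: the item, followed by its amounts in year order across the columns.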

The Save Table View function is where column names are mapped to a TableViewColumn:

Save Tableview Column Names

The .IsDirty check is used to determine whether a value in a column has changed and to update the table if so:

Savetableview Update Table

 

Once the spreadsheet business rule is set up, add the table to a Spreadsheet in OneStream:

Spreadsheet Tableview

Add a Table View Business rule:

Spreadsheet Add Table View Bus Rule

 

Select the Business rule that was created:

Save Tableview Business Rule Select

 

Named ranges were also used, with the same names as our variables. For example, cell A3 is named “CouncilRound”:

Namedranges For Variables

The values we wanted for those variables were entered in those named ranges.  If named ranges were not used, a pop-up box would prompt the user to enter the values for the variables.

Once Refresh Sheet is clicked, the data from the Thing Planning table appears:

Refreshedtableview

 

The table updates after making a change. In the example below, the first value was changed by $2 and Submit was clicked:

Submitted Table

 

]]>
https://blogs.perficient.com/2023/03/14/pivot-and-submit-data-through-a-spreadsheet-using-table-viewer-in-onestream/feed/ 0 327912
Getting OneStream Badge-Preparation for Beginner Level https://blogs.perficient.com/2023/03/09/getting-onestream-badge-preparation-for-beginner-level/ https://blogs.perficient.com/2023/03/09/getting-onestream-badge-preparation-for-beginner-level/#respond Thu, 09 Mar 2023 13:36:24 +0000 https://blogs.perficient.com/?p=329859

Coming from an IBM Planning Analytics (TM1) and Anaplan background, my first-hand experience with OneStream was quite distinctive and interesting.

Prior experience with an EPM tool is not a prerequisite, but it does provide a slight edge.

Anyone with a technical or finance background can learn this tool.

Compared to other EPM tools, OneStream has the unique capability of offering both Financial Consolidation and Planning in the same platform.

I started with OneStream Essentials: Getting Started with OneStream. This module provides basic information about the tool as well as its capabilities, and it was helpful for continuing the certification journey.

Below are the 3 certifications/badges for the beginner level. Each course takes roughly 20-30 hours to complete.

  1. OneStream Essentials: Implementing OneStream
  2. OneStream Architect: Design an Application
  3. OneStream Essentials: Building Basic Reports

How will this course help you?

One of the main benefits I found through taking these courses was learning to follow best practices. Those who earn this certification have demonstrated the knowledge to analyze customer requirements and initiate the design of a OneStream application.

Additionally, earning a OneStream Badge demonstrates key industry knowledge to customers and can help differentiate employees in today’s competitive world.

OneStream Essentials: Implementing OneStream

This module covers the key components of the tool and walks through building an application along with best practices. I spent around 3 hours per day absorbing the concepts. The module is broken into multiple chapters, and I was able to cover 1-2 chapters each day. Each chapter has a hands-on lab and a test that must be completed before moving on to the next chapter. Over 12 assessment tests are included in this module. The questions can be a little tricky, but completing the hands-on lab exercises helped me a lot in tackling them.

OneStream Architect: Design an application

This learning path provides students with access to metadata, workflow, and model design-related courses. Using a variety of interactive tools, students navigate through a series of lessons and activities that enable them to comprehend the key elements of a project’s design phase. This module was made available only after clearing a pre-assessment test. The chapter tests add up to 12 assessments in total.

OneStream Essentials: Building Basic Reports

This module covers integrating with different data sets and contains 16 assessment tests.

Keys to Successfully Completing the Certificate:

  • Make notes of key points and complete the practical lab before you take the test after each chapter.
  • Don’t try to complete the courses in one or two days; instead, spread them over a week.
  • During the tests, refer to your lab exercises; they will help you answer the questions.
  • Each chapter ends with 7 to 10 questions.
  • Each chapter also includes a few quick questions to check your understanding, which help prepare you for the final test.

Certified Badge:

Once the assessments for each module are completed, you will receive your badge through Credly within a week of completion; it can be shared on social media.

All the Best!

Sachin Bongale

Badge URL:

https://www.credly.com/badges/364e5096-edc4-4306-bb9c-a4280b387710/public_url

I have 15-plus years of experience in the EPM domain, working on end-to-end implementations of tools like IBM Planning Analytics, PBCS/EPBCS, and Anaplan.

I hold certifications in Scrum Master, PMP, OneStream, Hyperion EPM, Anaplan, and IBM Planning Analytics.

I am also a Certified Chartered Accountant with strong functional knowledge of financial statements, budgeting, and forecast & planning processes.

 

]]>
https://blogs.perficient.com/2023/03/09/getting-onestream-badge-preparation-for-beginner-level/feed/ 0 329859
How to Query & Extract data from OneStream metadata XML using XPath & XSLT https://blogs.perficient.com/2023/02/22/how-to-convert-onestream-metadata-xml-into-csv-without-any-vb-program/ https://blogs.perficient.com/2023/02/22/how-to-convert-onestream-metadata-xml-into-csv-without-any-vb-program/#respond Wed, 22 Feb 2023 09:35:27 +0000 https://blogs.perficient.com/?p=328477

Background

OneStream supports exporting metadata into an XML file for backup and restore purposes (via the menu Application > Tools > Load/Extract). This blog covers a technique to extract information from the metadata XML using XSLT (Extensible Stylesheet Language Transformations), which can read an XML hierarchy & extract information from it.

 

Tools Required

Microsoft Visual Studio supports creating/editing XML & XSLT files, with built-in IntelliSense (auto-complete) and a validator that checks the correctness of the XSLT file. Visual Studio also comes with an XSLT processor for handy XML transformations that developers might require. Microsoft offers the freely available Community Edition of Visual Studio, suitable for light-weight development & tasks.

 

Data at a Glance

Below is the demo Account member hierarchy which we shall be extracting from the XML (screenshot below):

Account Member Hierarchy

Below is a screenshot of the metadata XML as it appears in Visual Studio, extracted via the Load/Extract menu:

Metadata Xml Vs Screenshot

 

Understanding XPath

An XML file contains hierarchical data. Querying tree-structured data is trickier than querying tabular data, which can be done easily using SQL. XPath is the language used to query XML data.

Say we want to query the description value of Account member 1001. Below is the XPath expression for this:

/OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member[@name='1001']/@description

XML tags are represented by /tag and XML attributes by @attribute. XPath supports filtering by specifying a query condition in square brackets for that tag.
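For readers without Visual Studio at hand, the query can be sanity-checked with Python’s standard-library ElementTree, which supports a subset of XPath. The inline XML below is a minimal stand-in mirroring the structure shown above; member 1001’s description is invented for the example:

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for the exported metadata XML.
xml = """
<OneStreamXF>
  <metadataRoot>
    <dimensions>
      <dimension type="Account" name="CorpAccounts">
        <members>
          <member name="1001" description="Cash and Equivalents"/>
        </members>
      </dimension>
    </dimensions>
  </metadataRoot>
</OneStreamXF>
"""

root = ET.fromstring(xml)
# ElementTree cannot select /@description directly, so select the member
# element with the same predicates and read the attribute from it.
member = root.find(
    "./metadataRoot/dimensions/dimension[@type='Account']"
    "/members/member[@name='1001']"
)
print(member.get("description"))  # → Cash and Equivalents
```

Full XPath processors (including the one inside XSLT) can return the attribute node itself; ElementTree’s find/findall stop at elements, hence the final `.get(...)`.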

 

Drafting XSLT

XSLT is used to query & transform data from an XML file and generate output in XML or text format. XSLT is written in XML itself, with a handful of XML tags instructing how to transform the data. Visual Studio ships with an XSLT processor capable of executing on-the-fly transformations via a GUI menu. Below is the demo XSLT file, which extracts data from the above XML file & generates textual output in tab-delimited format, which can be dumped into Excel or even imported into SQL Server easily.

Xslt Xml

The XSLT can be copy-pasted from below:

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt" exclude-result-prefixes="msxsl"
>
    <xsl:output method="text" indent="yes"/>

    <xsl:template match="/">
        <xsl:text>Dimension&#x9;Name&#x9;Description&#x9;Account Type&#xA;</xsl:text>
        <xsl:for-each select="OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member">
            <xsl:value-of select="../../@name"/>
            <xsl:text>&#x9;</xsl:text>
            <xsl:value-of select="@name"/>
            <xsl:text>&#x9;</xsl:text>
            <xsl:value-of select="@description"/>
            <xsl:text>&#x9;</xsl:text>
            <xsl:value-of select="./properties/property[@name='AccountType']/@value"/>
            <xsl:text>&#xA;</xsl:text>
        </xsl:for-each>
    </xsl:template>
</xsl:stylesheet>

Let’s understand the various parts of the XSLT:

 

<xsl:output method="text" indent="yes"/>

This instructs XSLT to generate output in textual format.

 

<xsl:text>Dimension&#x9;Name&#x9;Description&#x9;Account Type&#xA;</xsl:text>

This line inserts a static column header into the output file. XSLT, being XML internally, needs escaping for tab (&#x9;) & newline (&#xA;) characters.

 

<xsl:for-each select="OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member">

The above XSLT line runs a for-each loop over all the members under the dimension of type Account.

 

<xsl:value-of select="@description"/>

This line emits the content of the description attribute of the member tag.

<xsl:value-of select="../../@name"/>

../ is the XPath expression for the relative parent. So we go 2 levels up to the dimension XML node and then extract the value of its name attribute.

<xsl:value-of select="./properties/property[@name='AccountType']/@value"/>

./ is the XPath expression for a relative child XML node.

 

Generating Output File

Steps to generate the textual output file:

  1. Open the XSLT file in Visual Studio
  2. Go to the Properties window, browse to the XML file under Input, and specify the location of the output file under Output
    Xslt Input Output File Browse
  3. Navigate to XML > Start XSLT without Debugging
    Xslt Input Output File Browse
  4. This will generate & save the output file and open it in Visual Studio
    Xslt Output
  5. The content can be copy-pasted into Excel or even imported into a database using a BULK INSERT statement
    Xslt Excel

 

Other Benefits

  • It can filter & extract data from the entire backup metadata XML, which contains multiple Dimensions like Account, Entity, etc.
  • This approach is not limited to the Account Dimension; it works for all Dimensions like Entity, Scenario, etc. by simply changing the XPath filter to [@type='Entity'] and so on.
  • The approach can be extended to pull additional columns like IsIC, etc.
  • Multiple for-each loops can be initiated in a single XSLT file to scan all Dimensions like Account, Entity, Scenario, etc. and generate consolidated output for upload into a database or data lake.
  • XSLT transformations can be automated via a C#/VB program (in .NET using XslCompiledTransform), or by invoking an XSLT processor from the command line. Check this MSDN tutorial.
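As a cross-check on the XSLT output, the same tab-delimited extract can also be produced without any XSLT processor, for instance with Python’s standard-library ElementTree. The inline XML here is a minimal, invented stand-in for the real export; the loop mirrors the for-each and value-of selections in the stylesheet above:

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for the metadata export.
xml = """
<OneStreamXF><metadataRoot><dimensions>
  <dimension type="Account" name="CorpAccounts">
    <members>
      <member name="1001" description="Cash">
        <properties><property name="AccountType" value="Asset"/></properties>
      </member>
    </members>
  </dimension>
</dimensions></metadataRoot></OneStreamXF>
"""

root = ET.fromstring(xml)
lines = ["\t".join(["Dimension", "Name", "Description", "Account Type"])]
for dim in root.iterfind("./metadataRoot/dimensions/dimension[@type='Account']"):
    for member in dim.iterfind("./members/member"):
        acct_type = member.find("./properties/property[@name='AccountType']")
        lines.append("\t".join([
            dim.get("name"),                # ../../@name in the XSLT
            member.get("name"),             # @name
            member.get("description", ""),  # @description
            acct_type.get("value") if acct_type is not None else "",
        ]))
print("\n".join(lines))
```

Swapping the [@type='Account'] predicate for [@type='Entity'] (or removing it) extends this sketch the same way the bullet points above describe for the XSLT.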

 

Conclusion

XML is used globally for data interchange and enjoys compatibility with the majority of software. OneStream leverages XML heavily to back up almost every object/artifact available in the system. The objective of this blog is not just to perform the specific task covered in the case study, but to build a basic understanding of XPath & XSLT. With a good command of XSLT, one can even apply this technique to re-create XML files with bulk modifications. Endless possibilities exist across the varied business use cases one can think of.

]]>
https://blogs.perficient.com/2023/02/22/how-to-convert-onestream-metadata-xml-into-csv-without-any-vb-program/feed/ 0 328477