Splash, OneStream’s annual Global User Conference and Partner Summit, takes place May 20-23 in Las Vegas! OneStream is expecting 2,300+ attendees to converge on Las Vegas for four days of best practices, product updates, networking and hands-on workshops with finance and industry experts from around the globe. Perficient is proud to be a Splash sponsor and we’re looking forward to meeting you in person at Caesars Forum Convention Center.
We are excited to announce we’ve been selected to present two great customer success stories at this year’s conference. Saju Philips, CPM Director at Perficient, looks forward to sharing the stage with Jennifer Blankenship, Sr. Manager of Consolidations at Hussmann, and Charles Ramirez, CPM Director at Perficient, will co-present with Kevin Hill, Director of Finance at RaceTrac.
The official schedule is now available! Learn more about Perficient sessions:
“Every year Splash brings together finance professionals from across the globe looking to modernize with OneStream,” said Kyla Faust, Alliance Manager. “OneStream customers and prospects will learn how partners like Perficient leverage the power of the platform to bring transformational change. We are truly honored to have been selected to share the amazing Hussmann and RaceTrac success stories with fellow OneStreamers.”
Meet with subject matter experts from Perficient and learn how we’ve leveraged our extensive expertise in Corporate Performance Management to drive digital transformation for our customers. Our CPM practice is part of the broader Perficient, a leading digital consultancy serving customers throughout North America with domain expertise in a wide array of technology platforms. If you have needs beyond CPM, we can help.
As a Diamond level partner, the highest level in the partner ecosystem, you can count on Perficient to help you maximize your return on investment in OneStream. In addition to having attained Diamond status, Perficient, in partnership with Keyteach, is one of only two OneStream Authorized Training partners in the United States. We deliver instructor-led training from our state-of-the-art training facilities in Houston, TX.
If you’re not able to attend the event, but would like to learn more about any of the topics listed above or more about our OneStream practice, please reach out to us.
Orphan members in OneStream are members with no parent. Because of this, they are difficult to locate using the Search Hierarchy feature, since technically they do not sit anywhere in the hierarchy. They are not captured in grid view either. Sometimes an organization may want to delete them because they are no longer required, or realign them to an appropriate location in the hierarchy. This blog focuses on simple Excel and Notepad++ based techniques to populate the list of orphan members and delete them (if required).
The technique shared in this blog requires two pieces of software:
Notepad++ is open-source software available free of cost. The Microsoft 365 version of Excel is required because this technique uses the TEXTSPLIT() function, which was rolled out for the Microsoft 365 version. Alternatively, Excel for the web can be used; it is always up to date and available via a free Microsoft account sign-up.
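For reference, TEXTSPLIT takes a text value and a delimiter and spills the pieces into adjacent cells. A quick illustrative example (the "#" delimiter and member names here are only examples, not necessarily what your extract uses):

```
=TEXTSPLIT("SKU_100#Brand_A", "#")
```

Entered in one cell, this spills "SKU_100" and "Brand_A" into two adjacent cells on the same row.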
Below is a screenshot of the UD1 hierarchy, organized by product brand.
100+ members have become orphan nodes after their relationships were removed to park them outside the hierarchy.
These members were created to incorporate the entire catalog but were never purchased or sold. The organization wants to delete them permanently to keep the data lightweight for better performance. Deleting 100+ members one by one would be a herculean task, wasting hours of effort. Let's look at some simple hacks to populate the list of orphan members. Once the list is populated, those members can be deleted or realigned as desired using the Load/Extract feature.
Following are the steps to derive and populate the orphan member list:
The orphan members identified in the above steps can be deleted (if required) with a simple Excel-based hack. Below are the steps:
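As an alternative to the Excel/Notepad++ steps, the same comparison can be scripted. The sketch below is a hypothetical illustration, assuming you have two plain-text exports: one listing all member names and one listing every name that appears as a child in a parent-child relationship. Members that never appear as a child (excluding known roots) are orphans.

```python
# Hypothetical sketch: derive orphan members by comparing two exported lists.
# "all_members" and "children" would normally be read from extract files;
# here they are small inline examples.

def find_orphans(all_members, children, roots=frozenset({"Root"})):
    """Return members that never appear as a child and are not known roots."""
    return sorted(set(all_members) - set(children) - roots)

all_members = ["Root", "Brand_A", "SKU_100", "SKU_200", "SKU_999"]
children = ["Brand_A", "SKU_100", "SKU_200"]  # SKU_999 has no parent anywhere
print(find_orphans(all_members, children))  # ['SKU_999']
```

The resulting list can then feed the same Load/Extract delete or realign step described above.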
During the implementation of OneStream application, Implementation Consultants and Administrators create different security groups based on the client’s security access requirements. However, as the project progresses from one phase to another, these security groups may become obsolete or redundant.
Security groups are assigned to different objects in the application to enable users to access certain sets of data or reports, or execute tasks depending on the data access required to perform their tasks.
This article aims to help you identify which security groups are assigned (used) and which are unused within the OneStream application.
If a security group is not assigned to a user or a parent group, it denotes that the group is not being utilized in the application.
Security groups are usually assigned to various objects within the OneStream Application, as listed below:
Security can be defined using several methods, such as Security Roles, Entity Security, Cube Security, and Workflow Security. However, to run a report, security group assignment must be applied to Cube View Profiles and Dashboard Profiles. Confirming security group assignment requires the Administrator to check all of the above, and it can be time-consuming to find where a security group is assigned. To simplify this search, the Administrator can use the following workaround or method:
Each Security Group assigned will appear in the above group assignments. If a group does not exist in any of the above thirteen application setups, then one can safely assume that the security group in question is redundant and not being used. Before deactivating a group, make sure the group is unassigned from users and then take proper action to deactivate.
Note:
Please follow these guidelines when defining security groups:
One of our client requirements was to post a multiperiod journal. In the Consolidation process, this kind of request is uncommon; however, this request was from a Planning process perspective. In the following document, I describe the steps to create a multiperiod journal with OneStream XF.
Navigate to and select Application -> Data Collection -> Journal Template and create a journal template.
The first step is creating a Journal Group to organize the journals better. Click on the two blue dots to create a Group.
Add a name and description, define the security (Access Group and Maintenance Group), and save.
Then, we need to create the journal template. Click on the “Create Journal template” icon.
Add a name and description, and fill out the journal template depending on the requirements. Choose the journal template type. “A Journal template can either be a Standard or Auto Approved. If it is auto-approved, a user can create a Journal from the template with limited editing ability, meaning permission to change name, description, and so forth.1”
For illustration purposes, we left the journal template Type as “Auto Approved.”
To add a dimension, go to the Point of View section. Select the dimension you want to add, and click on the ellipsis.
Select (Default) member and click “OK,” and repeat for each dimension.
After the selections, you can see the dimensions displayed in each journal line.
Once we have created the Journal template, we must create a Journal Profile. For that, click on the three dots icon.
Add a name and description, define the security (Access Group and Maintenance Group), and save.
Once we have created the Journal Template profile, we need to add the journal template to the Journal profile. Select the Profile and click the “Manage Profile Members” icon to do so.
In the profile builder, select the group you want to add to the Profile and click “Add item.”
Once you have added the group, click “OK.”
Now, you will see the journal template in the Profile.
Navigate to and select Application -> Workflow -> Workflow profile and select the Workflow “WF” where you want to create the journal template profile.
Once you select the WF, activate the step by selecting TRUE in “Profile Active.” In the Setting section of the WF, assign the Journal Template Profile Name to the WF on the “Journal Template Profile Name” line in the Journal Settings section.
Open an MS Excel file. In the first two rows of Column A, create two row-type parameters specifying the header and the detail: !RowType(H=Header) and !RowType(D=Detail). These tag the corresponding rows with H or D, identifying the header and detail information in the CSV file.
Then, we need to enter the Journal Column Headers. The required headers are the following (make sure you put each title in one specific cell):
JournalName, OriginatingTemplateName, JournalDescription, JournalType, JournalBalanceType, IsSingleEntity, EntityMemberFilter, ConsName, WFProfileName, WFScenarioName, WFTimeName, and CubeTimeName.
Add the detail in row two, and enter the Journal Detail Headers. The required Journal Detail Headers are:
JournalName, CubeName, EntityName, ParentName, AccountName, FlowName, ICName, all UDNames, DebitAmount, CreditAmount, and LineDescription.
The file will identify the rows depending on the first column tags. H = Header or D = Detail.
NOTE: You must add an H Row (Header) for each journal you want to post.
In this case, I want to affect two periods (2021M6 and 2021M7).
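Assembled, a hedged sketch of what such a file might look like for the two periods. Columns are abbreviated with "..." and every journal, entity, and account name below is invented for illustration; the real file must carry all the required headers listed above in the order your application expects:

```
!RowType(H=Header),JournalName,JournalDescription,WFScenarioName,WFTimeName,...
!RowType(D=Detail),JournalName,EntityName,AccountName,DebitAmount,CreditAmount,...
H,JNL_Alloc_M6,June allocation,Budget2021,2021M6,...
D,JNL_Alloc_M6,111,10000,1000,,...
D,JNL_Alloc_M6,111,30000,,1000,...
H,JNL_Alloc_M7,July allocation,Budget2021,2021M7,...
D,JNL_Alloc_M7,111,10000,1000,,...
D,JNL_Alloc_M7,111,30000,,1000,...
```

Note how each period gets its own H row followed by its D rows, which is what lets a single load create one journal per period.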
Once you have created the MS Excel file, save it as CSV (Comma delimited).
Go to OnePlace and select the WFProfile, Scenario, and Time member to load the journal. In this case: FIN_Admin, Budget2021, 2021.
Create the journal using the .csv file.
Select the journal CSV file to load (make sure you can see "All Files") and click Open.
Once you load the .csv file in the journal step, OneStream creates two journals.
Click the check box to select ALL journals, and click Post.
You will be able to post all the journals with a single post.
Note: Depending on security, you could create, upload, and post the journal, or different persons can manage this separately, e.g., one can upload, and another can post it.
Click “OK” to confirm the posting.
Create a CubeView “CV” or QuickView to validate data. In this case, I used a CV.
One of the most common, if not the most common, requests when setting up OneStream is a direct connection from an ERP such as Oracle, PeopleSoft, or NetSuite, so that data can be uploaded automatically on a regular basis. Below is a link from one of my fellow Perficient consultants that describes the steps to set up a direct connection to OneStream:
Oracle EBS Direct Connect Configuration in OneStream / Blogs / Perficient
This blog focuses on the steps that can be done before setting up the direct connection to OneStream; they should be completed before creating the connector business rule. It is also primarily focused on an ODBC/SQL connection. If you follow these steps, the direct connection can be set up in the shortest time possible with the lowest possible consultant hours/dollars spent:
Steps 1 and 2 do not require a OneStream consultant and should be done by someone who is an expert with that ERP. Once these steps are done, you are ready to set up the connection to OneStream with a connector business rule, transformation rules, and a workflow, and then load the data (see steps 4 to 8 of the linked blog).
If the ERP database supports SQL, then the query would start with a SELECT statement that picks the fields needed and possibly some JOIN statements FROM a table or multiple tables with a WHERE clause.
For example, if all of the data is in 1 table the query might look like this:
SELECT
Entity,
Account,
Period,
Department,
Project,
Amount
FROM
Gl_table
WHERE
Period = 'Jan 2023'
The results of a query like this may produce a table similar to this:
Entity Account Period Department Project Amount
111 10000 Jan 2023 200 155 1,000.00
111 30000 Jan 2023 200 NA -1,000.00
Most likely your ERP has multiple tables. In that case, tables would have to be joined. The queries I have seen had multiple inner, outer, and left joins. That is why it is so important to have someone who knows your ERP's tables create the query.
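For illustration, a hedged sketch of what a multi-table query might look like. The table and column names below are pure assumptions, not any particular ERP's schema:

```sql
-- Illustrative only: join a balances table to entity and account
-- lookup tables to produce the columns OneStream will consume.
SELECT
    e.entity_code   AS Entity,
    a.account_code  AS Account,
    b.period        AS Period,
    b.amount        AS Amount
FROM gl_balances b
INNER JOIN gl_entities e ON e.entity_id = b.entity_id
LEFT JOIN gl_accounts a ON a.account_id = b.account_id
WHERE b.period = 'Jan 2023'
```

The real query could easily involve many more joins, which is exactly why an ERP table expert should write it.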
Have someone create the necessary query from your ERP using Microsoft SQL Server or some other program that can query the ERP. This should be done either by someone in your IT department who is very familiar with the tables in your ERP or by a consultant who knows them. Someone in your IT department is not only the least expensive option but is also often the best qualified to create the query.
If the data is a small enough file, have it sent as a comma-delimited file and compare it to the trial balance or reports currently used in your ERP. The data needs to tie exactly to your reports; that way, you can be sure the data going to OneStream is correct, and if there is later a difference between OneStream and your ERP, it is not the query but something in the OneStream setup. If you skip this step, how can you tell whether any differences come from the query or the OneStream setup?
There are several ways to set up the direct connection:
Setting up the SQL data adapter takes very little time, and it can even be put in a dashboard so the data can be downloaded. Here are the steps in OneStream:
Create a blank dashboard maintenance unit:
I created a new dashboard maintenance unit, called it “Test_Data_adapter” and it automatically adds all the types of objects needed to do a dashboard:
Add a Data adapter for your connection (this example has a named connection called “Netsuite”):
Copy your SQL into the data adapter:
Run the SQL:
Create a Grid View component:
Add the SQL data adapter:
Create a new dashboard and add the Grid View component:
View dashboard:
Export the data so that it can be tied:
This setup should take less than 1/2 hour to do. Once this data has been checked, you’re ready to create the connector business rule and data source.
Last week we published the first blog post in this two-part series sharing highlights of Splash, OneStream’s Global User Conference and Partner Summit! We enjoyed four days of best practices, product updates, networking, and hands-on workshops with finance and industry experts!
We’ll pick up where we left off with comments from the Perficient team about highlights from our time at this inspirational event.
Attending the OneStream Splash conference was very exciting and a valuable and informative experience. OneStream’s unified platform streamlines financial consolidation, reporting, and planning processes, while providing valuable insights. The conference provided a roadmap for success by showcasing successful OneStream implementations and highlighted how to avoid common pitfalls. It was also an opportunity to connect with like-minded professionals and OneStream and industry experts. Finally, the conference provided a glimpse into the future of OneStream’s platform.
At this conference, there were several technical sessions for those looking to improve their OneStream application. One of the highlights of the conference was learning about how OneStream can be implemented for those looking to expand their ESG reporting capabilities. As ESG reporting becomes increasingly important for businesses, many organizations are seeking to effectively manage the process. In a presentation titled How to implement ESG and Align with Financial, attendees were able to learn how OneStream Software can help conquer ESG reporting. Here are the three key takeaways from this informative presentation:
ESG reporting can be a complex and time-consuming process. The platform is designed to help organizations manage ESG data and streamline the reporting process. OneStream Software allows businesses to collect data from various sources and auto-populate the necessary reports. This can save businesses time and resources, while ensuring accuracy in reporting.
Transparency is a key aspect of ESG reporting, as stakeholders expect businesses to provide clear and accurate information about their environmental, social, and governance practices. OneStream Software provides transparency by allowing organizations to track and report on ESG metrics in real-time.
It also provides businesses with the tools and insights they need to make informed decisions about their operations. The platform allows organizations to track and analyze data over time, identifying trends and patterns that can inform strategic decision-making.
Overall, attending Splash was a valuable experience that provided knowledge, insights, and networking opportunities. Attendees also connected with like-minded professionals and OneStream experts, expanding their networks, and learning from industry experts.
It was a great week at the OneStream Splash 2023 Conference at National Harbor, just outside Washington, DC! After a three-year hiatus, having the Splash Conference back in full swing was very exciting. I know my Perficient colleagues and I enjoyed meeting each other in person for the first time and conversing with potential and current customers. The annual user conference brings together OneStream users, partners, and experts from around the world for three full days of learning, networking, and fun. A few of the most valuable aspects of OneStream Splash are the opportunities to connect with other OneStream users and experts, share our experiences, learn from each other, and build lasting relationships with individuals in the CPM realm.
My favorite breakout session that I attended was Analytic Blend – Simplifying your OneStream Application while making the granular Details Available When Needed. OneStream Analytic Blend is a powerful tool that allows an organization to combine data from multiple sources, including OneStream cube data and external data sources such as ERP systems, data warehouses, and other systems. The breakout session covered the importance of utilizing Analytic Blend/Services to hone in on granular-level detail, such as product/SKU data. It also dove into some best practices for building analytical blend services.
I want to highlight a few of the key benefits of using the analytical services that OneStream provides:
First, it enables organizations to perform complex analytics on blended data, such as multidimensional analysis, ad-hoc reporting, and visual data exploration, all while maintaining a single source of truth system. This enables organizations to uncover insights and identify opportunities for improvement.
Second, unlike some ERP and CPM solutions, OneStream supports advanced data modeling capabilities, including creating alternate hierarchies, mappings, and calculations. This makes it easier to manage complex data structures and perform calculations on blended data.
With the ability to gain insight into this granular level of detail and report on KPIs that would otherwise be nearly impossible to report on in a timely manner, the decision-makers in an organization are able to view data in real-time and make more informed decisions, which can lead to improved performance and increased profitability.
Lastly, as AI and machine learning expand across companies, the analytical blend services that OneStream provides serve as a foundation for further enhancements.
Overall, OneStream Splash is a must-attend event for anyone who uses OneStream or is interested in learning more about CPM. The conference offers a unique combination of education, networking, and fun, all in a beautiful location. If you want to take your skills and knowledge to the next level, consider attending OneStream Splash 2024 in Las Vegas. We hope to see you there!
So much time is spent planning for Splash, OneStream’s Global User Conference and Partner Summit and then it feels like it’s gone in the blink of an eye! Four days of best practices, product updates, networking, and hands-on workshops with finance and industry experts – what a fantastic event!
The Perficient team was in full force at Splash. We had two speaking sessions, one virtual and one in-person. This year we introduced OneStreamers to our partnership with RioBotz and their BattleBots competitor, Minotaur. Show attendees were able to engage in battle with Minotaur and its rival, and seven lucky winners of Hexbug Rivals Kits were able to take the battle home! As we look back on our time at Splash, the team wanted to share their thoughts with our readers. If you didn’t get a chance to attend Splash, or even if you did, I’m sure you’ll enjoy these highlights from the team.
I really loved the venue for Splash 2023! The Gaylord Hotel and Convention Center at National Harbor was positioned in a lovely location with views of our nation’s capital. Food, entertainment, and additional accommodations were within walking distance.
Sharing the HNI success story at Splash with two finance professionals from HNI, Terra Simpson and Jennifer Curry was a highlight. More than 100 attendees packed the room to learn how HNI used the OneStream RCM Account Reconciliations solution, along with an enhancement developed by Perficient, to identify Cash Flow adjustments using Reconciling Item Types and automate the posting of these adjustments to Actuals.
My favorite takeaways from the week were from Designing for Performance and System Diagnostics. In Designing for Performance, the team laid out why the initial dimension design is so crucial. Potential Data Unit size is determined by your dimensions and their members. They made a strong case for the use of Extensibility to optimize performance. At the other end of the spectrum, the System Diagnostics highlighted ways to identify opportunities to improve application performance. The new System Diagnostics MarketPlace Solution deployed in early 2023 highlights the Top Application Metrics to focus on for improved performance. Guess which was number 1? Data Unit Size! I could talk about this for days… Actually, I am talking about this for 2 days in May and would love to see you in the upcoming OneStream Level 2 Financial Model Rules course on May 23 and 24.
As a first-time conference attendee, I found it extremely informative and an overall great experience. I would recommend trying to attend at least once, both from a developer or customer standpoint. There are a variety of sessions and topics that will satisfy both beginners to advanced users. The opening keynote was a highlight for me and a great way to kick off the week as I was able to reminisce about the growth of OneStream and be inspired by future plans.
A few memorable takeaways come to mind as I think about Splash:
First off, the venue was great: all in one place and very convenient. The evening event Perficient held at Harbor Social was a big success and a lot of fun. What I liked most about Splash was the combination of seeing old colleagues, meeting new customers, and bonding with the Perficient team. Working in a remote environment, you lose that face-to-face contact, and this helped reinforce relationships.
Regarding Splash content, I really liked the idea of incorporating AI and machine learning into the OneStream ecosystem. I see it not only from an end-product perspective, i.e., more dynamic and relevant forecasting, but also from the perspective of how it can be used to make our jobs more efficient. I also learned a lot about ESG reporting. I can see how this will generate a lot of opportunities for us and will continue to advance over time as this reporting becomes more defined and incorporated into corporate reporting. They noted how companies will begin to report and forecast ESG along with their quarterly financials, eventually blending the two together so they can glean opportunities to improve their ESG ratings over time.
Lastly, I found it very informative regarding the future of the OneStream ecosystem. Moving to an open marketplace where partners can share and provide application enhancements presents an opportunity for Perficient to differentiate ourselves. All in all, I had a great time at Splash, learned a lot, and look forward to going again.
OneStream has a Marketplace Solution called Table Data Manager that allows users to create custom tables where data can be loaded and updated. See also this Perficient blog post on how to get started with Table Data Manager:
OneStream – Table Data Manager / Blogs / Perficient
Table Data Manager has a function/ button to import data into custom tables. However, the import function in Table Data Manager only accepts files in XML format. In this blog post I will show you how to load data to a custom table using a flat file such as a comma delimited (.csv) file instead of having to use XML format.
Using the Table Data Manager instructions from the blog linked above I created this XFC_Test table:
I created this flat file and saved as a comma delimited file:
I then loaded the file to File Explorer under File Share/Applications/NHLBI_Blog_copy/Batch/Harvest:
Note: NHLBI_Blog_copy was the name of my application. Everything else would stay the same. The file was loaded here since we won’t have to be concerned about OneStream having access to this folder.
I created this business rule to upload the comma delimited file:
Then I made it an Extensibility rule and made it an unknown type so it could be run from the Business rules page:
I developed this code using the Custom Table Load (Delim) Snippet in OneStream:
To get the file path (line 34) I used the GetFileShareFolder snippet:
From the business rule click on Run:
The business rule uploads the data and you will get a pop-up message that the business rule ran successfully:
Go back to Table Data Manager and View the data:
The column names in the business rule must match the column names in the custom table exactly. Lines 39, 40 and 41 had the exact same column names in brackets [ ] as the column names in the custom table:
The order of the columns in the business rule must match the order of the columns in the flat file. For example, the first column referred to in the business rule was Department and the 1st column in the flat file was also Department:
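This column-order requirement can be sanity-checked outside OneStream before running the business rule. The small Python sketch below is a hypothetical helper; the file name and expected column list are assumptions for illustration:

```python
import csv

# Hypothetical expected column order, matching the custom table / business rule.
EXPECTED = ["Department", "Project", "Amount"]

def check_column_order(path, expected=EXPECTED):
    """Return True if the flat file's header row matches the expected order exactly."""
    with open(path, newline="") as f:
        header = next(csv.reader(f))
    return [h.strip() for h in header] == expected

# Example: write a tiny flat file and verify its header.
with open("harvest.csv", "w", newline="") as f:
    f.write("Department,Project,Amount\n200,155,1000.00\n")
print(check_column_order("harvest.csv"))  # True
```

Catching a transposed or misspelled column here is much cheaper than debugging a failed or silently misaligned load afterward.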
The business rule will replace all the data since the load method is set to replace:
Perficient was engaged with a client that wanted to be able to drill down from a summary report into the details in their Thing Planning database. This was done in OneStream using linked cube views and bound parameters. The project team created two separate drill downs:
The project team set this up using:
Both drill downs were done using bound parameters in the cube view. Bound parameters are set up in the cube view under General Settings > Navigation Links:
Bound parameter names are entered under each dimension that needs to be drilled down on. Any combination of numbers, letters, and underscores (_) can be entered as a bound parameter name (no special characters or spaces). For example, the project team came up with "ParentClick_UD1" for the UD1 dimension:
Select the cube view to link to in the Linked Cube Views section:
The bound parameter is used as a parameter in the linked cube view by enclosing its name in pipe characters and exclamation points (|!ParentClick_UD1!|). The project team put the parameter on the Point of View (POV) in the linked cube view:
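To illustrate the substitution, here is a hypothetical sketch (not the project's actual cube view) of a row member filter in the linked cube view that expands the children of whichever member the user clicked:

```
UD1#|!ParentClick_UD1!|.Children
```

When the user drills down on, say, a UD1 parent named TotalProducts in the summary cube view, OneStream passes TotalProducts into ParentClick_UD1, and the linked cube view renders that member's children.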
Perficient was recently engaged with a client that was implementing a Thing Planning solution to manage thousands of line items for their planning process. The client wanted to be able to look at a small set of data in a form and submit changes to just those rows without having to go through the entire workflow process. Their items also had seven years of data, with a row for each year, and the data needed to be pivoted so that the years appeared in columns.
Here’s a sample of some dummy data in Thing Planning:
Table Viewer in OneStream allows you to bring this data into a spreadsheet and change the data. There are three main functions of the Spreadsheet business rule:
The values we wanted for those variables were entered in the named ranges. If named ranges were not used for the variables, a pop-up box would prompt the user to enter the values.
Coming from an IBM Planning Analytics (TM1) and Anaplan background, my first-hand experience with OneStream was quite distinctive and interesting.
Experience with another EPM tool is not a prerequisite, but it does provide a slight edge.
Anyone with a technical or finance background can learn this tool.
Compared to other EPM tools, OneStream has the unique capability of offering both financial consolidation and planning on the same platform.
I started with OneStream Essentials: Getting Started with OneStream. This module provides basic information about the tool as well as its capabilities, and it was helpful for continuing the certification journey.
Below are three certifications/badges for the beginner level. Together, these courses need only 20-30 hours to complete.
One of the main benefits I found in taking this course was learning to follow best practices. Those who earn this certification have demonstrated the knowledge to analyze customer requirements and initiate the design of a OneStream application.
Additionally, earning a OneStream badge demonstrates key industry knowledge to customers and can help differentiate employees in today's competitive world.
This module covers the key components of the tool and building the application along with best practices. I spent around 3 hours per day absorbing the concepts. The module is broken out into multiple chapters, and I was able to cover 1-2 chapters each day. Each chapter has a hands-on lab and a test that must be completed before moving on to the next chapter. Over 12 assessment tests are included in these modules. The questions can be a little tricky, but completing the labs and hands-on exercises helped me a lot in tackling them.
The Learning Path provides students with access to metadata, workflow, and model design-related courses. Using a variety of interactive tools, students navigate through a series of lessons and activities that enable them to comprehend the key elements of a project's design phase. This module was made available only after clearing the pre-assessment test, and its chapters are followed by 12 assessment tests.
This module covers integrating with different data sets and contains 16 assessment tests.
Once the assessments are completed for each module, you will receive your badge through Credly within a week of completion; it can be shared on social media.
All the Best!
Sachin Bongale
Badge URL:
https://www.credly.com/badges/364e5096-edc4-4306-bb9c-a4280b387710/public_url
I have 15-plus years of experience in the EPM domain, working on end-to-end implementations of tools like IBM Planning Analytics, PBCS/EPBCS, and Anaplan.
I am a certified Scrum Master and hold PMP, OneStream, Hyperion EPM, Anaplan, and IBM Planning Analytics certifications.
Certified Chartered Accountant with strong functional knowledge of financial statements, budgeting, forecast & planning processes.
OneStream supports exporting metadata to an XML file for backup and restore purposes (via the Application > Tools > Load/Extract menu). This blog covers a technique to extract information from the metadata XML using XSLT (eXtensible Stylesheet Language Transformations), which can read an XML hierarchy and extract information from it.
Microsoft Visual Studio supports creating and editing XML and XSLT files, with built-in IntelliSense (auto-complete) and a validator that checks the correctness of the XSLT file. Visual Studio also comes with an XSLT processor for handy XML transformations. Microsoft offers the Community Edition of Visual Studio, available for free and suitable for lightweight development tasks.
Below is the demo Account member hierarchy that we shall extract from the XML (screenshot below).
Below is the screenshot of Metadata XML as appearing in Visual Studio, extracted via Load/Extract menu
An XML file contains hierarchical data. Querying tree-structured data is trickier than querying tabular data, which can be done easily with SQL. XPath is used to query XML data.
Say we want to query the description of Account member 1001. Below is the XPath expression for this:
/OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member[@name='1001']/@description
XML tags are represented by /tag and XML attributes by @attribute. XPath supports filtering data by specifying a query condition in square brackets for that tag.
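The same XPath idea can be tried out quickly in Python's standard library, which supports a subset of XPath. The sketch below uses a small stand-in XML document whose structure is assumed from the metadata extract described above; member names and descriptions are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for the exported metadata XML (structure assumed).
xml_doc = """
<OneStreamXF>
  <metadataRoot>
    <dimensions>
      <dimension type="Account" name="CorpAccounts">
        <members>
          <member name="1001" description="Net Sales"/>
          <member name="1002" description="Cost of Sales"/>
        </members>
      </dimension>
    </dimensions>
  </metadataRoot>
</OneStreamXF>
"""

root = ET.fromstring(xml_doc)
# ElementTree supports attribute predicates; attribute values are read with .get()
# rather than a trailing /@description step.
member = root.find(".//dimension[@type='Account']/members/member[@name='1001']")
print(member.get("description"))  # Net Sales
```

Note that ElementTree cannot select an attribute directly with /@description the way full XPath can; you navigate to the element and then read the attribute.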
XSLT is used for querying and transforming data from an XML file and generating output in XML or text format. XSLT is itself written in XML, with a few XML tags instructing how to transform the data. Visual Studio ships with an XSLT processor capable of executing on-the-fly transformations via a GUI menu. Below is a demo XSLT file that extracts data from the above XML file and generates tab-delimited text output, which can be pasted into Excel or even imported into SQL Server easily.
The XSLT can be copied from below:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    exclude-result-prefixes="msxsl">
  <xsl:output method="text" indent="yes"/>
  <xsl:template match="/">
    <xsl:text>Dimension&#9;Name&#9;Description&#9;Account Type&#10;</xsl:text>
    <xsl:for-each select="OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member">
      <xsl:value-of select="../../@name"/>
      <xsl:text>&#9;</xsl:text>
      <xsl:value-of select="@name"/>
      <xsl:text>&#9;</xsl:text>
      <xsl:value-of select="@description"/>
      <xsl:text>&#9;</xsl:text>
      <xsl:value-of select="./properties/property[@name='AccountType']/@value"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
Let’s understand various parts of XSLT:
<xsl:output method="text" indent="yes"/>
This instructs XSLT to generate output in text format.
<xsl:text>Dimension&#9;Name&#9;Description&#9;Account Type&#10;</xsl:text>
This line inserts a static column header into the output file. Because XSLT is itself XML, tab (&#9;) and newline (&#10;) characters must be escaped.
<xsl:for-each select="OneStreamXF/metadataRoot/dimensions/dimension[@type='Account']/members/member">
The XSLT line above runs a for-each loop over all the members under dimensions of type Account.
<xsl:value-of select="@description"/>
This line emits the content of the description attribute of the member tag from the XML.
<xsl:value-of select="../../@name"/>
../ is the XPath expression to fetch a value from the relative parent. Here we go two levels up, to the dimension XML node, and extract the value of its name attribute.
<xsl:value-of select="./properties/property[@name='AccountType']/@value"/>
./ is the XPath expression to extract values from a relative child XML node.
Steps to generate textual output file
XML is used globally for data interchange and is compatible with the majority of software. OneStream leverages XML heavily to back up almost every object or artifact available in the system. The objective of this blog is not just to perform the specific task covered in the case study, but to build a basic understanding of XPath and XSLT. With a good command of XSLT, one can apply this technique even to re-create XML files with bulk modifications. Endless possibilities exist across the varied business use cases one can think of.