Implementation of XML Extract in SSIS
https://blogs.perficient.com/2022/07/27/implementation-of-xml-extract-in-ssis/ (27 Jul 2022)

SQL Server Integration Services (SSIS):

SSIS is a component of the Microsoft SQL Server database software that can be used to execute a wide range of data migration tasks. SSIS is a fast and flexible data warehousing tool used for data extraction, transformation (cleaning, aggregating, merging, etc.), and loading.

It makes it easy to move data from one database to another. SSIS can extract data from a wide variety of sources such as SQL Server databases, Excel files, and Oracle. In this blog, we are going to see how to implement XML Extract in SSIS.

Intro

Control Flow:

A control flow defines a workflow of tasks to be executed, often in a particular order. The control flow makes it possible for the package to extract, transform, and load data.

Data Flow:

The Data Flow task encapsulates the data flow engine that moves data between sources and destinations, and lets the user transform, clean, and modify data.

XML Extract:

The XML Extract component is an SSIS transformation component that receives XML documents from an upstream component, extracts data from them, and produces column data for the SSIS pipeline.

 

Let us illustrate this with an example.

We will use the XML Extract component of the KingswaySoft SSIS toolkit. In this example, the source will be a procedure from which the data will be extracted in XML format, and the destination will be a table.
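For illustration only, such a procedure might use SQL Server's FOR XML clause to emit its result set as XML; the procedure, table, and column names below are hypothetical, not taken from this example:

-- Hypothetical procedure: emits one <Order> element per row as XML text.
CREATE PROCEDURE dbo.GetOrdersAsXml
AS
BEGIN
    SELECT OrderId   AS [@OrderId],
           OrderDate AS [@OrderDate]
    FROM dbo.Orders
    FOR XML PATH('Order'), ROOT('Orders');
END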

1

Step 1 – First, drag and drop a Data Flow Task from the SSIS Toolbox into the Control Flow window.

2

Step 2 – Next, double-click the Data Flow Task from Step 1 and, as shown in the screenshot below, drag in the OLE DB Source component. It will call the procedure that generates the output in XML format.

 

3

Source

4

 

Step 3 – The Design page allows you to build the layout of the document you are trying to extract. Now the source is connected to the XML Extract component, which produces column data for the SSIS pipeline.

DESIGN

6

COLUMNS

5

In the above image, the Columns page shows the available columns from the Design page. The Columns page grid consists of:

  • Column Name: Column that will be retrieved from the document.
  • Data Type: The data type of this field.

Hide Unselected Fields

  • When the Hide Unselected Fields checkbox is checked, unselected output columns will be hidden.

Hide Selected Fields

  • When the Hide Selected Fields checkbox is checked, selected output columns will be hidden.

There are a couple of special columns to take note of:

  • _RowIndex: This column contains the current count of this output element.
  • _ParentKeyField: This column contains the value of this record's parent key field.

ERROR HANDLING

There are three options for error handling:

  1. Fail on Error
  2. Redirect rows to error output
  3. Ignore error

For this example, we selected the Fail on Error option. If the package fails, it will show the error message.

7

 

Step 4 – There are three outputs from XML Extract: Order, Order Line, and OverchargeTotal. We need specific columns from Order and OverchargeTotal merged and shown in one destination, so we used the Merge Join and Sort transformations.

8

9

 

Step 5 – Drag and drop the OLE DB Destination component and specify the target table in which the data needs to be loaded.

10

11

 

Now the XML message is successfully loaded, in column format, into the target table.

13

Here we learned how to implement the XML Extract component in SSIS.

Refer to the official KingswaySoft documentation here if you want to learn more.

For more such blogs, click here.

Happy Reading!!

Episerver – Handling an SVG image type and displaying it using a Path tag
https://blogs.perficient.com/2020/08/28/episerver-handling-an-svg-image-type-and-displaying-it-using-a-path-tag/ (28 Aug 2020)

Recently, I came across a requirement from a client that needed all SVG media types to render using the internal XML <Path> in the DOM, rather than using the standard <Img> tag.

This was a tricky one, and there wasn’t very much information about how to accomplish it, as it’s not necessarily a common request. That being said, it does have its advantages, such as recoloring the vector using CSS.

In this post, we’re going to go through the following:

  • Creating a new Media Descriptor that will handle the SVG file format
  • Looking at how SVGs are stored
  • AzureBlob (DXP) vs FileBlob (Local)
  • Extracting the XML and returning it

You can view this article over on my blog website.

Although this works, it doesn’t mean this is the best way of doing it.

 

Please let me know your thoughts! I would love some feedback!

Update Member Relationship in OneStream XF
https://blogs.perficient.com/2020/07/08/update-member-relationship-in-onestream-xf/ (08 Jul 2020)

In my previous blog post, I demonstrated how members meeting the appropriate criteria could be deleted from a OneStream XF application with an XML file.  In this blog, I will demonstrate how a “parent / child” member relationship can be deleted and, more importantly, changed using an XML file, which is a necessary part of the process when creating an automated dimension build.

The approach presented will be to create one XML file to delete the member relationship and then a second XML file to add the updated relationship for the member(s).  This method will take an input file and generate the necessary XML syntax for both files.  All you need is access to Microsoft SQL Server with the appropriate permissions.  This solution is applicable to other dimensions (Entity, UD1-UD8) and can be re-used whenever the need arises if properly set up.  The complete approach assumes that:

  • The parent and child members are members of the application.
  • You have already identified which members require an updated relationship e.g. new parent.
  • You have a list of these members in a text file or format that can be imported or queried. An example of a source that could be queried would be a Data Warehouse table which stores a parent / child structure that is used to derive a hierarchy.

If you don’t have a list or applicable source, you can use a Cube View, Quick View, or another method to generate the list.  The next step will be to import the list of accounts which require an updated parent into a Microsoft SQL Server database using the standard flat file import.

Blog02 01

Blog02 02

Blog02 03
Blog02 04
Blog02 05

For reference, a sample of the imported data is displayed.

Blog02 06

With the file successfully imported, the next step will be to create a SQL statement that will output an XML file in the correct format to delete the existing relationship.  The format is shown below.

Notice that each <relationship> element has three attributes: parent, which will be the parent member name; child, which will be the child member name; and action, which is specified as "Delete".  The action instructs the import process to delete the relationship defined by the parent and child attributes.  If the child has more than one parent, only the relationship defined will be deleted (removed), with the other relationship(s) retained.  In the circumstance of a single parent, the child member will be listed in Orphans of the respective dimension.
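Based on that description, the body of the delete file would contain entries along these lines (member names are illustrative, and the surrounding <dimension> and other wrapper elements described later are omitted):

<relationships>
  <relationship parent="Assets" child="A100" action="Delete" />
</relationships>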

Blog02 07

The next query creates the XML file.  It has three parts:  a SELECT clause, a FROM clause, and a FOR XML PATH clause.

The SELECT clause "SELECT [Parent] AS [@parent], [MemberName] AS [@child], 'Delete' AS [@action]" renders the results as attributes of the <relationship> element.  The at sign "@" indicates this is an attribute of an element rather than a child element.

The FROM clause is simply the same SQL statement from earlier that returns our list of accounts.

The FOR XML PATH clause specifies the format of the output. It creates a root element called <relationships>, and each row in our list becomes a <relationship> element.
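A sketch of that query, assuming the imported list landed in a table named dbo.MemberUpdates (the table name is an assumption):

-- Emits one <relationship> element per row under a <relationships> root,
-- matching the delete format described above.
SELECT [Parent]     AS [@parent],
       [MemberName] AS [@child],
       'Delete'     AS [@action]
FROM dbo.MemberUpdates
FOR XML PATH('relationship'), ROOT('relationships');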

Blog02 08

Blog02 09

The remaining required XML elements are displayed in the next image on lines 2-5 and 9-12.

Note: the <dimension> element has additional attributes which are not displayed. 

Blog02 10

To make things easy, we just overwrite line 7 in the first image with our XML results.  The second image displays the file after the copy and paste operation.  Save this new file as Step01Orphan with the extension .xml.

Blog02 11
Blog02 12

Before we import our XML file, the next image shows that our OneStream application has an Account dimension currently with no Orphans, and that the members starting with A100 are children of Assets; these are the members whose relationship will be changed.  After the first step, these members will be orphaned.

Blog02 13

Now, log into the OneStream application.  Select and expand Application > Tools > Load/Extract.

Navigate to and select the XML file previously created.  Once this is done, select the Load icon.  The load will complete without error which is confirmed by the second image after this paragraph.

Blog02 14

Blog02 15

To confirm the upload did delete the relationship, close all Pages and then refresh the OneStream application.  Once the application refresh completes, navigate to the Account dimension and expand Orphans.  The account members displayed in the XML file will be listed as Orphans after a successful import.

Blog02 16

Next, the updated relationship for the members will be imported.  This query has the same syntax as the query used to delete the relationship, with the following changes: the action attribute is removed, as the delete of the relationship does not need to occur; an aggregationWeight attribute with a value of 1.0 is added, which is the default when keying a relationship; and the hard-coded parent Assets is changed to CurAssets, which is the updated relationship.
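Under the same assumption about the table name, the add-relationship query would look roughly like this:

-- Same hypothetical table; action is omitted, aggregationWeight is added,
-- and the new parent CurAssets is hard-coded.
SELECT 'CurAssets'   AS [@parent],
       [MemberName]  AS [@child],
       '1.0'         AS [@aggregationWeight]
FROM dbo.MemberUpdates
FOR XML PATH('relationship'), ROOT('relationships');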

Blog02 17

Blog02 18

To continue the make-things-easy approach, save the file Step01Orphan.xml as Step02AddParentToOrphans.xml.  Once this is done, lines 6-17 can be replaced with the XML created to add the updated member relationship.  The second image displays the file after the copy and paste operation.  Save the updated file as Step02AddParentToOrphans.xml.

Blog02 19

Blog02 20

Return to the OneStream application, select and expand Application > Tools > Load/Extract.

Navigate to and select the Step02… XML file previously created.  Once this is done, select the Load icon.  The load will complete without error which is confirmed by the second image after this paragraph.

Blog02 21

Blog02 22

To confirm the upload did update the relationship, close all Pages and then refresh the OneStream application.  Once the application refresh completes, navigate to the Account dimension and expand CurAssets.  The account members displayed in the XML file will be listed as children of this member after a successful import.

Blog02 23

I hope this solution is helpful if you are interested in or have begun the process of automating a OneStream dimension build.  If you have any questions, feel free to either post a comment to the blog or email me at terry.ledet@perficient.com.

 

Mass Delete Members in OneStream XF
https://blogs.perficient.com/2020/06/10/mass-delete-members-in-onestream-xf/ (10 Jun 2020)

During one of my previous projects, we needed to delete approximately 200 accounts that were distributed throughout a large Account dimension. This blog discusses how we solved it.

One approach would have been to delete each account one at a time. Obviously, that wasn't appealing, as it is time-consuming and prone to error.  Another approach is discussed in OneStream knowledge base article KB0010541, which provides a method to create an XML file for one or more members. However, the KB method requires typing each member name one at a time.

Our approach was to create an XML file to mass delete the members.  This method will take an input file and generate the necessary XML syntax.  All you need is access to Microsoft SQL Server with the appropriate permissions.  This solution is applicable to other dimensions (UD1-UD8) and can be re-used whenever the need arises if properly set up.  The approach assumes that:

  • All the members are base members
  • You have already identified which members need to be deleted
  • You have a list of these members in a text file or format that can be imported
  • The members identified are not members in a Journal template

If you don’t have a list, you can use a Cube View, Quick View, or another method to generate the list.  The next step will be to import the list of accounts into a Microsoft SQL Server database using the standard flat file import.

Blog01 01

Blog01 02

Blog01 03

Blog01 04
Blog01 05

For reference, a sample of the imported data is displayed.

Blog01 06

With the file successfully imported, the next step will be to create a SQL statement that will output an XML file in the correct format.  The format is shown below.

Notice that each <member> element has two attributes: name, which will be the member name to delete, and action, which is specified as "Delete".  The action instructs the import process to delete the member.
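In other words, the body of the delete file consists of entries like the following (the member name is illustrative, and the wrapper elements described later are omitted):

<members>
  <member name="A1000" action="Delete" />
</members>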

Blog01 07

The next query creates the XML file.  It has three parts:  a SELECT clause, a FROM clause, and a FOR XML PATH clause.

The SELECT clause "SELECT [MemberName] AS [@name], 'Delete' AS [@action]" renders the results as attributes of the <member> element.  The at sign "@" indicates this is an attribute of an element rather than a child element.

The FROM clause is simply the same SQL statement from earlier that returns our list of accounts.

The FOR XML PATH clause specifies the format of the output. It creates a root element called <members>, and each row in our list becomes a <member> element.
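A sketch of that query, assuming the imported list landed in a table named dbo.MembersToDelete (the table name is an assumption):

-- Emits one <member> element per row under a <members> root.
SELECT [MemberName] AS [@name],
       'Delete'     AS [@action]
FROM dbo.MembersToDelete
FOR XML PATH('member'), ROOT('members');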

Blog01 08

Blog01 09

The remaining required XML elements are displayed in the next image on lines 2-5 and 10-13.

Note: the <dimension> element has additional attributes which are not displayed. 

Blog01 10

To make things easy, we just overwrite lines 6-9 in the first image with our XML results.  The second image displays the file after the copy and paste operation.  Save this new file with the extension .xml.

Blog01 11

Blog01 12

Before we import our file,  we check that our OneStream application has an Account dimension with 10,367 base members.  We are going to delete 190 of these with our file.

Blog01 13

Now, log into the OneStream application.  Select and expand Application > Tools > Load/Extract.

Navigate to and select the XML file previously created.  Once this is done, select the Load icon.  The load will complete without error which is confirmed by the second image after this paragraph.

Blog01 14

Blog01 15

To confirm the upload did delete members, refresh the OneStream application, and then navigate to the Account dimension and Grid View.  The number of base members has updated from 10,367 to 10,177, which confirms 190 members were deleted.

Blog01 16

I hope this solution is helpful the next time you have to delete numerous members.  If you have any questions, feel free to either post a comment to the blog or email me at terry.ledet@perficient.com.

Maximo Hover Dialogs – How to Modify Fields
https://blogs.perficient.com/2018/03/16/maximo-hover-dialog-add-remove-fields/ (16 Mar 2018)

As clients look to extend and refine their use of Maximo, one of the more useful newer tools is the Maximo hover dialog. Hover dialogs are not difficult to modify and can give you a quick way to view the items that are important to your business.

To get started, we will first take a look at the fields that utilize the hover dialog. The easiest way to know which fields have the option is to look for the blue “more information i” next to the fields:

The following fields have hover dialogs out of the box:

  • Asset
  • Item
  • Person
  • Work Order

Now, what if you would like to customize the fields displayed in the pop-up dialog? Let’s assume we do not use the Primary User/Custodian field in our application and do not want it to be part of the dialog. It’s not as hard as you might think.

Follow these steps and you can change the displayed fields to your desire.

Step 1: Export the XML

1. Go into application designer for the appropriate application.
2. Select Action/Export System XML – Export RECHOVERS.XML
3. In that file, find the ‘asset_recordhover’ entry

Step 2: Modify the XML

4. Remove the line with the ‘Primary User/Custodian’ attribute as below:

5. Save the file.

Step 3: Import the XML

6. Go back to Application Designer and import the modified XML file

Step 4: Verify the change in the application

7. Verify the desired field is changed.

And that’s all there is to it. Pretty simple, and a very useful tool.

IBM has put together a couple of well-done articles on this topic. You can find them at the following links:
Adding/Deleting attributes in the Hover Dialogs in Maximo 7.6
Adding Hover Dialogs to additional fields in Maximo 7.6

Are you using Maximo Work Centers? If not, take a look at this and let us know if we can help get you there!

An Easy Way to Write HL7(TDS) Messages in XML Format in IIB
https://blogs.perficient.com/2017/04/15/an-easy-way-to-write-hl7tds-messages-in-xml-format-in-iib/ (15 Apr 2017)

The Perficient project lead handling an Enterprise Integration project at one of the largest healthcare providers in the country called me one afternoon and asked me to find a quick solution to write HL7 messages in XML format, as some of the vendors being integrated needed the messages in XML. I knew that the solution had to be simple, and I assured him of that.

Representing the HL7 message in XML format is not normative below v3; such messages are represented in physical format as TDS (Tagged/Delimited String) messages. The TDS format implies that the parser has the message schema (message set) available, which it uses to parse and validate the messages. The XML format is self-defining, while constraints on the message can be defined through XSDs.

I began to think about how to accomplish the task without accessing every HL7 field and copying every single field to the output XML. In IIB, data is represented in trees, and parsed TDS messages are no different. After the incoming HL7 message is parsed, it sits under the data part of the Root (the Body). The names of the fields appear as defined in the message schema.

In view of this information, I realized that it suffices to assign the parsed content in the InputBody to the OutputRoot while using the XMLNS or XMLNSC parser. After this is done, when writing the output message, IIB writes the data in XML format. If the DFDL message set and parser are used, then the following has to be done:

SET OutputRoot.XMLNSC = InputRoot.DFDL;

Since XML is self-defining, it does not need a message set to specify the physical format of the data. But if validation of the values is required, it has to be ensured that the Properties part of the Root contains a reference to the message set and, of course, validation has to be turned on at the appropriate exit points. Both DFDL and MRM message schemas in IIB are model-driven, and they will validate the values before writing the output XML. For example, if the name of the message set is HL7v251DFDLSharedLib, then:

SET OutputRoot.Properties.MessageSet = '{HL7v251DFDLSharedLib}';
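Putting the two statements together, a minimal Compute node sketch might read as follows; the message set name comes from the example above, the module name is arbitrary, and the rest is standard ESQL boilerplate:

CREATE COMPUTE MODULE HL7ToXml_Compute
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Reference the message set so values can be validated on output
        SET OutputRoot.Properties.MessageSet = '{HL7v251DFDLSharedLib}';
        -- Assign the parsed DFDL tree to the XMLNSC domain;
        -- IIB then serializes the output message as XML
        SET OutputRoot.XMLNSC = InputRoot.DFDL;
        RETURN TRUE;
    END;
END MODULE;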

For physical formats of the input and output, please refer below:

Input message:

Output XML:

 

The field names in the output XML are defined in the message schema and can be changed by modifying the schema.

 

Note: 1) IBM provides an HL7 healthcare pack with message sets in both MRM and DFDL formats, along with various HL7 nodes. DFDL formats are defined for v2.5 and above. To avail of this, the IIB Healthcare Pack has to be installed on top of the IIB installation.

2) HL7 v3 now provides a specification for HL7 messages in XML.

Creating XML Files Using Hierarchical Stage in IBM Datastage
https://blogs.perficient.com/2017/01/16/creating-xml-files-using-hierarchical-stage-in-ibm-datastage/ (16 Jan 2017)

XML files, being one of the most popular formats for data transport, are often the format clients seek for moving data around. Hence, it becomes essential to know how to create/parse/transform XML files in an ETL tool like IBM DataStage. In this blog, we will look at how to create an XML file out of simple flat files using the DataStage Hierarchical Data stage.

About Hierarchical Data stage

The Hierarchical Data stage is available in the Real Time section of the palette in the IBM® InfoSphere® DataStage® and QualityStage® Designer. You can use the Hierarchical Data stage in parallel jobs, not in server jobs. In parallel jobs, the Hierarchical Data stage can have multiple input and output links. You can use it as a source stage, which has only output links; a middle stage, which has both input and output links; or a target stage, which has only input links.

This stage is used for parsing and composing XML/JSON data, transforming hierarchical data formats, and integrating external applications by consuming HTTP REST services.

Please refer to the link below for a detailed step-by-step procedure on creating an XML file using the Composer and HJoin steps within the Hierarchical Data stage.

 

Solving Transformation and Routing of an EDI 837 Health Claim
https://blogs.perficient.com/2016/04/05/solving-transformation-and-routing-of-an-edi-837-health-claim/ (05 Apr 2016)

Health Level Seven International (HL7) is a not-for-profit, ANSI-accredited standard-developing organization dedicated to providing a comprehensive framework and related standards for the exchange, integration, sharing, and retrieval of electronic health information that supports clinical practice and the management, delivery, and evaluation of health services. We use these standards to communicate and exchange information at the integration layer with Apache Camel, Active MQ, and the Smooks framework.

JBoss Fuse

JBoss Fuse combines several technologies (Apache Camel, Apache CXF, Apache ActiveMQ, Apache Karaf, and Fabric8) in a single integrated distribution. We will be using the community edition of JBoss Fuse for this example.

What is Smooks?

Straight from the Smooks page (http://www.smooks.org):

“Smooks is an extensible framework for building applications for processing XML and non XML data (CSV, EDI, Java etc.) using Java.”

While Smooks can be used as a lightweight platform on which to build your own custom processing logic for a wide range of data formats, “out of the box” it comes with some very useful features that can be used individually or seamlessly combined together:

  • Java Binding – Populate a Java Object Model from a data source (CSV, EDI, XML, Java etc). Populated object models can be used as a transformation result itself, or can be used by (e.g.) Templating resources for generating XML or other character based results. Also supports Virtual Object Models (Maps and Lists of typed data), which can be used by EL and Templating functionality.
  • Transformation – Perform a wide range of Data Transforms – XML to XML, CSV to XML, EDI to XML, XML to EDI, XML to CSV, Java to XML, Java to EDI, Java to CSV, Java to Java, XML to Java, EDI to Java etc.
  • Huge Message Processing – Process huge messages (GBs) – Split, Transform and Route message fragments to JMS, File, Database etc destinations.
  • Message Enrichment – Enrich a message with data from a Database, or other Datasources.
  • Complex Message Validation – Rules based fragment validation.
  • Combine – Perform Extract Transform Load (ETL) operations by leveraging Smooks’ Transformation, Routing and Persistence functionality.

Using Smooks with JBoss Fuse and Camel

The approach to engaging Smooks in JBoss Fuse with Apache Camel is a Java bean called within the route. The bean transforms the payload from a String to XML using the Smooks framework.
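A minimal sketch of such a bean, using the Smooks 1.x API; the class name and the config file path are assumptions, not taken from the example project:

import org.milyn.Smooks;
import org.milyn.payload.StringResult;
import org.milyn.payload.StringSource;

public class EdiToXmlBean {
    // Transform a plain-text EDI payload to XML using a Smooks configuration.
    public String transform(String ediPayload) throws Exception {
        Smooks smooks = new Smooks("smooks-config.xml"); // assumed config path
        try {
            StringResult result = new StringResult();
            smooks.filterSource(new StringSource(ediPayload), result);
            return result.getResult();
        } finally {
            smooks.close();
        }
    }
}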

Submit an EDI 837 Health Care Claim with Partner Picking Up Claim

The Health Insurance Portability and Accountability Act was enacted by the U.S. Congress in 1996. A key component of HIPAA is the establishment of national standards for electronic healthcare transactions and national identifiers for providers, health insurance plans, and employers. The HIPAA EDI transaction sets are based on X12. The use case for this example is the EDI 837 transaction set, Healthcare Claim: a web service (using the REST DSL) accepts an EDI healthcare claim as plain text, processes it to XML with the Smooks framework, and puts it on a queue for a partner to pick up.

Route 1 (Receive Claim) – A REST web service listens on HTTP for claims to process. Upon an inbound request, the web service forwards to Route 2 (Process Claim).

Route 2 (Process Claim) – Using the Smooks framework, the route marshals the payload from String to XML and puts it on a queue for the partner to pick up (sketched after Route 3 below).

Route 3 (Pickup Claim) – This route simulates a partner picking up the claim. The claim is written to a file for further processing.
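A sketch of Route 2 as a Camel RouteBuilder; the endpoint URIs are assumed for illustration and may differ from the example project:

import org.apache.camel.builder.RouteBuilder;

public class ProcessClaimRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Route 2 (Process Claim): marshal the EDI String to XML via the
        // Smooks bean, then hand it to the queue the partner reads from.
        from("direct:processClaim")
            .bean(EdiToXmlBean.class)
            .to("activemq:queue:claims");
    }
}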

usecasesmooks

Figure 1.

If you want to try out the example use case, it is at:

https://github.com/thiswebs4u/editoxmlexample.git

Refer to Readme.md for installation and use.

References:

http://www.hl7.org/
https://en.wikipedia.org/wiki/Health_Level_7
http://www.jboss.org/products/fuse/overview/
http://www.smooks.org/mediawiki/index.php?title=Main_Page

If you have questions or issues with these instructions, please share below.

Integrating Datapower XI 52 with Operational Decision Manager
https://blogs.perficient.com/2015/02/19/integrating-datapower-xi-52-with-operational-decision-manager/ (19 Feb 2015)

In recent times, the need to follow government regulations and identify fraud to avoid financial loss has become more prominent. This has forced companies in the financial and healthcare domains to implement stricter business processes. The need to adapt and build better business processes calls for a Business Rules Management System (BRMS), which allows business rules to be modified and implemented on the fly. IBM's BRMS offering is IBM Operational Decision Manager (ODM). In any given architecture, the way to make a component reusable and keep the design modular is to integrate the component with an Enterprise Service Bus. This allows the system to be improved with minimal impact on other systems.

Why integrate Datapower XI 52 with WebSphere Operational Decision Manager (ODM)?

The two products complement each other well. The core competencies of the Datapower XI 52 are fast XML processing, good security options, and integration capability with different systems.

Advantages of Integration

  • Flexibility: Ability to expose rules as JSON over REST; as of this writing, ODM itself only provides the ability to expose rules as XML.
  • Security: Avoid exposing ODM directly outside the enterprise. Datapower has the capability to block Distributed Denial of Service (DDoS) attacks.
  • Unified Process: The integration helps to orchestrate the processes better, allowing a unified process with checks and balances for calls related to the BRMS.

One way of integrating Datapower with ODM

ODM exposes rules using the Hosted Transparent Decision Service (HTDS) interface. HTDS allows any deployed rules package to expose rules using REST. For our demonstration, we will use a Datapower XI 52 virtual machine (version 6) with Operational Decision Manager (version 8.5) and the complete Miniloan rules app.

Step 1: Deploy the rules app

You can build the Miniloan rules app using the tutorial provided by IBM here and complete the deployment.

Step 2: Download WADL or WSDL

Once the application is deployed, open the Rule Execution Server Console.

Go to Explorer -> click the corresponding rule set -> click RetrieveHTDSDescriptionFile (as shown below).

RetriveHTDSDescriptionFile

 

Select the latest rule set and latest RuleApp version and click Download.

DownloadHTDSFile

Step 3: Set up a pass-through Multi-Protocol Gateway

This will act as a pass-through for now but can be used for XML-to-JSON conversion, security setup, and other use cases.

Step 4: Test

We are using the POSTMAN plugin in Chrome to test this service.

Using the WADL we downloaded, we can build the request for the service.

The request URL looks like

http://<DatapowerHostAddress>:<MPGPort>/DecisionService/rest/v1/myruleapp/myruleproject?Accept=application/xml&Accept-Language=en
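Outside of POSTMAN, the same call can be exercised with curl; the host, port, and payload file name are placeholders:

curl -X POST "http://<DatapowerHostAddress>:<MPGPort>/DecisionService/rest/v1/myruleapp/myruleproject" \
     -H "Content-Type: application/xml" \
     --data @request.xml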

Sample Input

<par:Request xmlns:par="http://www.ibm.com/rules/decisionservice/Myruleapp/Myruleproject/param">
  <!--Optional:-->
  <par:DecisionID>test5</par:DecisionID>
  <!--Optional:-->
  <par:borrower>
    <creditScore>600</creditScore>
    <!--Optional:-->
    <name>string</name>
    <yearlyIncome>80000</yearlyIncome>
  </par:borrower>
  <!--Optional:-->
  <par:loan>
    <amount>25000</amount>
    <approved>true</approved>
    <duration>240</duration>
    <yearlyInterestRate>0.05</yearlyInterestRate>
  </par:loan>
</par:Request>

Sample Output:

<?xml version="1.0" encoding="UTF-8"?>
<par:Response xmlns:par="http://www.ibm.com/rules/decisionservice/Myruleapp/Myruleproject/param">
  <par:DecisionID>test5</par:DecisionID>
  <par:loan>
    <amount>25000</amount>
    <approved>true</approved>
    <duration>240</duration>
    <yearlyInterestRate>0.05</yearlyInterestRate>
  </par:loan>
</par:Response>

Execution Trace

ExcutionTrace

PostManTesty

With this test, we have successfully verified the integration between Datapower XI 52 and ODM. The sample loan application execution trace shows the rules triggered when the service is hit.

References

  1. Using Hosted Transparent Decision Service Interface (HTDS) in IBM Operational Decision Manager
  2. Mini loan Web application

 

DataPower’s handling of RESTful services via JSON
https://blogs.perficient.com/2015/02/06/datapowers-handling-of-restful-services-via-json/ (06 Feb 2015)

In the IBM DataPower world, JSON is the representational format used by the RESTful façade exposed by the appliance. Starting with firmware version 3.8, there has been a gradual increase in the number of ways that a JSON payload can be handled.

DataPower services that will handle and process JSON messages include:

  • Multi-Protocol Gateway (MPGW)
  • XML Firewall
  • Web Token Services (Firmware v7.0 & up)

These services offer a developer additional options for what type of request or response to expect; the selected payload type determines the treatment the payload receives. Our main concern in this article is the JSON payload and its processing and manipulation. So, let’s dive in:

Non-XML as the request/response type

The original JSON message is available as the INPUT context.

JSON as the Request / Response Type

  1. The incoming JSON payload is validated as a well-formed JSON document.
    • Simultaneously, the JSON parser enforces limits on the incoming JSON document.
  2. The message is converted to JSONx (an IBM standard to represent JSON as XML); see the sketch after this list.
    • Simultaneously, the XML parser enforces limits on the JSONx document as it is being created.
  3. The JSON message is now available in two context forms:
    • INPUT context (the original message)
    • _JSONASJSONX context (the converted JSONx message that can be manipulated with XSLT, just as any other XML)
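As a rough illustration of the JSONx representation, a JSON document such as {"approved": true, "amount": 25000} converts to XML along these lines; the element names follow IBM's json namespace convention, but treat the exact shape as indicative rather than authoritative:

<json:object xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx">
  <json:boolean name="approved">true</json:boolean>
  <json:number name="amount">25000</json:number>
</json:object>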

 

Before firmware version 6.0, DataPower offered the following process for dealing with JSON payloads through converted JSONx:

JSONxHandling

 JSONtoJSONx

 Source: for further examples visit here.

ValidateJSONx

 

 

 

JSONx data (the _JSONASJSONX context) can now be validated using the jsonx.xsd schema file in the store:// directory.

 

 

 

JSONx2SOAP

 

Using an explicit Transform action with a custom XSLT style sheet, the JSONx message can then be transformed to the data format expected by the back end.

SOAP2JSONx

 

 

 

 

 

The response from the backend will need to be converted to JSONx using a custom XSLT in a Transform action, if the client expects a JSON response.

jsonx2json

 

 

 

 

The appliance provides the jsonx2json.xsl style sheet in the store:// directory, which can be used to convert JSONx to JSON by applying an explicit Transform action to the desired context.

 

 

 

Starting with firmware v6.0, in addition to the above capability, the original JSON payload in the INPUT context can be processed in three different ways:

     1.   Validate JSON schema

 clientSpecificJSV

         Source: JSON schema validation explained here.

     2.   Manipulate with a transform action that uses a processing control file such as:

JSONiq

JSONiqProcessingControl

 

 

XQuery transformation with JSONiq to:

  • Text
  • XML
  • another JSON
  • etc.

 

 

  Source: see JSON transformation to different data types here.

     3.   Manipulate using a GatewayScript (firmware v7.0 & up)

GatewayScriptTransformation

Source: More about GatewayScript here.
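To give a flavor of that approach, here is a minimal GatewayScript sketch that reads the JSON request and writes a trivially modified JSON response; the added field is purely illustrative:

// Read the incoming payload as JSON, annotate it, and write it out.
session.input.readAsJSON(function (error, json) {
    if (error) {
        // Reject the transaction if the payload is not valid JSON
        session.reject('Input is not valid JSON');
        return;
    }
    json.processedBy = 'gatewayscript'; // illustrative addition
    session.output.write(json);
});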

 


Considering Parser limitations for JSON payloads

JSON parser settings and XML parser settings work together to provide configurable settings for message size, nesting depth, and other limits.

  • JSON parser settings apply to the JSON message
  • XML parser settings apply to XML and converted JSONx messages

The parser limits help protect against a denial-of-service (DoS) attack in which a single maliciously huge message is sent to the service to keep it busy.

Default limits are applied unless Setting Configurations are specified. The maximum message size limit applies to JSON, SOAP, XML, and non-XML message types.

  • Limits are not enforced for pass-through message type.

The more restrictive of the JSON and XML parser limits is enforced. Exceeding either parser limit results in an HTTP 500 response code being returned to the client.

Source: See the parser limits here: JSON


DataPower Configuration Management tool
https://blogs.perficient.com/2015/01/12/datapower-configuration-management-tool-part-i/ (12 Jan 2015)

DCM (DataPower Configuration Manager) is an open source tool published by IBM for automating and simplifying the configuration and management of IBM DataPower appliances (with the exception of the XC10).  It can be used standalone or within the IBM UrbanCode Deploy platform.  Business partners, clients, or anyone else for that matter can contribute enhancements or fixes to the tool.

DCM uses DataPower’s XML Management Interface (XMI) to drive the appliance’s management tasks in an automated fashion. DCM uses ANT scripting to fully automate the build, deployment, and release process of DataPower configuration and management.

One does need to be familiar with ANT to effectively use DCM. A simple ANT build file, deploy.ant.xml, included with the DCM distribution serves several simple purposes:

  • Common deployment tasks at domain level and some at device level.
  • DCM provided ANT based task and targets for custom deployment builds.

Some of the common deployment tasks provided in deploy.ant.xml include create, delete, backup (export), restore (import), reset, restart, quiesce, unquiesce, and save domains. Additional tasks are delete/restore/save checkpoints, upload files or directories, valcred creation, create/delete/modify objects, and create/remove host aliases.

The deploy.ant.xml build file relies on certain ANT properties such as host, domain, uid, pwd, and port, plus file locations for backup, import, export, and upload files, in addition to upload-to and upload-from directory locations. One very important property is dcm.dir, which points to the directory where DCM is installed. These properties can be provided in a <filename>.properties file and included within the custom build script, or provided within an ANT call, such as the following:

ant -f deploy.ant.xml -propertyfile <filename>.properties check-access

where check-access is a target defined within the ANT script that checks access to the DataPower device using the provided host, user ID, and password.

Once DCM is installed and properly functioning, using it effectively requires a little planning, such as how to structure the property files and the DCM definition files to best fit your needs. Thus a little exploration of the DCM directory and included files would be beneficial.


Step-by-step instructions and files:

Download the DCM zip file here.

Prerequisites:

1)    JDK 1.6 or later

2)    Apache Ant 1.8.1 or later (DCM is packaged with 1.9.4 and UCD plugin)

3)    Xalan-J for Oracle/SUN JDK version (comes included with IBM JDK)

I have downloaded xalan-j 2.7.1-src-2jars for stability.

Installation:

  1. Ensure JDK is installed and is included in the PATH
  2. Ensure Ant 1.8.1 is installed and is included in the PATH
  3. Include following five Xalan JAR files in the CLASSPATH
    1. xalan.jar
    2. serializer.jar
    3. xml-apis.jar
    4. xercesImpl.jar
    5. xsltc.jar
  4. Ensure DataPower device’s XML Management Interface (XMI) is enabled
  5. Test if DCM is properly working using the following command

ant -f deploy.ant.xml -Ddcm.dir=<path of dcm directory> -Dhost=<IP/hostname of the DP device> -Duid=<user ID> -Dpwd=<user Password> check-access

*use -Dport=<port #> if XML Management Interface is enabled on a port other than the default (5550)

*if -Dpwd=<user Password> is omitted in the above command then user will be prompted for password


Further test with other tasks:

Create or delete a domain by introducing the following in the above command instead of check-access

-Ddomain=<domain name> domain-create save

-Ddomain=<domain name> domain-delete save


Create a <fileName>.properties file that contains these ANT variables.

# DP credentials 

dcm.dir=<path of dcm directory>

host=<IP/hostName of the DP device>

uid=<user ID>

pwd=<user Password> (this could be omitted for security reason. Omitting it will prompt the user for password)

Use the following command to create a domain and save the configurations.

ant -f deploy.ant.xml -propertyfile <fileName>.properties domain-create save


While domain-create is a target within deploy.ant.xml, it utilizes the task of createDomain to serve the purpose.

DCM provides a number of tasks that can be utilized in customized build files. These tasks are defined in dcm\dcm-distros\dcm_1.0.1\src\dcm-taskdefs.ant file

Automating REST services using Soap UI Pro
https://blogs.perficient.com/2014/12/02/automating-rest-services-using-soap-ui-pro/ (02 Dec 2014)

“A Web service is a method of communication between two electronic devices over a network. It is a software system designed to support interoperable machine-to-machine interaction over a network.”

There are various tools available to test web services. Some of them are Soap UI, Soap UI Pro, Test Maker, Web Inject, etc. The most common tool we use is Soap UI.

Soap UI Pro (the licensed version) comes with a user-friendly UI. It provides a DataSource step (to read/write test data from external files), a data connection step (to read/write from database tables), and a Property Transfer step (to transfer properties between steps). It also provides a Groovy Script step to achieve any validations/operations not possible with the default Soap UI steps.

Let’s look at the task that needs to be performed:-

Objective: – To automate a sample login (authenticateMember) REST web service using Soap UI Pro.

Resources: – Soap UI Pro tool, Input Excel data file, Output Excel data file.

Solution: – I will be using pre-defined Soap UI steps to read/write data from external Excel files and a Groovy script to perform a few validations/operations that are not possible with the default steps.

Step 1:- Prepare the Input Excel data file (Sheet Name- Login)

InputDataSheet


The assumption is that the service accepts a username and password; we are passing both parameters from the test data input sheet.

Step 2:- Prepare the Output Excel data file (Sheet Name – Login)

OutputDataSheet


The assumption is that the service returns statusCode and statusDesc. Let’s see what the extra output fields are:

  • testResult – Insert either of Pass/Fail based on assertions.
  • statusCode – Insert the status code returned from the REST response.
  • statusDesc – Insert the status description returned from the REST response.
  • Request – Insert the raw REST request.
  • Response – Insert the complete REST response returned from the service.

Step 3:- Create the Automation test suite.

  • Open Soap UI Pro, import your project.
  • Go to the webservice endpoint and right click to see the context menu.
  • Click on option “Generate TestSuite”.
Context Options


  • In the Generate Test Suite pop-up, check the checkbox authenticateMember (sample service name) service and click OK.

 

Add Service Request


  • Once done, we will be able to see the test suite (authenticateMember) generated with the selected service.

ProjectTree-1

 

Step 4:- Create testcase under test suite.

  •  Go to the created test suite and right click to see the context menu.
  •  Click on option “New TestCase”.
Context Options


  • In the New Test Case pop-up, give the name of the new testcase and click OK.
NewTestCase


 

  • Once done, we will be able to see the test case (AuthenticateMember) generated.
AuthenticateMember


 

Step 5:- Create test steps under created test case.

(i)  Create DataSource step to read username and password from the test data input sheet

  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options


 

 

 

  • Click on option “DataSource”.
  • In the Add Step pop-up, give the name of the step and click OK.
DataSource


  • Once done, we will be able to see the test step (DataSource) generated.
DataSource Step


 

  • In the DataSource step, click on the dropdown next to DataSource label and select “Excel”.
DataSource Configuration


 

  • Since we need 2 values (Username and Password), we will be creating 2 properties in this step to capture the values from the input data Excel file. To create a property, click on the “+” sign in the left corner, enter the property name, and click OK.
AddProperty


 

 

  • Create properties with name “Username” and “Password”.
DataSource with properties created


 

  • Now we need to browse for our test input data sheet in the File: text box, give the proper sheet name in the Worksheet text box, and give the cell numbers where the Username and Password cells are located in the test input data sheet.
  • Once done, run the test step by clicking on the green Play button. We will be able to see the properties populated with the values we stored in the test input data sheet.
DataSource executed


 

(ii)  Create REST request Test Step

  • Assuming that authenticateMember REST request takes Username and Password as test input.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options


 

  • Click on option “REST Test Request”.
  • In the Add Step pop-up, give the name of the test step and click OK.
REST Test Request


  • Once done, we will be asked to select the REST request which we want to import. Select the appropriate REST request (depending on the project) from the dropdown and click OK.
New Rest Request


 

 

  • Once done, we will be able to see REST Test Request test step created.
REST Test Request created


 

(iii)  Parameterize the input data in created REST Test Request test step.

  • Go to the Request tab where we need to enter Username and Password.
REST Test Request with request parameters


 

 

  • To remove the hard-coded values, right-click on “admin” so that the context menu appears and select the value GetData -> DataSource -> Username. Do the same for Password.
ParameterizeUsername


 

 

  • Once done, we will be able to see the input data has been parameterized.
Parameterized


(iv)  Execute the REST request and create assertions.

  • Assuming that if we pass valid Username and Password, statusCode=0 and statusDesc=Success is returned.
  • Hence we need to create assertions for the above two fields.
  • Go to statusCode field in REST response and right click so that context menu appears.
REST Response


 

 

  • Select AddAssertion -> forContent.
Add Assertion


  • Verify Xpath for the source node of “statusCode” and Expected Result
Xpath Match Config


 

 

  • Do the same for statusDescription.
  • Once done, we will be able to see 2 Assertions created at the left bottom of REST request step.
CreatedAssertions


(v) Create Groovy script to validate the assertions.
  • We will be writing customized Groovy code to check whether the assertions are PASSED or FAILED. If all assertions pass, we will insert “PASSED” into the execution result column in the output data sheet; if any one of the assertions fails, we will insert “FAIL”.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options


  • Click on option “Groovy Script”.
  • In the Add Step pop-up, give the name of the test step and click OK.
Groovy Script


  • Once done, we will be able to see the test step (Groovy Script) generated.
Groovy Script step


The following code validates the assertions.

// Import class for assertion handling (kept from the original script)
import com.eviware.soapui.model.testsuite.Assertable

// Locate the test steps we need
def testStepSrc = testRunner.testCase.getTestStepByName("REST Test Request")
def propertyVal = testRunner.testCase.getTestStepByName("DataSink")

// Count the number of assertions created on the REST request
def counter = testStepSrc.getAssertionList().size()

// Pass only if every assertion reports VALID; fail if any one of them fails
def output = 'PASSED'
for (i = 0; i < counter; i++) {
    String status = testStepSrc.getAssertionAt(i).getStatus()
    if (status != 'VALID') {
        output = 'FAIL'
        break
    }
}

testRunner.testCase.testSuite.setPropertyValue("result", output)

// Create a Status property holding the pass/fail value,
// to be picked up by the DataSink step
propertyVal.setPropertyValue("Status", output)

 

(vi)  Create DataSink step

  • The DataSink step is required to set up property values, and these can be directly inserted into the output data sheet.
  • Since we will be inserting 5 values (result, statusCode, statusDescription, request and response), we will be creating 5 properties in the DataSink step and inserting them into the output data sheet.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options


 

 

  • Click on option “DataSink”.
  • In the Add Step pop-up, give the name of the test step and click OK.
Data Sink


 

  • Once done, we will be able to see DataSink step created.
Data Sink step


 

 

  • Create 5 properties and the values will get supplied from the propertyTransfer step (this will be added in the next step).
Create Data Sink properties


 

  • To create the properties, follow the same approach as we followed while creating properties in DataSource.
  • Enter the Configuration parameters; here we will be passing the path of the output file, the specific worksheet, and the cell number at which we want to insert the values.
  • No need to give values to the created properties; these will get inserted from the Property Transfer step.

 

(vii)  Create PropertyTransfer

  • Property transfer is required to transfer property values from one step to another. We will be transferring property values from the REST response and the Groovy script to the DataSink.
  • Go to the AuthenticateMember testcase and right click to see the context menu.
Context Options


 

  • Click on option “Property Transfer”.
  • In the Add Step pop-up, give the name of the test step and click OK.
Property Transfer


 

  • Once done, we will be able to see Property Transfer step created.
Property Transfer step


 

 

  • We will be creating 4 property values (statusCode, statusDescription, request and response). The Status property will be populated in the DataSink step from the Groovy script.
Property Transfer created


  • After we are done creating the properties, we need to select the Source and Property, and the Target and Property. This enables the transferring of properties. To transfer the statusCode property:

 

Source – REST Test Request

Property – ResponseAsXML

Xpath – node description of statusCode

 

Target – Data Sink

Property – statusCode

 This will transfer the statusCode value to DataSink’s statusCode property. Do the same for the other properties as well.

 

  • After you are done, execute this step. Then open the DataSink step and execute it; we will be able to see property values for statusCode, statusDescription, request, and response.
Data Sink step populated


  • This is how the project flow will look. Because the steps are dependent on one another, the proper flow is mandatory for the execution to go through.
ProjectTree


 

 

Step 6:- Execute the test suite

  • Execute the testsuite by clicking on the Play button for the test suite.
Test Suite


 

 

  • After we are done, verify the DataSink step. We will be able to see all the property values.
Data Sink populated


 

 

  • Verify the output data sheet. The data has been inserted into the output data file starting from the cell we mentioned in the DataSink step.
OutputDataSheet (After execution)


 
