This blog shows you how to upgrade the firmware on an IBM DataPower appliance. Below are the steps:
1. Identify the Firmware Image File That Matches Your Appliance
To upgrade the firmware, we first need to identify the firmware image file that matches the appliance.
2. Download the Latest Firmware from IBM’s Fix Central Website
We can download the latest firmware image file from IBM Fix Central:
http://www.ibm.com/support/fixcentral
Log in to the website using your IBM user ID and password.
Select the relevant dropdowns as in the example below.
Product Group : WebSphere
Product : WebSphere DataPower SOA Appliances
Installed Version : Currently installed firmware version
Click Continue.
Enter the appliance type (for example, XI50) and click Continue.
Select the fix to which you want to upgrade.
For example, below we upgraded from 4.0.1.1 to 4.0.1.4:
Click Continue at the bottom of the page.
Select the download option as shown below & click Next.
Agree to the terms and conditions by clicking I agree.
Select the crypt2 file as shown below and download it to a local drive.
3. Install the Firmware Image
From the WebGUI: Control Panel –> Administration –> Configuration –> Application Domain –> disable all the domains except the default domain –> save the configuration.
Log in to the default domain: Control Panel –> System Control –> Boot Image –> from the Firmware File list, select the newly transferred firmware image and select the accept-the-terms-of-the-license-agreements check box –> click Boot Image, follow the prompts, and wait for the appliance to restart.
From the WebGUI: Control Panel –> Administration –> Configuration –> Application Domain –> enable all the domains –> save the configuration.
Below is the IBM reference that I followed for this firmware upgrade:
http://www-01.ibm.com/support/docview.wss?uid=swg27015333
In this blog I’ve taken some liberty to extend the general definition of full stack in order to describe a system architecture that lends itself well to full stack development. With the emergence of microservices as an alternative to monolithic applications and service-oriented architectures, there is a need to elaborate in more depth on an architectural viewpoint that builds on the various architectural patterns, which are often individually expressed as box-and-line diagrams.
Contextually, this distributed system architecture is an augmentation of a REST-based centralized messaging topology. The principal goal of this architecture is to meet the demands of a cross-channel business model that will enable the development of various UI solutions, which will in turn facilitate new customer digital experiences, with the ancillary benefit of positioning the enterprise to evolve an infrastructure for building microservices.
The Architectural Challenge
How do you create a system architecture that supports responsive Single Page Applications (SPAs) used across customer markets that may diverge, while simultaneously positioning your enterprise to support an API development strategy and providing IT with an evolutionary approach for developing microservices?
A Proposed Solution
To meet this architectural challenge, let’s extend the REST-based centralized messaging topology with an enhanced API Gateway pattern and then augment the pattern with a technical stack made up of the following architectural components:
Operational Decision Management (ODM)
Using ODM we can incorporate the dynamic behaviors for client channels that vary across markets. Add a distributed full-text search engine housing schema-free JSON documents and this component will support dynamic type-ahead and lookup features to enrich the user experience.
Incorporate a NoSQL database to support the SPA using Node.js technology. Use cases for this component will vary based on architectural constraints or challenges: client drivers, security, potential storage capacity, data integrity, and performance, to name a few concerns.
The API solution incorporates a Developer Portal to expose both internal and public APIs. Introducing the portal even before the organization has started down the path of exposing business capabilities is a good practice. Managing APIs internally lends itself well to agile development. Additionally, it enables the organization to flesh out a well-reasoned API management strategy, and helps facilitate the dialog of aligning business capabilities to IT resource models in the important data modelling process of service development.
Although orchestration avoidance is a desired architectural constraint, it may be unavoidable based on the information models in the legacy EIS. As in many legacy environments, there may remain a number of EIP use cases, such as control and routing, aggregation, transformations, and managed file transfers, to mention a few. Additionally, the ESB provides a range of transport and connectivity components for the range of EIS technology variants.
This brings us to microservices. This article is not intended to take a deep dive into microservice design or implementation, but rather to suggest that with this system architecture we now have a foundation for an evolutionary path away from the monolithic application style toward a combined API and microservices architectural style.
At a minimum there are three architectural concerns to consider.
First, the microservices architecture style was prompted primarily by the development of continuous delivery and the notion of a continuous deployment pipeline, to streamline application deployment.
The second concern is the distributed nature of services, in which all architectural components are fully decoupled and accessed through some sort of remote access protocol (e.g., JMS, AMQP, REST, SOAP). This enables a highly scalable design and an agile deployment approach.
Perhaps the most important concern is service component granularity and modularity, which may range from a single-purpose function to a larger business application. This is grounded in two fundamental principles: first, microservices are business units that model the company's processes, and second, microservices have smart endpoints that expose the business logic and communicate using simple channels and protocols.
IBM API Connect – Product Architecture
IBM API Connect is an integrated offering that provides customers an end-to-end API life-cycle (create, run, secure, and manage) experience. It provides capabilities to build model-driven APIs and microservices using the Node.js LoopBack framework, run them either on-premises or on the cloud, and secure and manage them using the management server, gateway server, and developer portal.
In this blog post, I’ll cover the component description, the sort of data they hold and their interaction with each other at design time and run-time.
The Management Server provides tools to interface with the various servers and holds the associated data. It runs the Cloud Management Console (used by the cloud administrator to create, manage, and monitor the cloud, along with many other admin tasks) and the API Manager portal (used by API developers, product managers, admins, etc.).
The Gateway Server is the entry point for API calls. It processes and manages security protocols and stores relevant user authentication data, provides assembly functions that enable APIs to integrate with various backend endpoints, and pushes analytics data back to the Management Server.
MicroGateway – Built on Node.js, it provides enhancements for the authentication, authorization, and flow requirements of an API. It is deployed on an API Connect collective and has a limited number of policies available to it.
DataPower – Deployed on either a virtual or physical DataPower appliance. It has more built-in policies available to it than the MicroGateway and can handle complex enterprise-level integrations.
Developer Portal – API providers publish the Products and APIs to the developer portal for application developers to access and use. Application developers need to sign up and register their application to use the APIs.
After registering the application, they can select an appropriate plan (a collection of REST API operations and SOAP API WSDL operations) to use with their application. They can test the API using the built-in test tool, and even view analytics information relating to the APIs used by the application.
Developer Toolkit – It provides a command-line tool for creating and testing APIs and LoopBack applications that you can run, manage, and secure with IBM API Connect. We can use it to script tasks such as continuous integration and delivery.
Install it either from npm or from a management server in your IBM API Connect cloud. Follow this link to take a look at the available commands.
API Designer – The apic edit command runs the API Designer, which opens in the default web browser. We can leverage the web GUI to create a LoopBack project and OpenAPI (Swagger 2.0) definitions and secure them. We can create a Product/Plan and, after testing the API successfully, publish the product and LoopBack application to Bluemix or an on-premises instance.
Design-time Interaction
Run-time Interaction
The database has become a vital component of any enterprise's IT structure. Databases continue to provide more sophisticated options than the versions we have seen in previous years.
IBM DataPower Gateway is designed to connect to most of the popular databases.
In his most recent post, Technical Consultant Karthik Selvaraj walks you through the steps to follow if you are looking to configure a SQL data source in DataPower Gateway. You can read his full post here.
IBM DataPower Gateway is a purpose-built security and integration platform for mobile, cloud, API, SOA, and B2B workloads. The current business model for nearly all industries relies heavily on data. The database has become a very important component of an enterprise's IT structure, providing more sophisticated and flexible options than the databases available a decade ago. This digital transformation has placed demands on Enterprise Service Bus (ESB) products to connect with databases at higher speeds while supporting multiple types of databases.
According to a report generated in September 2016, Oracle, MySQL, Microsoft SQL Server, and DB2 are the most popular relational DBMSs used all over the world. Considering the wide use of these databases, IBM DataPower Gateway (firmware v7.5) is designed to connect to most of the popular databases, which are listed below:
The configuration of a data source to connect to a database through IBM DataPower Gateway is achieved through a component called the SQL data source. This article provides the steps to configure a SQL data source in DataPower Gateway. The database used for this illustration is Microsoft SQL Server 2014, and the IBM DataPower Gateway firmware version is 7.5.
Step 1:
Obtain the following details about the database from the team responsible for it.
Step 2:
In this example the database used is Microsoft SQL Server, and sample values for the above-mentioned parameters are as follows:
Note: The user account used for authentication must use SQL Server authentication. Please don't use a user account with Windows authentication.
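As a quick illustration, the details gathered in Steps 1 and 2 can be collected and sanity-checked before touching the appliance. This is a plain Node.js sketch, not DataPower configuration; the host, database, and user names here are hypothetical sample values:

```javascript
// Hypothetical sample values for the connection details gathered in Step 1.
// The field names loosely mirror the SQL data source object's configuration.
const dbDetails = {
  databaseType: "SQLServer",                // Microsoft SQL Server 2014
  dataSourceHost: "dbserver01.example.com", // hypothetical hostname
  dataSourcePort: 1433,                     // SQL Server default port
  databaseName: "SampleDB",                 // hypothetical database name
  username: "dp_user"                       // must be a SQL Server auth account
};

// Sanity-check that every required detail is present before configuring
// the SQL data source object on the appliance.
const missing = Object.entries(dbDetails)
  .filter(([, value]) => value === undefined || value === "")
  .map(([key]) => key);

console.log(missing.length === 0
  ? "All details gathered"
  : "Missing: " + missing.join(", ")); // prints "All details gathered"
```

Port 1433 is only the SQL Server default; confirm the actual port with the database team.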
Step 3:
Log in to the DataPower Gateway physical or virtual appliance and navigate to the Password Map Alias component under Objects > Configuration Management. Create a password alias configuration with the password of the user account that will be used in the SQL data source object.
Once configured, click Apply and save the configuration.
Step 4:
Navigate to the SQL data source object available under Network > Other settings to create a new SQL data source object.
Step 5:
Create a new SQL data source and configure it with the details gathered about the database.
Once configured, click Apply and save the configuration.
Step 6:
Enable debug-level logging to verify that the configuration is correct. If any of the details are incorrect or a network connection is blocked, the SQL data source object will remain in a pending state.
Reference:
http://www-03.ibm.com/software/products/en/datapower-gateway
http://db-engines.com/en/ranking
Exposing patient health information as Application Program Interfaces (APIs) is one of the most critical components in Stage 3 of the EHR Incentive Programs, and all providers will be required to comply with MU3 requirements by 2018. The APIs will improve patient engagement by providing data access in an application of the patient's choice instead of only through the current patient portal channel.
In compliance with HIPAA privacy and security rules, implementations of APIs that expose sensitive PII/PHI data must be properly protected in terms of authentication, authorization, audit, message- and transport-level security, encryption, etc.
Some of the key requirements for the API interface are:
Solution using IBM APIC:
XSLT is not the ONLY transformation language supported by DataPower. Starting with firmware version 7.0.0.0, DataPower supports a new transformation technology, GatewayScript, to handle all sorts of traffic (API, B2B, web, mobile, SOA) effectively. For more information, please review the documentation.
IBM provides an interactive website that lets you write GatewayScript code and execute it on a cloud-hosted DataPower Gateway for learning purposes. The website provides many examples that you can test as-is or edit based on your requirements. It also provides separate tabs to modify the sample code, provide a request, and view the response and logs.
I tested the following use-case in just a few seconds. There was no need to configure services or policies to test the transformation through GatewayScript.
Use-case : Modify incoming JSON request payload (Add new object in Books array)
Step 1: Clicked the first sample in the Code tab and tweaked the request as shown below.
Step 2: Provided the request in the Request tab as shown below. Didn't modify anything in the HTTP headers field.
Step 3: Viewed the response using the Response tab.
Step 4: Viewed the DataPower system logs using the Logs tab.
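The transformation behind this use-case boils down to a single array append. Here is a minimal sketch of that logic in plain JavaScript; the playground sample wraps the same logic in session.INPUT.readAsJSON and session.output.write calls, and the book fields below are made-up sample data:

```javascript
// Hypothetical incoming JSON payload containing a Books array.
const payload = {
  Books: [
    { title: "Book One", author: "Author A" }
  ]
};

// Add a new object to the Books array, as in the playground use-case.
payload.Books.push({ title: "Book Two", author: "Author B" });

// The modified payload is what the service would write to the response.
console.log(JSON.stringify(payload));
```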
One of the nation's largest marketers of tires for auto replacement required a secure, scalable, flexible, and user-friendly point-of-sale application.
Perficient partnered with them to implement a new system for sales associates, which provides a faster way to access enterprise data and ultimately provide better customer service.
In order to accommodate increasing workloads, the SOA-based solution was built using WebSphere Message Broker, MQ, eXtreme Scale, and DataPower. After the implementation, our client saw a 15% increase in retail sales due to an enhanced customer experience. The development of a smarter supply network also improved interactions with suppliers and manufacturers.
This post focuses on upgrading the Crypto Profile objects' configuration (keeping TLS enabled and disabling the rest) on DataPower. If you face any issues establishing TLS connectivity from SoapUI, follow these steps to fix the issue:
Step 1: Navigate to the C:\Program Files\SmartBear\SoapUI-5.2.1\bin folder.
Step 2: Edit the SoapUI-5.2.1.vmoptions file with any text editor.
Step 3: Add the following entry and save the file. It enables only the TLS 1.2 protocol.
-Dsoapui.https.protocols=TLSv1.2
If you have a requirement to enable other TLS versions too, use comma-separated values as shown below.
-Dsoapui.https.protocols=TLSv1,TLSv1.1,TLSv1.2
Step 4: Close and relaunch SoapUI.
This issue mainly occurs when the DataPower MQ client tries to establish an SSL (2-way) connection with the MQ server.
Here is the solution to fix this issue.
Step 1: Ensure that the correct certs have been configured in the keystore of the queue manager on the MQ server.
Step 2: Ensure that the correct certs have been configured in the Crypto Identification Credentials and Crypto Validation Credentials objects on DataPower.
Step 3: Get the cipher configured at the channel level and compare it with the Cipher field of the SSL Proxy Profile object configured in the MQ Queue Manager object on DataPower. In most cases, its value is DES-CBC3-SHA.
Step 4: Refresh the SSL settings at the queue manager level on the MQ server and restart the channel.
In my previous article, I covered JSON Web Token and how to issue and validate it on DataPower firmware v7.2.0.0 using custom GatewayScripts. In this article, I will cover the issuance and validation of JWT with an AAA action on DataPower firmware v7.2.0.1.
There is no need to write code; leverage the built-in capability of the AAA action to issue and validate JWTs.
Here is the step-by-step guide to generate the JWT:
Step 1: Configure the AAA action on the request rule in the processing policy.
Step 2: Select the identification method. For REST services, the identification method is usually the HTTP Authentication header, as shown in the screenshot below.
Step 3: Authenticate the user with LDAP, or locally with an AAAInfo.xml file.
Step 4: Extract the resource and authorize the user. Again, you can either make a call to LDAP, retrieve the groups associated with the user, and authorize, or do it locally with an AAAInfo.xml file.
Step 5: After successful authentication and authorization, it's time to generate the JSON Web Token.
Step 6: Commit and apply the AAA policy changes.
Here is the step-by-step guide to validate the JWT:
Step 1: Configure the AAA action on the request rule in the processing policy.
Step 2: Select the identification method as JWT, as shown below.
Step 3: Create a new JWT Validator object.
Step 4: Configure the JWT Validator object.
Step 5: In the User Authentication step, select the option shown in the screenshot below.
Step 6: Commit and apply the AAA policy changes.
This tutorial series explains how to issue and validate different types of tokens, such as JWT (JSON Web Token) and SAML HoK (Holder-of-Key), using IBM DataPower Gateway. In this article, you learn about the issuance and validation of JWT with firmware v7.2.0.0.
In Part 2, you will learn to issue and validate the JWT with firmware v7.2.0.1 in a much simpler way. In Part 3, I'll explain the issuance and validation of the SAML HoK token used for SOAP-based services.
What is a JSON Web Token?
JSON Web Token (JWT) is a compact claims representation format intended for space constrained environments such as HTTP Authorization headers and URI query parameters. JWTs encode claims to be transmitted as a JavaScript Object Notation (JSON) object that is used as the payload of a JSON Web Signature (JWS) structure or as the plain-text of a JSON Web Encryption (JWE) structure. JWTs are always represented using the JWS Compact Serialization or the JWE Compact Serialization.
Here is an example of a signed JWT. It comprises three parts (highlighted in different colors) separated by a period (.).
The first part is the base64-encoded JWS header, which contains information about the signing algorithm. You can use any of the following algorithms to sign the claim set:
Asymmetric -> RS256, RS384, RS512
Symmetric -> HS256, HS384, HS512
The second part is the base64-encoded JSON claim set.
The third part is the base64-encoded signature value generated by signing the encoded JWS header and payload (claim set) with the algorithm specified in the JWS header.
eyJhbGciOiJSUzI1NiJ9.eyJpc3MiOiJpYm1fZGF0YXBvd2VyIiwic3ViIjoiYWRtaW4iLCJleHAiOjE0NTAxMTUyODAuMTkyLCJuYmYiOjE0NTAxMTE2NzkuMTkyLCJpYXQiOjE0NTAxMTE2ODAuMTkyLCJqdGkiOiI3ZjY2NGYxNi05OGQyLTRlYzEtODlhOS04NjM3ODBkYjFhNjgiLCJhdWQiOiJBQUFNb2JpbGUifQ.G7XRUjxrvRSdFE_RRumrPtTnLvlX36eRqDC0UFZKiO3Jau9iDbPuGPeGc0g0kUrubGQAqXz1TYTAuwcNnF58FWQjm9ovZrFH-fvGEpiYKjSctAsldj_ecQRw4jX5YKOYd1zbdr67-zUJN0n8J1iNJiJeVyGBCvz7jiiwCcZSXGRUkAqy-zwq_jULfZoi7QIS1s4f_K5WeQu4PVEhe30tovffegHdxAPZm0ptQT88l3UuuC5zNW7QxQH-6MywLvI3jYttrJ_jhIXUiNFyWDSkNKbcfUwjV2ez5IlPMfQgVFVoMMecaxJ5qlzRr8-okrpgaSQt5xx6gIL-gEZtV7Cd5g
Standard/Registered Claim names
None of the claims defined below is intended to be mandatory to use or implement in all cases; rather, they provide a starting point for a set of useful, interoperable claims. Applications using JWTs should define which specific claims they use and when they are required or optional. All the names are short because a core goal of JWTs is a compact representation.
Take a look at the following link for more details about these claim names. We can even define custom claims based on the requirements.
Using Firmware 7.2.0.0
As most of you will be aware, DataPower firmware v7.2 provides enhanced message-level security for mobile, API, and web workloads by using JSON Web Encryption for message confidentiality, JSON Web Signature for message integrity, and JSON Web Token to assert security assertions for single sign-on (SSO).
Though firmware 7.2 provides actions to sign, verify, encrypt, and decrypt the JSON payload, there are no such actions available to generate and validate JSON Web Tokens. You have to write GatewayScript to perform these functions.
Here are the sample GatewayScripts that I developed to generate and validate a JWT.
After successful authentication/authorization, configure the following GatewayScript in a GatewayScript action in the request processing rule to issue the token.
createJWT.js
// Import required packages
var jose = require('jose');
var jwt = require('jwt');
var sm = require('service-metadata');
sm.mpgw.skipBackside = true;

session.INPUT.readAsJSON(function (error, json) {
  if (error) {
    session.output.write('Error reading JSON' + error);
  } else {
    // JWT date claims are expressed in seconds, not milliseconds.
    var now = Math.floor(new Date().getTime() / 1000);
    var claims = {
      "iss": "ibm_datapower",
      "aud": "Audience_name", // Replace 'Audience_name' with the actual value.
      "iat": now,
      "exp": now + 10 // Token expires in 10 seconds.
    };
    // Sign the token with the RS256 algorithm. Replace 'Crypto Key Object Name'
    // with the actual Crypto Key object name created on the box.
    var jwsHeader = jose.createJWSHeader('Crypto Key Object Name', 'RS256');
    var encoder = new jwt.Encoder(claims);
    encoder.addOperation('sign', jwsHeader)
      .encode(function (error, token) {
        if (error) {
          session.output.write('Error creating JWT' + error);
        } else {
          session.output.write(token);
        }
      });
  }
});
For validation, pass the JWT in the HTTP header as shown below.
Authorization: Bearer "JWT string"
validateJWT.js
// Import required packages
var jwt = require('jwt');
var hm = require('header-metadata');
var sm = require('service-metadata');
sm.mpgw.skipBackside = true;

// Retrieve the Authorization HTTP header value.
var bearertoken = hm.current.get('Authorization');
// Strip the 'Bearer ' prefix to retrieve the JWT.
var jwttoken = bearertoken.substring(7).toString();

var decoder = new jwt.Decoder(jwttoken);
// Replace 'Crypto Cert Object Name' and 'Audience_Name' with the actual values.
decoder.addOperation('verify', 'Crypto Cert Object Name')
  .addOperation('validate', { 'aud': 'Audience_Name' })
  .decode(function (error, claims) {
    if (error) {
      session.output.write('Error validating JWT' + error);
    } else {
      session.output.write(claims);
    }
  });