Perficient IBM blog


Apply an OOAD Approach to the TOGAF ADM

Describing a rich EA framework and process using a standard object-oriented development approach.


Recently I had an interesting conversation with a solution architect about the TOGAF framework and the ADM. I was asked the following question: “Can you provide a short but reasonable description for tailoring the framework while instantiating the ADM as a practical implementation?” This was certainly an interesting challenge given the scope and depth of the TOGAF ADM.

So I took up the task in the spirit of “dogfooding”: why not apply the tools and techniques described in the ADM and other methods as the approach? I started with an enterprise architecture modeling tool that supports TOGAF modelling elements and UML notation, then applied object-oriented analysis techniques to flesh out a set of contextual models.

My objective: to provide a concise and practical example of the ADM using a model-based approach for developing architectural capabilities. This is the same method used to develop a solution architecture that delivers a business capability.

For all practical purposes, TOGAF may be thought of as a toolbox of artifacts, with the underlying ADM as the process by which those artifacts are prescribed and applied. This is often expressed by the TOGAF “crop circle” diagram, artifacts excluded.

While the crop circles offer a simplified, standard, and popular expression of the ADM, it’s a bit difficult to translate that viewpoint into a description of a practical implementation, especially for someone new to implementing the ADM. The depth and richness of the ADM add another challenge: how to provide simplified contextual viewpoints that correlate to TOGAF artifacts as deliverables.

In offering one solution to these challenges, I began by creating a couple of architectural context models:

  • A work package viewpoint of high-level requirements.
  • A work package viewpoint of the domains used in the ADM, abstracted as static components.

By abstracting the domains into components, I can create architectural definitions for each domain and tailor them for this particular implementation. I also added component interfaces, which provide the means for describing the interaction between the domains in the execution of the ADM.

For each domain component, an architectural definition would include:

  • The role(s) responsible for implementing the domain component’s capability.
  • The interface definition in terms of the required artifact data.
  • The responsibility of the domain component, mapped to the ADM.
  • A mapping of the architectural capability delivered by the domain component.
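To make the design-by-contract idea concrete, here is a purely illustrative sketch of one domain component expressed as a Java interface; the names (Artifact, BusinessArchitectureDomain) are hypothetical and simply mirror the four definition points above.

import java.util.List;

// An artifact deliverable produced or consumed at the component interface.
interface Artifact {
    String name();      // e.g. a baseline architecture definition
    String admPhase();  // the ADM phase the artifact supports
}

// The contract exposed by one ADM domain, abstracted as a component.
interface BusinessArchitectureDomain {
    List<String> responsibleRoles();  // role(s) implementing the capability
    List<Artifact> requiredInputs();  // artifact data required at the interface
    List<Artifact> deliverables();    // responsibilities mapped to the ADM
    String capabilityDelivered();     // the architectural capability delivered
}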

By providing these static viewpoints I now have the ability to scope the work that will be required to develop new architectural capabilities. What makes this a value-added exercise is the move from the descriptive aspects of the framework to a prescriptive application for implementing the artifacts that will support the ADM.

This approach also has some ancillary benefits:

  • Establishing a basic model-driven capability in the architectural practice.
  • Establishing the use of industry-standard modelling notations.
  • Building an enterprise repository of architecture assets.
  • Setting the foundation for building on a concrete implementation of a tailored ADM.

ADM Component Context View

My next, but not final, step is to create an artifact dependency viewpoint. I will use this model to reason about the artifacts as deliverables in terms of component interfaces. The essential attribute of this model is its design-by-contract approach: the intent is to use the role encapsulated by the domain component as the implementation of the interface. This is a bottom-up design which enables an agile approach to tailoring what is “just good enough” for each deliverable relative to the domain’s interface. The implementation can then be determined by the capability of the role and the scope of the work for a sprint. In performing this exercise I realize the architectural component definitions from the previous activity in terms of role(s) and responsibilities, which then delivers several outputs for the ADM:

  • Identify the key or value-added deliverables for each ADM domain.
  • Provide input to a RACI matrix for the underlying ADM process.
  • Identify the skill level that will be needed for the implementation.
  • Tailor the scope of deliverables and the process.

Deliverable Dependency View

Conclusion

By using some relatively straightforward object-oriented analysis techniques, I believe one can reasonably create a concise description of a complex, rich, and mature framework and process such as the TOGAF ADM. And in this proposed solution I am “dogfooding” the same approach used to build toward a solution architecture that delivers a business capability.

It also seems to me that this sort of exercise can yield significant value at several other levels of architectural development. For example, by decomposing the high-level requirements into user stories and adding a process model, a current architectural practice can be augmented or even assessed for improvement. This demonstrates one way to integrate an EA capability with an agile project management methodology.

Cyber Intrusions are Rapidly Reaching a Tipping Point. Have You?

This is the first year I have felt more than a little uncomfortable shopping online and even handing over my credit card while in my favorite stores. I am one of the millions that have been affected by one of the many cyber-attacks that have occurred this year. It is very clear that cyber criminals are more organized and better equipped than ever before—and they continue to evolve their strategies in order to undermine even the strongest protections. You cannot turn on the TV and not hear about another breach of security somewhere in the world. Here are some startling statistics:

  • 12 cyber-crime victims per second
  • 1,400 attacks, on average, on a single organization over the course of a week
  • The average cyber threat goes undetected for 8 months
  • The average cyber-attack in the US costs an organization $11 million USD
  • A security breach can cost an organization millions of dollars, not to mention the effect on that organization’s reputation
  • 71% of customers will switch banks due to fraud
  • 46% of customers are leaving/avoiding companies that have had a security breach

The level of angst circulating in business and government circles caused by huge financial losses from cyber intrusions suggests we are rapidly reaching a tipping point. Have you reached yours?  Do you feel like you have a complete picture of the cyber threats to your organization?   How effective have you been in determining your infrastructure weaknesses? Read the rest of this post »

Posted in News

Topic Publish–Subscribe Using IBM Integration Designer 8.5

Introduction:

Business Process Execution Language (BPEL) is an XML-based language used to define enterprise business processes within Web services. BPEL extends the Web services interaction model and enables it to support business transactions.

A BPEL process can perform multiple activities, such as invoking a web service, publishing a message to a topic, subscribing to messages from a topic, posting a message to a queue, and consuming messages from a queue. Below are the steps to publish messages to a topic and consume messages from a topic.
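Before walking through the configuration, it may help to see what the topic interaction amounts to in plain JMS. The following is a minimal sketch only; the JNDI names (jms/TopicCF, jms/MyTopic) are illustrative, and in Integration Designer this wiring is produced by configuration rather than code.

import javax.jms.*;
import javax.naming.InitialContext;

public class TopicPubSubSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative JNDI names; the real destinations live on the
        // service integration bus configured in the steps below.
        InitialContext ctx = new InitialContext();
        ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/TopicCF");
        Topic topic = (Topic) ctx.lookup("jms/MyTopic");

        Connection conn = cf.createConnection();
        Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);

        // Subscribe first: a non-durable subscriber only receives messages
        // published while it is active.
        MessageConsumer subscriber = session.createConsumer(topic);
        conn.start();

        // Publish a message to the topic.
        MessageProducer publisher = session.createProducer(topic);
        publisher.send(session.createTextMessage("<order id=\"42\"/>"));

        // Consume the message from the topic.
        TextMessage msg = (TextMessage) subscriber.receive(5000);
        System.out.println(msg == null ? "no message" : msg.getText());
        conn.close();
    }
}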

WebSphere Application Server Configurations:

Create Topic Space

  • Click on “Buses” under the “Service Integration” section in the left panel, then click on “BPM.ProcessServer.Bus”.

[Image: 01_IIDPubSub]

Read the rest of this post »

5 Tips for Adopting BPM Methodology

BPM Methodology and Principles:

The BPM Methodology is an iterative framework used to effectively analyze and re-design a business process with the goal of constant process improvement. The methodology’s key objective is to foster communication between business and IT in order to establish an optimal business process. Here are 5 tips to ensure that your business partners properly adopt the BPM methodology to improve their business operations.

1. Sell! Sell! Sell!

Every opportunity you get with your business partners will be an opportunity to sell the benefits of adopting the BPM methodology. A client new to the principles of the BPM methodology might find the ideas foreign and might possibly show some initial resistance. It is important to educate and demonstrate how accepting these principles will positively impact not only business/technology operations, but also the corporate culture. Embracing a philosophy of change enables your business partners to avoid common pitfalls that lead to failed BPM projects and ultimately poor BPM adoption.

Each checkpoint during the project lifecycle should address specific business problems. You should demonstrate how the methodology’s effectiveness enabled a resolution. It is essential to illustrate to your business partners how the BPM methodology has and will continue to strengthen corporate initiatives such as product quality, customer satisfaction, and communication between business and IT.

2. Find a Champion.

The subject of change is always a tricky one. As mentioned earlier, you might receive some resistance to adopting this methodology. You will find it easier to get buy-in from your business partners if someone within the corporate structure supports you. It is essential to have a counterpart who can also promote the benefits of the methodology. Your champion does not necessarily have to be someone at the top of the food chain in the corporate structure, but it should be someone who has some influence with the other project participants.

3. Avoid Old Habits.

You will likely run into issues and roadblocks during your implementation process. When these situations arise, it is critical to continually use the principles of the BPM methodology to reach a resolution. Your business partners might find it prudent to use other techniques used in past projects to overcome these roadblocks. Stay the course!

4. Use the Right Tools.

To ensure the successful implementation of your BPM project, it is vital to use the right tools to support it. From a project management standpoint, the iterative approach of the BPM methodology will not be compatible with the tools used for a waterfall project. Using tools that can handle an iterative project cycle will help clients better understand deliverables and will put their expectations in perspective.

5. Be Patient.

During the process, you will undoubtedly be challenged. Whether it’s resistance from business partners, impending deadlines, or scope of your deliverables, you must remain patient and focused on your implementation methodology.

Building an ESB Capability

Building an ESB Capability with Java EE vs. Configuring a DataPower SOA Appliance

Implementing a Java network infrastructure solution versus network appliance configuration

It’s not unusual for a seasoned Java implementer, when exposed to an IBM DataPower appliance for the first time, to question the technological advantage of a configurable network device. I feel this question is best examined from an application architecture perspective.

Fundamentally, every implementation is the realization of a prescribed software architecture pattern and approach. From this viewpoint I’ll use a lightweight architectural tradeoff analysis technique to analyze the suitability of a particular implementation across two technology stacks: the Java Spring framework combined with the Spring Integration extensions, and the IBM DataPower SOA appliance.

In this tradeoff analysis I will show the advantage of rapidly building and extending a bus capability using a configurable platform technology, versus Spring application framework components and the inversion of control container.

High-Level Requirements

The generic use case scenario: receive an XML message over HTTP, transform the XML input message into SOAP/XML format, and deliver the payload to a client over an MQ channel.

Proposed Solution Architecture

Solution 1

Using an EIP pattern to provide a conceptual architecture and context, let’s consider the following ESB-type capability. This solution calls for a message gateway, a message format translator, and a channel adapter.

Assumptions

  1. The initial release will not address the supplemental requirements, such as logging, persistent message delivery and error back-out.
  2. This next release will be extended to include a data access feature, as well as the supplemental requirements.
  3. Message end-points, message formats, queue configurations, database access and stored procedure definitions have all been fully documented for this development life-cycle sprint.

Architectural Definition

  • To receive messages over HTTP you need to use an HTTP Inbound Channel Adapter or Gateway.
  • The Channel Adapter component is an endpoint that connects a Message Channel to some other system or transport.
    • Channel Adapters may be either inbound or outbound.
  • The Message Transformer is responsible for converting a message’s content or structure and returning or forwarding the modified message.
  • IBM MQ 7.x has been supplied as part of the messaging infrastructure capability.
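To make these components concrete before comparing the stacks, here is a plain-Java sketch of the transform-and-deliver portion of the flow, using JAXP for the XSLT step and JMS for the MQ delivery. The stylesheet name (to-soap.xslt) and destinations are illustrative; Spring Integration would wire equivalent components in XML, while DataPower configures them in the WebGUI.

import java.io.StringReader;
import java.io.StringWriter;
import javax.jms.*;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class EsbFlowSketch {

    // In the full flow, an HTTP inbound gateway (e.g. a servlet) would
    // receive the XML request body and invoke this method.
    public static void handle(String inboundXml, ConnectionFactory mqCf, Destination outQueue)
            throws Exception {
        // 1. Message transformer: an XSLT template converts the inbound
        //    XML into the SOAP/XML target format.
        Transformer xslt = TransformerFactory.newInstance()
                .newTransformer(new StreamSource("to-soap.xslt")); // illustrative stylesheet
        StringWriter soap = new StringWriter();
        xslt.transform(new StreamSource(new StringReader(inboundXml)), new StreamResult(soap));

        // 2. Channel adapter: put the transformed payload on the MQ queue.
        Connection conn = mqCf.createConnection();
        try {
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            session.createProducer(outQueue).send(session.createTextMessage(soap.toString()));
        } finally {
            conn.close();
        }
    }
}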

Technology Stack Requirements

Spring / Java SE Technical Reference – Standards Information Base

  • Spring 4.0.x
  • Java SE 6 or 7
  • Spring Extension: Spring Integration Framework 4.1.x
  • Spring Extension: XML support for Spring Integration
  • Apache Tomcat 7.x.x
  • Spring run-time execution environment (IoC container)
  • Eclipse for Spring IDE Indigo 3.7 / Maven

DataPower XI/XG Appliance Technical Reference – Standards Information Base

  • Configurable multi-protocol gateway (XG45 7198 or XI52 7199)
  • XSLT editor: XMLSpy (optional)
  • Eclipse for Spring IDE Indigo 3.7 (optional)

Architecture Tradeoff – Analysis Criteria  

For the application architectural analysis I will use the following architecture “ilities”:

  • Development velocity
    • In terms of code base, development tasks, and unit testing.

Development Velocity Analysis – Design and Implementation Estimates

Assumptions

  1. Development environments and unit test cases/tools have been factored into the estimates.
  2. Run-time environments must be fully provisioned.
  3. Estimates are based on a 6.5-hour work day.
  4. 2 development resources for the implementation (1 development lead and 1 developer).

Java SE using Spring Framework and Spring Integration Extensions.

Java EE Spring Framework

| Architecture Component | Design Component(s) | Development Task | Effort (hr) |
| --- | --- | --- | --- |
| Message Gateway | Http Inbound Gateway | XML wiring of http inbound adapter | 6.5 |
| | Http Namespace Support | XML wiring of Spring component | 6.5 |
| | Timeout Handling | XML wiring of Spring component | 6.5 |
| Http Server | Apache / Jetty | Build web server instance | 12 |
| Exception Handling | Error Handling | XML wiring of Spring component | 12 |
| Message Transformer | XsltPayloadTransformer | XML wiring of Spring component | 13 |
| | Transformation Templates | Build XML transformation template | 12 |
| | Results Transformer | XML wiring of Spring component | 13 |
| Channel Adapter (Direct Channel) | | XML wiring of outbound gateway | 2.5 |
| | | Build attribute reference file | 12 |
| Estimated hours | | | 96 |
| Estimated duration (days) | | | 15 |

DataPower SOA appliance with standard configuration components.

DataPower Appliance

| Architecture Component | Design Component(s) | Development Task | Effort (hr) |
| --- | --- | --- | --- |
| Message Gateway | Multi-protocol Gateway | Name and configure MPG | 3 |
| | XML Manager | Name and configure XML Manager | |
| Message Transformer | Multi-Step Transform Action | Build XSLT transformation code | 13 |
| Channel Adapter (Direct Channel) | MQ Manager Object | Name and configure MQ Manager | 2 |
| Estimated hours | | | 18 |
| Estimated duration (days) | | | 3 |

Architecture – Architectural Tradeoff Analysis

In terms of development velocity, the DataPower implementation requires roughly 80% less effort (18 hours versus 96 in the estimates above). This is primarily due to DataPower’s Service Component Architecture design and the forms-based WebGUI tool used to configure features and enter the required parameters for the service components.

DataPower Services

The Java development velocity may be improved by adding development resources to the Java implementation; however, this will increase the development cost and the complexity of the overall project. Efforts around the XML transformations are for the most part equal: both the Spring framework and DataPower use XSLT templates to implement this functionality.

Use Case Description for next release

In the next development iteration, our new use case calls for additional data from a legacy business application, along with a supplemental requirement for persistent messaging with MQ backout for undelivered messages on the channel.

Extended Solution Architecture

Solution 2

Development Extension Analysis – Design and Implementation Estimates

Assumptions

  1. Message end-points, message formats, queue configurations, database access, and stored procedures have all been defined and documented for the development life-cycle.

Architectural Definition

  • Must access a stored procedure on a legacy relational database.
  • Must support a message channel to which errors can be sent for processing.

Java SE using Spring Framework and Spring Integration Extensions

Java EE Spring Framework

| Architecture Component | Design Component(s) | Development Task | Effort (hr) |
| --- | --- | --- | --- |
| SQL Data Access | JDBC Message Store | XML wiring of Spring component | 6.5 |
| | Stored Procedure Inbound | XML wiring of Spring component | 8 |
| | Configuration Attributes | XML wiring of Spring component | 3 |
| | Stored Procedure parameters | XML wiring of Spring component | 3 |
| Process SQL | | Validation/processing of SQL dataset | 9 |
| Estimated hours | | | 28.5 |
| Estimated duration (days) | | | 5 |

DataPower SOA appliance with standard configuration components

DataPower Appliance
Architecture Component Design Component(s) Development Task Effort / Hr.
SQL Data Access SQL Resource Manager Configure Db Resource 2
Process SQL XSLT Transformer – Database Build XSLT Transformation Code 10
Estimation 12
Estimation of Duration (Days) 2

Architecture Tradeoff – Analysis Criteria  

For the application architectural analysis I will use the following architecture “ilities”:

  • Extensibility
    • Adding persistent messaging on the channel with back-out functionality.
    • Adding data access and stored procedure execution from legacy database.

Architecture – Architectural Tradeoff Analysis

In terms of development extensibility, the DataPower implementation requires approximately 50% less effort. This is primarily because extending DataPower for these new requirements does not require additional programming for the data access functionality.

Again, for this additional functionality, the processing of the SQL stored procedure dataset will require a programming effort in both implementations; the primary difference for Spring is the addition of three new components, versus the configuration of a database access component on the DataPower appliance.
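To make that programming effort concrete, the dataset processing on the Spring side amounts to something like the following JDBC sketch; the connection URL, procedure name, and parameter are hypothetical:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class LegacyDataSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical DB2 connection and stored procedure.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:db2://legacy-host:50000/APPDB", "user", "password");
             CallableStatement call = conn.prepareCall("{call GET_LEGACY_DATA(?)}")) {
            call.setString(1, "CUST-001");
            try (ResultSet rs = call.executeQuery()) {
                while (rs.next()) {
                    // Validation/processing of the returned SQL dataset.
                    System.out.println(rs.getString(1));
                }
            }
        }
    }
}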

In terms of adding persistent messaging with back-out functionality, DataPower’s built-in queue management service simply requires the implementer to enter the defined queue parameters: a net-zero programming effort.

Conclusion

Undoubtedly the Spring framework, along with Spring Integration and the inversion of control (IoC) container, provides the Java developer with a powerful application framework whose functions are essential in messaging and event-driven architectures.

However, the DataPower appliance offers this functionality out of the box as a purpose-built, non-disruptive network device. In short, DataPower is the concrete implementation of much of what Spring and the Integration framework offer programmatically.

As cross-cutting concerns and non-functional requirements around security and web service integration emerge, the configuration capability of the appliance becomes even more apparent.

ODM Series 1: IBM ODM Best Practices – The Performance

1. The Performance Cost

The performance cost for a Decision Service may look something like:

[Image: PerCost]

2. The eXecutable Object Model and the RuleFlow

XOM type choices:

  • Java XOM: better performance (see the sketch below)
  • XML XOM
    • Dynamicity
    • Useful in the case of an XML model
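For reference, a Java XOM is just a plain object model that the engine executes against directly; a minimal illustrative class (the name and fields are hypothetical) might look like:

// Rules are authored against this object's getters and setters, and the
// engine executes directly on the object graph; no XML binding is needed,
// which is where the Java XOM performance advantage comes from.
public class Borrower {
    private String name;
    private int creditScore;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getCreditScore() { return creditScore; }
    public void setCreditScore(int creditScore) { this.creditScore = creditScore; }
}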

Ruleflow:

  • Limit the size and the complexity of the ruleflow, since it is interpreted.
  • Always use the same engine algorithm to save memory.

3. The Engine Algorithm

Choose the correct engine algorithm depending on your Decision Service.

RetePlus (the default mode)

  • Stateful applications
  • Rule-chaining applications
  • May be useful in the case of many objects

Sequential

  • Applications with many rules and few objects
  • Covers most customer cases
  • Really efficient in a multi-threaded environment

Fastpath

  • Applications with rules implementing a decision structure and many objects
  • May have a longer compilation time but is faster at run time
  • Really efficient in a multi-threaded environment

4. Decision Server Rules Tuning

  • The log level in the Rule Execution Server should be set to Severe or Warning in the production environment to increase performance.
    • This property (TraceLevel) is accessible in the resource adapter of the Rule Execution Server or in ra.xml.
  • Tune the GC and memory size.
    • Starting configuration (64-bit):
    • -Xgcpolicy:gencon -Xmn2048M -Xmx4096M -Xms4096M
  • Tune the RES pool size.
    • A sizing methodology is available at: http://www-01.ibm.com/support/docview.wss?uid=swg21400803

5. Impact of the Execution Trace

[Image: Trace]

6. Impact of the XOM Type

[Image: XOMType]

7. Remote Web Service call vs. Local call

[Image: WSvsLocal]

8. Fastpath Algorithm vs. Sequential Algorithm

[Image: FastpathvsSequential]

Posted in News

ODM Series 1: IBM ODM Best Practices – The ABRD

I. The Agile Business Rule Development Practices (ABRD)

The Agile Business Rule Development (ABRD) methodology provides a framework that project teams may adapt to meet the needs of their specific business rules application project. The methodology supports the full rule lifecycle, from discovery to governance, using an agile, iterative approach. ABRD activities fall into the five categories described below; each of these activities is executed multiple times as the process is followed.

[Image: TimeCost]

1. Harvesting


1. Rule Discovery: Harvest rules, using short workshop sessions

  • Divide the decision process into smaller chunks
  • Determine the inputs, the outputs, and the error cases
  • Use concrete scenarios and pull them through the rules

2. Rule Analysis: Understand and prepare rules for implementation

  • Refine rules to be atomic
  • Look for ambiguity, contradiction, incompleteness or redundancy
  • Reconcile the rules with the object model (term-fact modeling)
  • Identify rule patterns, and rule dependencies
  • Define test scenarios against the object model
  • Assess rule volatility and rule-sharing opportunities

Tools: Documentation          Roles: SME, BA

 

2. Prototyping


1. Rule Authoring Early Stage – Rule Design

  • Define rule set
  • Define the BOM
  • Define the project structure
  • Prototype rules

2. Rule Authoring

  • Develop rules
  • Develop unit tests

Tools: Documentation, Rule Designer          Roles: SME, BA, Developer

 

[Image: ABRD]

3. Building

1. Rule Validation

  • Develop functional tests
  • Involve SME for feedback

Tools: Rule Designer, DVS          Roles: SME, BA, Developer

 


 

4. Integrating

1. Rule Deployment

  • Use Rule Execution Server staging platform

Tools: DVS, Decision Center          Roles: SME, Developer


 

5. Enhancing

Tools: DVS, Decision Center           Roles: SME, Developer


 

 

II. Rules atomicity, patterns and dependencies

1. Rules atomicity

Atomic rules

  • Cannot be simplified without losing meaning
  • A conjunction of conditions resulting in a single action (for example: if the customer is a minor and the product is age-restricted, then reject the order)


2. Rules patterns

Rule pattern analysis helps to:

  • Select the right rule artifact (action rule, decision table, …)
  • Structure rules in packages and articulate the rule flow
  • Create rule templates

[Image: Table]

3. Rule dependency

Rule dependency analysis helps to:

  • Structure rules in packages and articulate the rule flow


WebSphere Portal-Custom Impersonation Portlet Invoked from Themes

This blog provides a different approach to implementing impersonation in portal applications. Impersonation, as we know, is a portlet service which lets one user (A) access the portal application as another user (B) by logging in as that user (B). The out-of-box Impersonation Portlet provided by WebSphere Portal lacks the flexibility and customization features specific to the requirements of the application.

There are 2 steps in implementing our custom impersonation Portlet:

i) Creating a Portlet and implementing impersonation in the action phase:

The following snippet uses Spring Portlet MVC annotations. The Impersonation Service provided by WebSphere Portal has two impersonate methods; we use the one whose parameters are PortletRequest, PortletResponse, and userDN. The first two parameters can be obtained in the action phase, while the userDN is fetched from LDAP by passing the userID (of the user we are going to impersonate) using the PUMA services.

[Image: Code1]
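A minimal sketch of the action-phase handler described above follows. The JNDI name and the PUMA lookup helper are assumptions for illustration, and the impersonate signature follows the description above; consult the Portal documentation for the exact service API.

import javax.naming.InitialContext;
import javax.portlet.ActionRequest;
import javax.portlet.ActionResponse;
import org.springframework.web.portlet.bind.annotation.ActionMapping;
import com.ibm.portal.portlet.service.PortletServiceHome;
import com.ibm.portal.portlet.service.impersonation.ImpersonationService;

public class ImpersonationController {

    @ActionMapping(params = "action=impersonate")
    public void doImpersonate(ActionRequest request, ActionResponse response) throws Exception {
        // Login ID of user B, the user to impersonate.
        String userId = request.getParameter("userId");

        // Resolve user B's DN from LDAP via the PUMA services, as described
        // above (stubbed here; the PUMA lookup details are omitted).
        String userDN = lookupUserDn(userId);

        // Look up the Impersonation portlet service (JNDI name assumed) and
        // call the impersonate(PortletRequest, PortletResponse, userDN) variant.
        PortletServiceHome home = (PortletServiceHome) new InitialContext()
                .lookup("portletservice/com.ibm.portal.portlet.service.impersonation.ImpersonationService");
        ImpersonationService impersonation =
                (ImpersonationService) home.getPortletService(ImpersonationService.class);
        impersonation.impersonate(request, response, userDN);
    }

    private String lookupUserDn(String userId) {
        // Hypothetical stub: in the real portlet this uses the PUMA SPI
        // (PumaHome/PumaLocator) to find the user and return the DN.
        throw new UnsupportedOperationException("PUMA lookup omitted");
    }
}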

Read the rest of this post »

WebSphere Commerce: Data Load for Custom Table

The Data Load utility is the new, enhanced, business-object-based loading utility that provides an efficient solution for loading data into your WebSphere Commerce database. Today I wanted to share a brief tutorial on the two ways you can load data, using the Business Object Mediator and the Table Object Mediator, and how Catalog, Inventory, Price, Catalog Filter, Member, Location, and Commerce Composer related data can be loaded with out-of-the-box mappings using the Business Object Mediator.

Example 1: Let’s say you have a table which contains a catalog entry id and some sort of code for each catentryid:

Database Table:

CREATE TABLE XCATENTRYCODE(
CATENTRY_ID BIGINT NOT NULL,
CODE VARCHAR(20),
OPTCOUNTER SMALLINT,
CONSTRAINT XCATIDCD1_PK PRIMARY KEY(CATENTRY_ID),
CONSTRAINT XCATIDCD1_FK FOREIGN KEY (CATENTRY_ID) REFERENCES CATENTRY(CATENTRY_ID));


If you notice the table, it contains two business columns: CATENTRY_ID, which is the primary key and a foreign key to CATENTRY.CATENTRY_ID, and CODE (OPTCOUNTER is a standard WebSphere Commerce column). The source system sends a CSV file in the following format:

DailyCatentryCode.csv

PartNumber,Code

Now we have a data file and the table this data goes into. This is how we map the data: XCATENTRYCODE.CODE is mapped to the CSV file’s “Code” column. Before going through the data load process, we need to understand that we only have a part number, and from it we must get the CATENTRY_ID from the CATENTRY table. That lookup is possible because we have a unique index: CATENTRY has exactly one, PARTNUMBER + MEMBER_ID, so the loader can resolve the ID with the equivalent of SELECT CATENTRY_ID FROM CATENTRY WHERE PARTNUMBER = ? AND MEMBER_ID = ?.
So far we have the table XCATENTRYCODE (destination), the CSV file (source), and the mapping between source and destination:

wc-loader-code.xml

<_config:DataloadBusinessObjectConfiguration
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.ibm.com/xmlns/prod/commerce/foundation/config ../../../../xml/config/xsd/wc-dataload-businessobject.xsd"
    xmlns:_config="http://www.ibm.com/xmlns/prod/commerce/foundation/config">

Now let’s go through the following mapping:

We are going to load the CATENTRY_ID column with a CATENTRY_ID retrieved via IDResolve. The IDResolve is declared to use the CATENTRY table, which means the CATENTRY_ID will be retrieved from the CATENTRY table rather than generated as a new key. The next two columns (UniqueIndexColumn) are the unique index columns: PARTNUMBER is mapped to the CSV PartNumber, and MEMBER_ID is retrieved from the business context (in the business context, storeOwnerId is nothing but MEMBER_ID). With that, the mappings are complete.

wc-dataload-code.xml

<_config:DataLoadConfiguration
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.ibm.com/xmlns/prod/commerce/foundation/config ../../../../xml/config/xsd/wc-dataload.xsd"
    xmlns:_config="http://www.ibm.com/xmlns/prod/commerce/foundation/config">

Finally, ensure the CSV file is in the right location, and we are ready to load the data. Just one of the many examples of this great utility! There are some other great tutorials and videos on this topic that I would recommend you check out, such as this introduction to the Data Load utility: http://youtu.be/jCVOwqH0Rhw?list=PLhNYtwk4oIbdCR_-Gj_trKzApmuom5wfK

 

 

 


Posted in Commerce, News

Upcoming Webinar: Compensation Management for Financial Services

Financial services and banking organizations are challenged with aligning sales performance with corporate goals to drive business growth. In addition to driving financial performance, one of the largest challenges financial institutions face today is balancing regulatory requirements against the risk of disrupting performance.

The following includes a high-level look at some regulatory guidelines on compensation for financial services organizations:

[Image: ICM FS Blog Image 1]

To help your organization manage performance and risk data against regulatory reporting requirements, you need to develop an enterprise-wide governance structure to gain control over sales channel compensation programs. Join us for an upcoming webinar on October 21, 2014, Increase Financial Firms’ Sales Performance & Compliance with Compensation Management, where our experts will cover:

  • Challenges around sales performance, Dodd-Frank and compensation governance in financial services
  • Industry-focused use cases and best practices for sales performance management solutions
  • Case studies of leading financial institutions implementing sales performance and compensation management

You’ll also learn how IBM Cognos Incentive Compensation Management enables organizations to achieve operational efficiency and reporting accuracy, greater data transparency, reduced risk and detailed sales performance analytics.

[Image: ICM FS Blog Image 2]

To register for the webinar, click here.
Increase Financial Firms’ Sales Performance & Compliance with Compensation Management
Tuesday, October 21, 2014
1:00pm CT