Perficient IBM blog


Trends in Banking and Financial Services

Advanced analytics has a home in banking and financial services, but it transcends the traditional use of Excel and OLAP to run trend-lines on revenue and cost as they affect the classic Profit and Loss statement.  While macro trends are important to the financial services professional, two more important challenges involve:

  1. understanding the customer at the Point of Sale
  2. signing up profitable customers to avoid further attrition and/or mitigate losses.

Furthermore, because new products attract a customer base of unknown quality, many banks need the above type of information right at the product roll-out phase, when the least information is available as the prospect turns into a new customer. With SPSS analytics, an analyst can develop a holistic view of customers who purchase two types of products:

  1. Non credit based products like checking accounts and savings accounts
  2. Credit based products like credit cards and mortgages.

Key point: non-credit-based products like checking accounts behave more like “commodities,” so the type of analytics most appropriate for this product set resembles retail analytics (i.e., trend-lines, forecasting and customer profiling). For credit-based products like credit cards, the analytics is more rigorous and takes the shape of a two-step data mining process:

  1. A ‘quasi-underwriting’ step in which you pre-qualify prospects who might be at future risk of default based on key criteria like home ownership, bankruptcy history, etc.
  2. A direct response model to make sure the ‘prospect universe’ left over from step 1 has a decent tendency to respond to marketing offers.
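As a rough sketch, the two-step process above might look like the following. The rule criteria, field names and thresholds here are invented for illustration, not an actual underwriting or response model:

```python
# Hypothetical two-step screening: step 1 applies 'quasi-underwriting'
# rules to remove prospects at elevated default risk; step 2 keeps only
# prospects whose (toy) direct-response score clears a threshold.
# All fields and cut-offs below are illustrative assumptions.

def prequalify(prospect):
    """Step 1: rule out prospects at elevated default risk."""
    if prospect["bankruptcy_history"]:
        return False
    if not prospect["home_owner"] and prospect["debt_to_income"] > 0.45:
        return False
    return True

def response_score(prospect):
    """Step 2: a toy direct-response propensity score in [0, 1]."""
    score = 0.2
    if prospect["responded_before"]:
        score += 0.4
    if prospect["tenure_years"] > 3:
        score += 0.2
    return score

def screen(prospects, threshold=0.5):
    """Run both steps and return the marketable prospect universe."""
    universe = [p for p in prospects if prequalify(p)]                  # step 1
    return [p for p in universe if response_score(p) >= threshold]      # step 2
```

In a real engagement both steps would be fitted models (e.g., scorecards), not hand-written rules, but the pipeline shape is the same: filter for risk first, then rank what remains by response propensity.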

In fact, most credit card and mortgage products suffer from an adverse-selection law of attraction (i.e., the customer an institution least wants to attract is most likely to respond to an offer of credit, due to a lack of solicitations from the competition). To combat this, analytics is necessary to micro-target good credit risks. Let’s say we have an existing portfolio of consumers with high usage of checking accounts from a recency, frequency and volume-of-usage perspective. Picture the customer on a 3-dimensional RFM grid – see diagram:




So basically, if we assumed that all customers have the same default rate, we would market credit cards to these customers without doing additional advanced analytics to predict default rates. But knowing this is not the case, we have to overlay RFM with an additional step to find out where the high-default-risk customers lurk within the customer base (a subset of the RFM grid shown above).
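Assuming hypothetical cut points and field names, the RFM grid and the default-risk overlay described above can be sketched like this:

```python
# Illustrative RFM scoring with a default-risk overlay. Each customer is
# binned 1-3 on recency, frequency and monetary volume (the 3D grid),
# then high-RFM customers are filtered by a hypothetical risk rule.
# Cut points and field names are assumptions, not real portfolio values.

def score(value, cuts):
    """Map a value to a 1-3 score using two cut points (low, high)."""
    lo, hi = cuts
    return 1 if value < lo else (2 if value < hi else 3)

def rfm_segment(customers):
    """Place each customer in a cell of the 3x3x3 RFM grid."""
    segmented = []
    for c in customers:
        cell = (
            score(-c["days_since_use"], (-60, -15)),   # recency: fewer days = better
            score(c["txns_per_month"], (5, 20)),       # frequency
            score(c["monthly_volume"], (500, 2000)),   # monetary volume
        )
        segmented.append({**c, "rfm": cell})
    return segmented

def target_list(customers, max_default_prob=0.05):
    """High-RFM customers, minus those flagged by the default-risk overlay."""
    high = [c for c in rfm_segment(customers) if min(c["rfm"]) >= 2]
    return [c for c in high if c["default_prob"] <= max_default_prob]
```

The `default_prob` field stands in for the output of a separate default-prediction model; the point is that RFM alone selects the marketing universe, and the overlay removes the adverse-selection risk within it.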

Our advanced analytics practice can provide more insight into handling these types of situations in the banking and financial services domain space. For more information, contact:

Tony Firmani

Director Advanced Analytics

Predicting Maintenance with Unstructured Data

The field of process control has changed radically over the last 20 years. Back in the day, it was not uncommon to view process control as simply the exercise of reading a time trend line to see if defect rates were rising or falling based on outlier activity around the classic Shewhart control chart. While this level of analytics is a good starting point, it by no means completes the picture.

SPSS Text Mining with unstructured data helps complete that picture. If an operator of a machine or a production line performs routine maintenance on that entity, he or she will routinely collect information and comments in the form of inspection logs. In turn, those inspection logs might have a common word theme that precedes an imminent part failure or machine outage. Furthermore, groups of these key words can be used to characterize and prioritize defects by type (i.e., the old Pareto-analysis idea that 80% of your defects are clustered within 20% of your defect types).
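SPSS Text Mining is far more sophisticated than this (concept extraction, stemming, clustering), but the core idea of mining inspection-log comments for recurring defect keywords and ranking them Pareto-style can be sketched as follows; the log entries and keyword list are invented for illustration:

```python
# A minimal sketch of counting defect keywords in free-text inspection
# logs and applying the 80/20 Pareto cut to prioritize defect types.
from collections import Counter
import re

# Hypothetical vocabulary of defect terms an analyst might curate.
DEFECT_TERMS = {"leak", "vibration", "corrosion", "noise", "overheating"}

def keyword_counts(logs):
    """Count occurrences of known defect terms across log entries."""
    counts = Counter()
    for entry in logs:
        for word in re.findall(r"[a-z]+", entry.lower()):
            if word in DEFECT_TERMS:
                counts[word] += 1
    return counts

def pareto(counts, coverage=0.8):
    """Smallest set of top keywords covering `coverage` of all mentions."""
    total = sum(counts.values())
    kept, running = [], 0
    for word, n in counts.most_common():
        kept.append(word)
        running += n
        if running / total >= coverage:
            break
    return kept
```

A rising count for a particular term across recent logs is the signal that would feed the predictive-maintenance alert described above.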

The enclosed Flowchart gives some indication of how this text mining might work for an air conditioner manufacturing firm with noticeable defects that are logged at the inspection phase:

[Flowchart: predicting maintenance]

This is an improvement on the old Quality Control paradigm of merely tracking trendlines, and it gets us closer to the concept of root-cause analytics for further corrective action.

For more information on this concept, contact:

Tony Firmani
Director, Advanced Analytics


Predictive Customer Intelligence

Chief Marketing executives who are tasked with growing a book of business are faced with many challenges:

  1. Acquisition – What offers will increase direct response rates for prospects?
  2. Retention – What factors indicate whether a current customer is at risk of attrition?
  3. Activation – Once a new customer is sourced, how can we get them to use more products and services?
  4. Cross-Sell – Once a product is purchased, what is the next best offer to serve to a customer?
  5. Up-Sell – What steps must be taken to put the customer on an upgrade path to a better product set?

PCI (Predictive Customer Intelligence) is uniquely tailored to B2B and B2C retailers interested in rounding out their view of their customer base to establish a set of demographics and offer characteristics designed to attract and retain new customers. The beauty of this solution is that virtually any retailer can immediately benefit from these capabilities, since the solution has very broad applicability.

Furthermore, Perficient brings to the table its own unique demographic-append methodology to round out sparse customer records with zip-level cluster data, giving more of a 360-degree view of the customer for development of a complete profile.

To learn more about this solution, contact me:

Tony Firmani

Director, Advanced Analytics



Posted in News

SPSS for Forecasting

Many of our clients are not aware that SPSS not only provides predictive modeling capabilities but can also be used to build a highly accurate time-trend forecast.
These time-trend forecasts are useful as an augmentation to TM1 cube trendlines for revenue and expenses. The key is that SPSS uses leading indicators and higher-level math to fit the peaks and valleys of the data (aka volatility). Furthermore, SPSS can also be used to quantify forecast risk (i.e., scenarios under which the forecast under-predicts or over-predicts the actual).
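SPSS's forecasting internals (its Expert Modeler) are proprietary, but the leading-indicator idea can be illustrated with a simple regression of revenue on a trend term plus a lagged indicator. Everything below is a toy sketch with synthetic data, not an SPSS equivalent:

```python
# Sketch of a trend-plus-leading-indicator forecast:
# y[t] ~ a + b*t + c*x[t-lag], fitted by ordinary least squares.
import numpy as np

def fit_with_leading_indicator(y, x, lag=1):
    """Least-squares fit of y[t] on a trend term and a lagged indicator."""
    t = np.arange(lag, len(y), dtype=float)
    X = np.column_stack([np.ones_like(t), t, x[: len(y) - lag]])
    coef, *_ = np.linalg.lstsq(X, y[lag:], rcond=None)
    return coef  # [intercept, trend slope, leading-indicator effect]

def forecast_next(y, x, coef, lag=1):
    """One-step-ahead forecast using the last observed indicator value."""
    t_next = len(y)
    return coef[0] + coef[1] * t_next + coef[2] * x[t_next - lag]
```

Because the indicator leads the target by `lag` periods, its latest observed value is available at forecast time, which is exactly what makes leading indicators useful for catching peaks and valleys a pure trendline would miss.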

This kind of capability works well in those industries that are constantly tracking revenue and expense as a function of time, especially retail and manufacturing.

For more information on this topic, please contact:
Tony Firmani
Director, Advanced Analytics


Near Real Time Integration Pattern for Salesforce and SAP


Systems Used:

DataPower – XML Gateway Appliance to Authenticate and Validate the requests.
MQ – Queuing Mechanism used for Guaranteed Delivery.
Cast Iron – To develop integrations using the in-built connectors available for Salesforce and SAP.

1.   Salesforce places transaction on Outbound Message Queue.  This transaction contains only the following:

  • The ID of the record
  • The Integration Operation (e.g. “Insert”)
  • The Integration Status (always “Submit”)

       The Outbound Message queue performs retries at a specified time interval until the message is delivered.

2.  Salesforce Outbound Message Queue invokes service on the External DataPower XML gateway.

  • External DataPower gateway verifies the Salesforce generic SSL certificate and passes the service request to the Internal DataPower XML gateway.

3.  Internal DataPower XML gateway places the transaction on the MQ processing queue for that transaction. The processing queue is identified by a generic web service proxy, which uses the unique combination of Object name and Integration Operation field values from the incoming SFDC Outbound Message request.

  • Each transaction type has its own Processing and Error queues in MQ.
  • All queues are First In – First Out (FIFO).

4.  Internal DataPower acknowledges the request to Salesforce. Salesforce clears the request from the Outbound Message Queue.

5.  Cast Iron polls messages from MQ processing queue.

6.  Cast Iron initiates a new synchronous call back to Salesforce with the Transaction Record ID specified in the Outbound Message request. This service call obtains the transaction payload from Salesforce to be provided to the back-end system. The call back can be either SOAP for standard Salesforce services or REST for custom Salesforce services.

7.  Cast Iron initiates a new synchronous transaction to the DataPower Internal Web Service Proxy, which authenticates and validates the transaction request and routes it to the corresponding SAP service.

8.  SAP executes the transaction and returns status and any return data.

9.  Cast Iron initiates a new synchronous transaction to Salesforce to provide the updated status message, any return data and any application error message.

10.  Salesforce posts any data from the return message.

  •   Salesforce provides the return acknowledgement to Cast Iron.
  •   Cast Iron closes the session with Salesforce and closes the transaction.

When an error happens at any time during the transaction, the Salesforce Outbound Message is placed onto the corresponding error queue. Once the error is fixed, the message is transferred from the error queue back to the processing queue, where Cast Iron picks it up again and completes the transaction.
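The per-transaction-type queue routing (step 3) and the error-replay behavior just described can be sketched in-memory as follows. Queue naming and message fields are assumptions for illustration, not the actual DataPower/MQ configuration:

```python
# In-memory sketch: each (object, operation) pair gets its own FIFO
# processing and error queue; messages that fail are parked on the
# error queue and replayed onto the processing queue once fixed.
from collections import deque

class Router:
    def __init__(self):
        self.queues = {}  # queue name -> deque (FIFO)

    def _q(self, name):
        return self.queues.setdefault(name, deque())

    def route(self, msg):
        """Step 3: place the outbound message on its processing queue."""
        name = f"{msg['object']}.{msg['operation']}.PROCESSING"
        self._q(name).append(msg)
        return name

    def fail(self, msg):
        """On error, move the message to the matching error queue."""
        name = f"{msg['object']}.{msg['operation']}.ERROR"
        self._q(name).append(msg)
        return name

    def replay(self, object_name, operation):
        """After the fix, transfer error messages back for reprocessing."""
        err = self._q(f"{object_name}.{operation}.ERROR")
        proc = self._q(f"{object_name}.{operation}.PROCESSING")
        while err:
            proc.append(err.popleft())
```

The deques preserve FIFO order, mirroring the guarantee the MQ queues provide in the real pattern.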

Supporting the TOGAF-ADM using a SharePoint EA Portal

After years of architectural engagements at a variety of companies, there are a few things I have found fairly consistent in the practice of architecture. First, architecture happens, regardless of the maturity level of the organization. Second, at some point in the development life-cycle someone will produce some kind of box-and-line diagram to articulate the solution. Finally, there will be some sort of Word document describing the architecture, generally from a conceptual viewpoint and to some degree of completeness.

How architecture happens is by and large a function of the ceremony, rigor, capability, discipline, and standards of the organization. With that in mind, I’m confident in saying there are some commonalities across organizations in what gets done, which typically includes the use of Visio, PowerPoint, Word, and a SharePoint site as a platform for collaboration.

In this article I’ll share a few thoughts on the challenges, pitfalls and potential of these business applications in the context of an informal or formal architectural practice, in which various activities may be performed by an Enterprise Architecture group, a Center of Excellence, or even a Solution Architecture team. Among the various opinions of what it means to do architecture, I believe it’s reasonable to say that collaboration and communication of the solution to the concerned stakeholders is an essential capability, regardless of the maturity and structure of the organization doing the work. Next would be the multifaceted modes and mechanisms used for communication, which require more than superficial consideration; they require a coherent tool strategy. As one of my mentors once said: only in Disneyland will tools come alive and do your work.

To frame this article I’ve provided the following box-and-line conceptual diagram, which is the contextual basis for forming the collaboration and communication machinery.

[Figure 1: SharePoint system context diagram]


This diagram depicts a relatively common system in many organizations. In general we would see:

  • Modeling Work Products – The Vision diagram often used to communicate an architecture on some level.
  • SharePoint – Which hosts several portal sites.
  • Document Library – A commonly used Web Part to contain work products and artifacts, such as Word, Visio and PowerPoint documents.
  • Enterprise Wiki Pages – Which provides the capability to subdivide the portal site by subject areas.

What may not be intuitively obvious in this viewpoint are the following:

  • Visio Data Linking with Excel or a Database – Which enables Visio modeling elements to be linked to an external data source.
  • Hosted Kanban / SCRUM Project Management tool – Which is represented in SharePoint as an http link in a Web Part.

While this is a fundamentally sound system, typical of many SharePoint-centric collaboration environments, there are a few challenges and pitfalls.

First, it is not uncommon to find that while these applications are used together, they are principally used independently and are often disconnected from an architecture development process, and more often from the SDLC. Visio is used to illustrate a range of architectural viewpoints at various points in time throughout the development life-cycle, as well as to create content for business and technical audiences. With a fragmented tool integration approach, work products may be only marginally aligned to the project methodology and artifacts.

In this scenario SharePoint often becomes a simple repository for the resulting models and artifacts, as well as a passive means of communication: by and large PowerPoint presentations, Word documents, or simply the Visio diagrams themselves. The challenges in this widely used approach to collaboration are multifaceted. The primary pitfall, however, is that the collaboration environment is disjointed and falls short of supporting a cohesive and consistent means of establishing a good architectural practice throughout the development life cycle. The underlying contributors to this situation:

1. Poorly structured repository makes it difficult to intuitively locate documents.
2. SharePoint becomes more of a container rather than a true content management system.
3. Documentation often becomes stale shortly into the development life cycle.
4. SharePoint configuration management is ignored, not necessarily intentionally, and this results in the “what version is correct” conundrum.
5. The organizational maturity level is diminished by such a cumbersome document management solution, and efficient and effective communications are hampered.

Additionally, over the project life cycle the collaboration approach may be slowly abandoned as it becomes a less-than-real-time mode of sharing information, eventually growing into something more like an unorganized hallway closet. Stuff goes in, and while over time it may not become lost, it becomes useful mostly for supporting ad hoc queries; getting architectural direction is more like an archeological dig than a proactive means of doing architecture. Architecture is about collaborating on and communicating the architectural design throughout the development life-cycle, from conceptual views to concrete guidance and prescription of the implementation. A good analogy is that developing the architecture is like assembling a skeleton: once the bones are structured and the connections are defined, the architecture is fleshed out as implementation adds the sinew and tissue. Proclaiming that you know where the skeleton is buried is not necessarily a good practice.

This is where a comprehensive tools strategy for an integrated EA portal can address the aforementioned challenges, using the system context in Figure 1 as the foundation for the tools architecture. Members of the architecture team would implement the following foundational Use Case scenarios (incidentally, the team is thereby fleshing out its own skeleton).

  • Build out the SharePoint site as a set of Wiki pages that follows an architectural development methodology, and tie it to the SDLC.
    • In this case the TOGAF ADM will be used.
  • Structure the Wiki pages with a combination of sub-pages, using SharePoint web parts such as Custom Lists and Document Libraries.
  • Build object models to flesh out the Custom List as the underlying information model for the Enterprise Architecture catalogs.
    • Again work products and catalogs from TOGAF are the basis for designing the information models.
  • Use SharePoint enterprise services for version management.
  • Embed project management into the architecture development method.
  • Use SharePoint enterprise services to maintain the content of work products and artifacts.
  • Use the Visio external data link capability to tie the models to the underlying information models.

By implementing this design an architectural practice can expect not only to address the challenges and pitfalls of a fragmented collaboration environment; but also realize the following benefits:

  • A means to integrate the business of doing architecture into an agile SDLC.
  • Establishing a standardized approach to specifying architectural solutions.
  • A means of developing, maintaining and reusing architectural assets.
  • Developing a standards information base.
  • Providing architectural transparency throughout the enterprise, with a self-serve approach to communicating architectural decisions.

These are just some of the benefits of building out an organized, SharePoint-centric collaboration environment that is informed by a comprehensive tools strategy. In subsequent articles I will provide concrete examples of a fleshed-out SharePoint portal that implements the above-mentioned Use Case scenarios.

Perficient at IBM Vision 2015

Perficient is proud to be a sponsor of IBM Vision 2015, held May 17-20 in Orlando, FL. Vision is the premier business analytics and performance management conference for finance, risk management and sales compensation professionals.

The conference is only two months away, so if you haven’t already registered, now is the time. Register before May 6th using the promotional code “SP15PER” and you’ll receive $100 off the rate.


We’re excited to announce two outstanding breakout sessions at the conference, where our customers will present on recent financial and performance management implementations.


How does Currency Volatility Affect Your Business?

In a blog post earlier this year, I discussed the precipitous drop in energy prices. From roughly mid-2014 through early 2015, the price of crude oil was cut in half, and the related dominoes started to fall. Companies were affected directly or indirectly by the new price level of oil. Public and private entities across the globe had to adjust plans and forecasts quickly.

Like crude oil, other markets have seen a significant amount of change in the last 6-9 months. Currency conversion is one area in particular where we have spent a lot of time working with our customers.


The chart above shows 6 months of the Euro-US Dollar exchange rate. 6 months ago, 1 Euro was worth nearly 1.30 US Dollars; 6 months later, that same Euro was worth about 1.08 US Dollars. This represents a roughly 15% strengthening of the US Dollar. Many forecast the trend to continue due to Euro-area quantitative easing.

A 15% move in the Euro-US Dollar exchange rate, in 6 months, is a meaningful change that must be accounted for, both in practice and in analytics.

Regardless of what strategies change as a result of foreign currency volatility, companies need to plan and perform analytics excluding foreign currency effects. If you reviewed IBM’s Q4 2014 earnings release, they described revenue, as they always do, in constant currency. For other examples of the impact of exchange-rate volatility, see the Q4 earnings reports of large multi-national consumer products companies and how currency impacted them.
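The constant-currency idea referenced above amounts to restating current-period foreign revenue at the prior period's exchange rate, separating operating growth from currency movement. A minimal sketch, with illustrative figures (not IBM's):

```python
# Constant-currency growth: restate current-period foreign-currency
# revenue at the prior period's exchange rate so reported growth can be
# decomposed into operating growth vs. currency effect.
# Rates are expressed as USD per unit of foreign currency.

def constant_currency_growth(rev_fc_now, rate_now, rev_fc_prior, rate_prior):
    reported_now = rev_fc_now * rate_now
    reported_prior = rev_fc_prior * rate_prior
    constant_now = rev_fc_now * rate_prior    # restated at the prior rate
    return {
        "reported_growth": reported_now / reported_prior - 1,
        "constant_currency_growth": constant_now / reported_prior - 1,
    }
```

With the Euro move described above (roughly 1.30 to 1.08 USD per Euro), a business growing 10% in Euro terms would show a reported decline in US Dollars, which is exactly why the two views must be planned and analyzed separately.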

How will your business be affected if the Euro moves another 8% to parity with the US Dollar?

Perficient’s IBM Business Analytics practice has delivered budgeting and forecasting solutions to the world’s largest companies.  We have direct experience dealing with the topics covered in this post and we can bring these and other proven and repeatable best practices to new customers.  Please contact us to learn more.

IBM Aims High at Investor Day Meeting

Last week, during IBM’s Investor Day, CEO Ginni Rometty announced that IBM’s investment strategy would mirror and support its evolving go-to-market strategy, by funneling $4 billion into cloud, analytics/big data, mobile, social and security (CAMSS). Through this investment, IBM is looking to expand the footprint of its CAMSS offerings from $25 billion in revenue to $40 billion by 2018.

These investments are in addition to the groundbreaking achievements and acquisitions that IBM has executed over the past few years. Since 2010, IBM has built and commercialized Watson Analytics, acquired SoftLayer and data centers to expand its cloud offerings, partnered with Apple to improve its mobile capabilities, and successfully executed a Smarter Planet initiative that spans all CAMSS focus areas.

Perficient is excited about the ongoing and upcoming changes at IBM that reflect this CAMSS strategy. The focus aligns perfectly with what we’re seeing in the marketplace, as our clients are looking for technologies that help drive innovation and revenue, not just lower costs.



Wooing the Connected Consumer @IBMInterConnect 2015

It was a great week at IBM InterConnect, with so many great sessions to choose from, activities to participate in, and networking to be had. I want to share my thoughts on one session in particular that was of interest: Carter’s “Creating a Smarter Shopping Experience.” During the session I reflected on all the retailers we are working with to build innovative technology solutions to woo the connected consumers of today, and on how shopping has evolved from park the car, pick from the shelves, check out and bring home. Shopping has also evolved from search online, add to cart, pay and wait for delivery. Rich Pins on Pinterest, Likes on Facebook and hashtags on Twitter now drive purchases of goods and services.

Retailers like Target, Carter’s, J.Crew, Nordstrom and Starbucks have begun to incorporate social networking channels into their campaigns. The consumer is now ‘connected’, ‘always-on’ and being bombarded by marketing campaigns from competitors all over the globe. Although consumers still make the conventional decision of buying what they need when they need it, alternative business models are making their way front-and-center to reap the benefits of understanding consumer behavior. Thousands of analytics apps scour heaps of structured and unstructured data to figure out what and when the consumer will buy and, most importantly, what the consumer feels.

The ‘Connected’ Consumers defined:

  • They rely on social channels to guide purchases.
  • They expect personalized experience, tailored to their needs.
  • They want mobile alerts on promotions, special events and recommendations based on purchasing history.
  • They want a 360 degree view of the product, not only physical looks from every angle, but also technical specifications, reviews, market trends, price match and customer ratings at their fingertips.
  • They expect broad assortments, ‘endless aisles’ and competitive pricing for products of interest.
  • They need flexible order systems, where orders can be received from any channel (store, online, mobile, call center) and shipped from anywhere (manufacturer, distribution center or store).
  • They demand fast, flexible delivery and return options: in hours, days or weeks, or scheduled for months later.
  • And…. all this via a wearable device, if you please!

Initiatives to make it work

Tight coupling of customer engagement systems, marketing technology and social tools enables better understanding of shopper behavior and patterns. Social profiles in social CRM systems like Salesforce, along with social listening and real-time insights from tools like IBM’s Consumer Insights, could all be put to work. It may require re-inventing email marketing programs and lead management.

Investment in data analytics to track consumers’ purchasing patterns, social media monitoring and sentiment analysis provides a sure payback. The adoption of the Internet of Things has a huge impact on future business models, where sensors in brick-and-mortar stores will help gather and process data. IBM’s InfoSphere BigInsights, Apache Cassandra and Hadoop facilitate scalable and affordable real-time analysis. Fast delivery can be achieved with an in-depth understanding of getting the right items to the right locations at the right time. This implies that inventories are managed and updated in real time, and that order management and logistics systems are fully integrated and tuned to work in synchronization.


To extract value from mobile for the connected consumer the key factors for engagement are personalization, contextualization and geo-location. A robust mobile strategy that encompasses these factors will stand the test of constant digital disruption.

Web APIs are the magic that bonds cloud, social and mobile technologies to work in unison toward a common goal. These lightweight software interfaces streamline access to product information, store inventory, location details and mashups of other APIs from various sources. All systems throughout Amazon communicate via APIs, and APIs from leading retailers like Target, eBay and Best Buy can be quickly consumed to spin up a new site.

Wearable technology is now the hottest consumer trend to take heed of when developing the shopping experience. Data from wearables can be funneled into existing ecosystems for a personalized shopping experience. Mobile-friendly websites reduce shopping cart abandonment; websites should be optimized for mobile devices and integrated with mobile couponing and mobile payments. Mobile-enabled loyalty programs keep customers engaged and add value to brand interactions. Self-service kiosks and enabling technologies like mobile, Bluetooth low energy (BLE), clienteling software and retail apps are facilitating and further enhancing the retail experience, uncovering new revenue sources and enabling one-to-one marketing.

A seamless approach to customer experience across all shopping channels, a single view of the shopper derived from personal profiles, and a single database of products, prices and promotions is the Holy Grail, and the path is strewn with challenges. Legacy applications and business functions require transitioning to newer tools and applications.

With a careful approach of building interfaces incrementally across projects and establishing integration standards and best practices, the possibilities are practically endless. I left InterConnect more determined than ever to get our teams engaged with top retailers, and all industries really, to keep them relevant, innovative and competitive in their marketplace. Then I rested my weary feet after a week in Las Vegas.