Perficient IBM blog


Archive for the ‘DB2’ Category

Manage data better with IBM InfoSphere

IBM states, “The InfoSphere Platform provides all the foundational building blocks of trusted information, including data integration, data warehousing, master data management, big data and information governance.[1]”  They could not be more right! Managing data has grown more complex over the years as companies bring in more data (big data) for their business needs. But the biggest hurdle in managing data is the how: what tools can help lay the proper foundation for managing data?

Now what is the proper foundation, you may ask? Of course, it is the solution put in place that meets our clients’ and our company’s needs. But before a solution can be implemented correctly, you will need the right team and tools to take on the initiative. So what is the right tool to handle and manage your data needs? My personal opinion is IBM’s InfoSphere platform, and in this blog post I will list a few reasons why.

One great product of the InfoSphere platform is InfoSphere Information Analyzer. In 2012 I was working with a client who wanted to initiate an SAP data governance and data-cleansing project for the procurement team. When I arrived on the first day of the project, the tech lead and business SME were on vacation for the week. This was perfect, as it allowed me to work with the infrastructure team to install the InfoSphere suite (DB2, Information Server 8.5, Information Analyzer, DataStage 8.5, QualityStage 8.5, Business Glossary, etc.) on the client’s network. Once the install was complete, I wanted to show some real value while the client was away. Not knowing the major data issues beyond the high-level SOW statement “data cleansing needed,” I used Information Analyzer to run reports such as Column Analysis on the KNA1, MARA, and LFA1 tables, producing quick reports on possible duplicate data, counts, data type details, and so on. When the client’s tech lead and business SME came back, I handed them the reports and they were blown away! I urge you to check out the video “InfoSphere Information Server: Create a new Information Analyzer Analysis Engine,” created by IBM’s education group.
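The kind of quick wins a Column Analysis report delivers (row counts, null counts, distinct values, duplicates, inferred data types) can be sketched in a few lines of plain Python. This is a hypothetical illustration of the idea only, not Information Analyzer’s actual output; the LIFNR vendor-number field is just an example drawn from an LFA1-style extract:

```python
from collections import Counter

def column_analysis(rows, column):
    """Profile one column: counts, nulls, distinct values,
    duplicates, and inferred data types."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    freq = Counter(non_null)

    def infer(v):
        # crude type inference: integer, then decimal, else string
        try:
            int(v)
            return "integer"
        except (TypeError, ValueError):
            pass
        try:
            float(v)
            return "decimal"
        except (TypeError, ValueError):
            return "string"

    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(freq),
        "duplicates": sorted(v for v, c in freq.items() if c > 1),
        "types": sorted({infer(v) for v in non_null}),
    }

# A tiny LFA1-style extract; LIFNR is the vendor-number field.
rows = [{"LIFNR": "1001"}, {"LIFNR": "1001"}, {"LIFNR": "1002"}, {"LIFNR": ""}]
print(column_analysis(rows, "LIFNR"))
# {'rows': 4, 'nulls': 1, 'distinct': 2, 'duplicates': ['1001'], 'types': ['integer']}
```

Even a rough profile like this surfaces the duplicates and empty fields that a cleansing project will have to address.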

Read the rest of this post »

Mobile-Big Data-Predictive Analytics-Social Media & the US Open

So what do all of these technology solutions have to do with the US Open? Behind the scenes, IBM has helped run the show since 1990. A recent online article shows how IBM is using these tools to deliver information far beyond match scores. Handheld devices used courtside feed data points such as ball speeds from each match into the system, where they land in a database accessible to announcers broadcasting the US Open and to reporters. The system also stores historical data, allowing fans and media to compare players based on previous performance, and combines head-to-head player matchups and historical video with social media for a richer experience for tennis fans. The platform also pulls in social data, gauging the volume of posts about certain players and matches, then uses predictive analytics to estimate the potential interest in programming around them.

Read - Inside the IBM-powered Command Center at the Annual Tennis Mecca

Cognos Version 8-8.4 No Longer Supported! Now What?

Effective September 30th, if your organization is currently running IBM Cognos software version 8.0–8.4, these versions will soon be retired and no longer maintained by IBM. This means production issues and bug fixes will no longer be supported for the following versions:

  • Cognos Business Intelligence V8.4.1
  • Cognos Data Manager V8.4.1
  • Cognos Business Intelligence Analysis V8.4.1
  • Cognos Business Intelligence Reporting V8.4.1
  • Cognos Mobile V8.4.1
  • Cognos Analysis for Microsoft Excel V8.4.1
  • Cognos Metrics Manager V8.4.1
  • Cognos Planning V8.4.1
  • Cognos Business Intelligence PowerPlay V8.4.1

The replacements for these versions are IBM Cognos V10 and IBM Cognos Planning V10. For organizations looking to fully benefit from the most current architecture and functionality of IBM Cognos V10, the process requires a thorough understanding of the differences between the two platforms and careful consideration of your upgrade strategy and plans.

Typical questions that arise regarding a Cognos 8 to V10 migration are:

  • Based on your current version, what are your license costs and discounts?
  • Which applications should you upgrade first?
  • Is your upgrade plan documented and aligned with business priorities?
  • Are you taking advantage of lessons learned from other V10 upgrade projects?
  • Do you have an issues, resolution, and go-live support plan?

Don’t put your Cognos investment at risk! Learn how Perficient can help you successfully migrate to Cognos V10 and benefit from the most current architecture and functionality. Perficient has developed the Cognos10 Migration QuickStart, a services and software bundle designed to get you up and running on your most essential applications first, and then to build a plan to migrate the rest of your Cognos 8 applications. Find out if our QuickStart meets your needs by scheduling a Cognos10 migration assessment. The assessment provides preliminary insight into your upgrade readiness and essential input for planning your upgrade. The assessment delivers:

  • Insights into upgrade complexity and readiness.
  • Upgrade options and alternatives.
  • A determination of your business priorities and the most appropriate upgrade path for your organization.
  • A high-level estimate of the investments in time and resources needed to support your upgrade.

Request our Cognos10 Migration QuickStart Solution Brief


Addressing critical healthcare transformation issues

Americans spend more on healthcare than the citizens of any other industrialized country, whether measured in total spending, per capita spending, or as a percentage of GDP. And the rate of growth is similarly remarkable. By 2016, healthcare spending in the U.S. is expected to double from $2 trillion to $4 trillion, which equates to more than $12,600 for every man, woman, and child in the country and a whopping 20 percent of gross domestic product.

For many provider organizations it would be tempting to look at this projection and see business opportunity. And while a growing market certainly can create opportunities, it is also a cause of great concern for healthcare executives who know that this runaway spending cannot continue unchecked forever. Insurance carriers, patients and employers are demanding greater accountability and demonstration of value for their healthcare dollar, and these demands will only intensify in the future.
The Perficient IBM National Business Unit is working directly with our partner IBM and its healthcare industry subject matter experts to meet the challenges providers face as the industry transforms, helping them establish a culture of performance management; that is, an ongoing process by which organizations define their strategic direction and then deploy their business processes and management systems in support of that direction. This requires establishing performance expectations through the planning and budgeting process, and then continually monitoring performance against the plan, making adjustments as necessary.

This process requires not only sophisticated tools and infrastructure but a strong commitment on the part of leaders to develop a performance-oriented culture. This is a special challenge for healthcare organizations that typically have a high degree of organizational complexity, diffuse lines of authority and accountability, and a culture of mission and community service that often is at odds with management discipline and rigor.

One specific area where Perficient’s IBM National Business Unit is helping healthcare organizations is with the management of Key Performance Indicators (KPIs) and reporting. Providers often struggle to meet mandated reporting requirements because of:

  • The lack of a comprehensive platform to collect and analyze these measures
  • Limited capital or skills to acquire decision support and analytics capabilities
  • Fragmented information across systems that lack appropriate data quality
  • Dependency on manual, labor-intensive processes to pull reports in a timely manner
  • Limited visibility into information and processes to manage performance across functional boundaries

The Perficient Health Analytics QuickStart is a solution built on an industry-approved and tested IBM analytics platform. Seamless integration and testing across solution layers simplifies deployment and delivers results faster, and the pricing simplifies licensing and project approvals. The included implementation services free healthcare organizations to focus on well-defined goals, not IT issues. Customers can see real results in less than six months. The Health Analytics QuickStart is a proven, complete analytics solution that delivers fast time to value and provides a robust foundation for implementing broader analytics in the future.

The powerful capabilities of InfoSphere DataStage and IBM Cognos 10 offer a resilient, secure platform for healthcare applications. With IBM PureData System for Analytics and the healthcare provider data warehouse, you will have two components of the IBM big data platform. From there you can easily add big data capabilities to capture and analyze valuable unstructured and streaming data and further enrich your analytics initiatives.


The Health Analytics QuickStart solution is built around key deliverables to get you up and running quickly:

  • Dashboards: Track clinical care measures and gaps in care across populations, track and benchmark performance across the organization.
  • IBM Cognos 10: Business intelligence report authoring and a browser-based report viewer.
  • PureData System for Analytics: A comprehensive data warehouse platform that offers simple deployment, out-of-the-box optimization, no tuning and minimal ongoing maintenance.
  • Perficient Healthcare Provider Data Model: Logical data models and business solution templates that help healthcare providers build and deploy reliable, accurate health analytics.
  • IBM InfoSphere DataStage: A powerful ETL tool that supports the collection, integration and transformation of large volumes of data, with data structures ranging from simple to highly complex.
  • AIX Solution Edition for Cognos: An IBM Power 740 server is optimized to deliver superior performance for IBM Cognos, reduce risk and deliver high quality of service.

Perficient provides healthcare organizations with the tools, techniques, and advice they need to develop a culture of performance management. Watch the short video below to learn more about the work the Perficient IBM National Business Unit is doing in healthcare with IBM, and see a demo of our Healthcare Analytics QuickStart.

Business Insight Requires Vision and Analytics


Perficient’s award-winning IBM Business Analytics practice is a Gold sponsor of Vision 2013. During the conference, our subject matter and industry experts will be on hand to discuss how Perficient helps clients leverage accurate, timely, and integrated information, transforming it into actionable intelligence that provides insight, drives planning, and improves performance. Our experience with IBM’s analytics solutions, combined with our industry knowledge, helps organizations improve decision making and become more agile.

DOWNLOAD OUR VISION 2013 SOLUTION OFFERING BRIEF

Perficient delivers the following benefits when deploying solutions to the enterprise:

  • Cross IBM integration with IBM Smarter Commerce, Social Business and WebSphere
  • Performance & Analytics Strategy and Roadmaps
  • Financial Statement Reporting and Consolidations
  • Management Reporting
  • Planning, Budgeting and Forecasting
  • Master Data Management
  • Data Integration
  • Software Support and Renewal Sales and Services
  • Training and Mentoring

Attending Vision 2013 in Orlando? Find out what our clients have to say about their Analytics projects

Attend Perficient’s Vision Breakout Sessions

Cooking up Savings with Cognos
Speaker: Tim Dungan, Lone Star Steakhouse|Texas Land & Cattle SteakHouse| Firefly Kitchen & Bar
Abstract: In this session you’ll learn how the finance team at Macaroni Grill incorporated operational data to meet the needs of its line-of-business leaders and bridge the divide between finance and operations.
When: Tue, 21/May, 04:20 PM – 05:20 PM
Where: JW Marriott – Segura 5

Read the rest of this post »

Learn How Mission Health is Using Data to Improve Quality of Care

With Perficient’s help, Mission Health is leveraging IBM’s master data management (MDM) to build a scalable, accurate foundation around its most critical data (information about patients, providers, facilities, organizations, employees, and more) to support key business processes across the enterprise.

Watch this webcast and learn how Mission Health is using Big Data and Analytics to improve quality of care, meet new regulatory compliance standards and manage payment reform. Learn how you can establish an accurate, trusted view of your most critical information assets.

  • Understand how MDM provides a 360-degree view of patients across the health system
  • Learn how to connect disparate registration, ambulatory, and clinical systems to patient records
  • See how MDM impacts initiatives such as the patient domain, provider domain, analytics programs, Big Data, and more
  • Explore the role MDM plays in addressing Meaningful Use and Accountable Care compliance objectives


Free: Hadoop for Dummies, an IBM Platform Computing guide

Hadoop for Dummies is now available!

This free eBook is packed with everything you need to know about Hadoop analytics. The handy guide provides a solid understanding of critical big data concepts and trends, and suggests ways to revolutionize your business operations through cost-effective, high-performance Hadoop technology.

In the age of “big data,” it’s essential for any organization to know how to analyze and manage its ever-increasing stores of information.

Access the eBook HERE

In addition to the free eBook, you can also access Information Week’s most recent big data research report. The report includes insights from surveys of over 200 CIOs in North America.

Access the Information Week Big Data report HERE

Big Data’s Challenges

I just returned from IBM’s Information On-Demand 2012 conference in Vegas last week, where just as many new questions were created as were answered. Among the usual Vegas questions (“What happened to my money?” and “Where am I?”) were some new ones. The most common: “What is Big Data?” So it appears there’s still some market education needed to clarify the definition. Whenever that question arose, the conversation ultimately came around to the challenges of managing high-volume data streams. One Perficient client I spoke to, a large Southern California utility, is dealing with a massive influx of new data streams from its Smart Meter/Grid deployment projects. As we talked, I was struck by how immense the volume of information was, how much was being discarded, and how much potential there was in the data, good and bad. Clearly the challenges of big data are real. First off, definitions are as diverse as opinions. Most organizations don’t differentiate “big data” from traditional data; in fact, in a recent study by Information Week, nearly 90% of respondents use conventional databases as the primary means of handling data. With the help of the Information Week research, hopefully we can better understand what constitutes big data (it’s not just size) and the challenges it poses.

The Information Week survey revealed that the top big data sources were financial transactions, email, imaging data, Web logs, and Internet text and documents, all common data sources. It’s clear you don’t need to be a massive utility company deploying smart grid technology to be inundated with huge volumes of data, and if it isn’t a challenge for you now, it will be very soon. Any business creating large data sets will need to embed big data management practices and the right tools and architectures, or it won’t be able to effectively use the information collected.

So what is big data? It’s more than just volume. Generally, four elements are required to qualify as big data. The first is size; 30 TB is a good starting point. Second is the type of data: big data involves several types (structured, unstructured, and semi-structured). Third is latency: big data changes fast and creates new data that needs to be analyzed quickly. Fourth is complexity: characteristics of complex data include large single log files, sparse data, and inconsistent data.

Now that we’re zeroing in on the definition and structure (or lack thereof) of these growing forms of data, the next question is whether you have a strategy in place to deal with them differently than you deal with more traditional forms of data. According to Information Week’s research of over 200 technology leaders, more than half said no, which likely means that if you’re reading this, you probably don’t either. Don’t worry, you’re not alone: 87% of respondents are still using conventional databases as the primary method of handling data.
Complicating the management challenges of big data are the various approaches to managing the data based on its sources and structure. The stream processing approach touches almost every aspect of computing, including processing ability, network throughput, storage, and visualization. The majority of the Information Week survey participants expressed concerns about access to data, storage, and analytics when it comes to this approach, and most were divided between those that need real-time processing of big data and those that don’t. Real-time processing can be a challenge with big data, especially in dynamic data environments. The batch processing approach to big data is designed to manage information as it grows and expands over time. Organizations that deal with this type of data are turning to the Hadoop model and software to rapidly process significant amounts of data, and Hadoop is being used for some very big implementations. According to Information Week, Facebook was the largest Hadoop deployment in the world with more than 20 PB of storage; by March of 2012, it had grown to 30 PB, roughly 3,000 times the size of the Library of Congress. There are two problems in using Hadoop. First, you don’t get partial answers: you have to wait, sometimes a long time, for the entire batch to finish. Second, it can require a lot of hardware, because all data is processed at once, which means any change in the data requires the entire batch to be rerun. The only way to deal with this is to apply more hardware, which can be costly.
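The batch model described above can be sketched in miniature. The following Python word count mimics Hadoop’s map, shuffle, and reduce phases in a few lines; it illustrates the programming model only and assumes nothing about Hadoop’s actual APIs:

```python
from itertools import groupby

def map_phase(record):
    # mapper: emit a (key, 1) pair for every word in the record
    for word in record.lower().split():
        yield word, 1

def reduce_phase(key, values):
    # reducer: collapse all values for one key into a single count
    return key, sum(values)

def run_batch(records):
    # shuffle: sort/group all mapper output by key, then reduce each group
    pairs = sorted(kv for record in records for kv in map_phase(record))
    return dict(
        reduce_phase(key, [v for _, v in group])
        for key, group in groupby(pairs, key=lambda kv: kv[0])
    )

print(run_batch(["big data big problems", "big budgets"]))
# {'big': 3, 'budgets': 1, 'data': 1, 'problems': 1}
```

Note that `run_batch` produces nothing until every record has been mapped, shuffled, and reduced, which is exactly the “no partial answers” limitation of batch processing described above.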

Besides the various management approaches and inconsistent market definitions, there are some other hurdles companies should be on the lookout for. According to Information Week’s research, almost half of the participants (44%) indicated that they lacked the knowledge needed to implement and manage big data solutions, and more than half (57%) noted budget as the biggest barrier.
With traditional forms of data management rapidly approaching capacity due to the deluge of new forms and sources of information, the market is approaching a crossroads. More and more businesses will be faced with a lack of the knowledge resources needed to tap the vast wealth of information available to them. The tools are there, and the information is clearly there for the taking. Organizations willing and able to invest in big data resources will eventually gain a greater competitive advantage over those that don’t.

With so much at stake in such a complex solution set, companies will look to eliminate as much risk as possible from these projects. Learning from peers, listening to analyst insights, understanding costs, and accessing the best services, mentoring, and training solutions are critical prerequisites to project success and the ongoing management of big data. Doing it right the first time means tapping into providers with experience in a wide range of technology options for big data. Perficient’s diverse technology partnerships and extensive training capabilities are leading many early adopters to trust us with their company’s most precious resource: information. Our industry experience and partnership awards are testaments to our delivery quality.

Several of the research points on big data made here are pulled from Information Week’s “Big Data Management Challenge” research report from April 2012. The report is available for free and is an interesting read for anyone looking to understand more about big data.

Download the report

Performance Tuning in Integration Projects: Power of XML Queries

It is a common scenario in integration projects to retrieve messages from the database for message enhancements, grouping and transmission.

We recently encountered a performance issue in a large healthcare integration project. The technologies employed were IBM DB2, IBM Message Broker (MB), and IBM WebSphere Transformation Extender (WTX).

The process of retrieving messages from the database was developed in ESQL, using SQL queries to retrieve messages and then concatenating them in an ESQL loop before passing them to the downstream process (WTX) for transformation and grouping.
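As the post’s title suggests, the fix is to push the concatenation into the database with a single set-based XML query (for example, DB2’s XMLAGG aggregate) instead of fetching rows and concatenating them one at a time in an ESQL loop. As a rough stand-in for that idea, the sketch below uses Python with SQLite’s group_concat; the table and column names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (batch_id INTEGER, payload TEXT)")
conn.executemany(
    "INSERT INTO messages VALUES (?, ?)",
    [(1, "<msg>a</msg>"), (1, "<msg>b</msg>"), (2, "<msg>c</msg>")],
)

# One set-based query replaces the fetch-and-concatenate loop:
# the database aggregates each batch into a single string.
rows = conn.execute(
    """
    SELECT batch_id, group_concat(payload, '') AS combined
    FROM messages
    GROUP BY batch_id
    ORDER BY batch_id
    """
).fetchall()

for batch_id, combined in rows:
    print(batch_id, combined)
```

Letting the database do the aggregation avoids a round trip per message, which is where the row-by-row ESQL loop loses its time.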

Read the rest of this post »


Posted in DB2, WMB, WTX

Message Broker – ESQL – Autonomous Transaction

Sometimes while writing transactional code in ESQL, we encounter the need to commit certain database operations irrespective of the success or failure of the parent transaction: for example, writing information to a log, or updating a sequence number in a table based on some business logic.

We recently encountered such a situation in a healthcare project I was working on, where a table-based sequence was used by multiple processes (batch and real-time), each incrementing the sequence and updating it with a new number based on some business logic. Since a batch process may take longer to complete its transaction than a real-time one, database locks were occurring on the common resource (the sequence table).

One way to avoid database locks in these situations is to commit after the UPDATE operation on this table. But this commit also commits all prior database changes (DML) from the start of the transaction, which may not be suitable if the transaction fails in later steps.

A better way is to create an autonomous transaction for those operations. But there is no autonomous transaction support in ESQL. Luckily, DB2 has a mechanism to commit only part of the code (a single statement or operation) without committing the parent transaction.

Autonomous transactions in DB2 are handled by DB2 procedures: move the contended database operations (statements) into a DB2 procedure, declare that procedure as AUTONOMOUS, and commit inside the procedure. This way it doesn’t affect the parent transaction.

Here is the link for more information on Autonomous Transaction in DB2.

http://www.ibm.com/developerworks/data/library/techarticle/dm-0907autonomoustransactions/index.html