Data & Intelligence

CRISP and IBM Cognos TM1

CRISP stands for Cross-Industry Standard Process. It is a process model describing commonly used approaches that experts take to tackle problems. Typically, you’ll hear of CRISP in the context of CRISP-DM (the Cross-Industry Standard Process for Data Mining), a methodology that breaks the data mining process into six major phases.

A Little More About the CRISP Phases

The sequence of these phases is not strict; moving back and forth between phases is typical and expected. Certain phases may carry more weight than others, and frequent dependencies may exist between some of them. In addition, data mining is by nature cyclic, meaning that the process usually continues after a solution has been deployed. Lessons learned along the way will prompt more focused business questions, and subsequent efforts will benefit from the experience of previous ones.

But what about using a CRISP mindset when modeling Cognos TM1 applications? Well, you’ll find that this method of discovery not only “holds true” but works very well.

So let’s take a look at understanding the CRISP Phases and how they relate to TM1 modeling.

Business Understanding

The initial and most fundamental CRISP phase is designed to focus on understanding the objectives and requirements of the business. Obviously, this is also the first critical step in any modeling exercise and no different with TM1. This is usually referred to as “discovery” or (as the process continues) “requirements gathering and documenting”.

Data Understanding

In this phase you perform an initial collection of all of the data perceived to be vital to the solution, then conduct activities designed to familiarize yourself with that data: identifying data quality problems, discovering insights, detecting interesting subsets, and forming hypotheses about hidden information. Again, all of these exercises are absolutely required if you hope to create a useful TM1 model.
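To make the phase concrete, here is a minimal data-profiling sketch in plain Python. The extract and field names are hypothetical (not a TM1 format or API); it simply illustrates the kind of quick row, missing-value, and distinct-member checks that data understanding calls for.

```python
import csv
import io

# Hypothetical raw extract from a source system (illustrative only).
raw = """account,region,month,amount
Sales,East,Jan,1200
Sales,West,Jan,
Expenses,East,Jan,300
Sales,East,Feb,1100
"""

def profile(reader):
    """Count rows, missing values, and distinct members per column."""
    rows = list(reader)
    stats = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        stats[col] = {
            "rows": len(values),
            "missing": sum(1 for v in values if v == ""),
            "distinct": len(set(v for v in values if v != "")),
        }
    return stats

stats = profile(csv.DictReader(io.StringIO(raw)))
print(stats["amount"])  # one missing amount flags a data-quality problem
```

A profile like this is often enough to surface the quality problems and interesting subsets this phase is about, before any model is built.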

Data Preparation

The data preparation phase covers all activities required to construct the final “data pool” (data that will be fed into and used by the model) from the initial raw data sources. Some data preparation tasks may be performed many times, and not in any prescribed order. These tasks may include table/view, record, and attribute selection as well as transformation and cleaning of the data. TM1 models are pretty flexible when it comes to acceptable data formats but every model will require some transformation of source data.
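As a hedged sketch of those preparation tasks, the following Python builds a small “data pool” from raw records: rows without a measure are dropped, member names are normalized, and amounts are converted to numbers. The field names and cleaning rules are illustrative assumptions, not a prescribed TM1 load format.

```python
def prepare(records):
    """Build the final 'data pool': drop bad rows, normalize, convert types."""
    pool = []
    for rec in records:
        amount = rec.get("amount", "").strip()
        if not amount:  # cleaning: discard rows with no measure value
            continue
        pool.append({
            "account": rec["account"].strip().upper(),  # normalize member names
            "month": rec["month"].strip()[:3].title(),   # 'JANUARY' -> 'Jan'
            "amount": float(amount),                     # type conversion
        })
    return pool

# Hypothetical raw source records.
raw = [
    {"account": " sales ",  "month": "JANUARY", "amount": "1200"},
    {"account": "sales",    "month": "Feb",     "amount": ""},     # dropped
    {"account": "Expenses", "month": "january", "amount": "300.5"},
]
pool = prepare(raw)
print(pool)
```

In practice these steps run repeatedly and in no fixed order, which is exactly why keeping them in small, composable functions (or TurboIntegrator processes, on the TM1 side) pays off.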

Modeling

In data mining, this is where you consider which modeling techniques to apply, calibrate parameters to optimal values, analyze the results and, as appropriate, step back into the data preparation phase as often as needed. The same approach works well when modeling for TM1: your data is “organized” into dimensions (with hierarchies and attributes) and one or more cubes, sample data is loaded and, as with data mining, you may find that returning to the data understanding and preparation phases is required.
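The organizing idea above can be sketched in a few lines of Python. This is an illustrative toy, not TM1 itself: cells are keyed by one element per dimension, and a simple hierarchy lets a consolidated element (here, a hypothetical "Q1") roll up its children.

```python
class Cube:
    """Toy multidimensional cube: dimensions, cells, and rollup hierarchies."""

    def __init__(self, dimensions):
        self.dimensions = dimensions  # ordered dimension names
        self.cells = {}               # (element, element, ...) -> value
        self.hierarchy = {}           # dim -> {parent: [children]}

    def add_rollup(self, dim, parent, children):
        self.hierarchy.setdefault(dim, {})[parent] = children

    def set(self, key, value):
        self.cells[tuple(key)] = value

    def get(self, key):
        key = tuple(key)
        for i, dim in enumerate(self.dimensions):
            children = self.hierarchy.get(dim, {}).get(key[i])
            if children:  # consolidated element: sum over its children
                return sum(self.get(key[:i] + (c,) + key[i + 1:])
                           for c in children)
        return self.cells.get(key, 0)

cube = Cube(["Account", "Month"])
cube.add_rollup("Month", "Q1", ["Jan", "Feb", "Mar"])
cube.set(("Sales", "Jan"), 1200)
cube.set(("Sales", "Feb"), 1100)
print(cube.get(("Sales", "Q1")))  # rolls up Jan + Feb + Mar -> 2300
```

Even a toy like this shows why the phase is iterative: the moment you try to roll sample data up a hierarchy, gaps in the dimension design send you back to data understanding and preparation.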

Evaluation

At this stage, you have put together a model (or models) that you feel comfortable with, and it is time to thoroughly evaluate it: review the steps executed to construct the model and make certain it properly achieves the business objectives. In data mining, a decision on the use of the results should be reached at the end of this phase; with TM1, this is the point at which development of the actual application may begin.

Deployment

Creation of a model is generally not the end; even if the purpose of the modeling exercise was simply to increase knowledge of the data, the knowledge gained will need to be organized and presented in a way that the customer can absorb. In data mining, a deployment may be as simple as generating a report or as complex as implementing a repeatable mining process. Typically with TM1, the model “prototype” is handed off to a development team for programming, testing and delivery.

Concluding Thoughts

Simply put, CRISP is an industry proven best practice that will be of value to any TM1 modeler.

Jim Miller

Mr. Miller is an IBM certified and accomplished Senior Project Leader and Application/System Architect-Developer with over 30 years of extensive applications and system design and development experience. His current role is National FPM Practice Leader.

His experience includes BI, Web architecture & design, systems analysis, GUI design and testing, database modeling and systems analysis, design, and development of Client/Server, Web and Mainframe applications and systems utilizing: Applix TM1 (including TM1 rules, TI, TM1Web and Planning Manager), dynaSight - ArcPlan, ASP, DHTML, XML, IIS, MS Visual Basic and VBA, Visual Studio, PERL, Websuite, MS SQL Server, ORACLE, SYBASE SQL Server, etc.

His responsibilities have included all aspects of Windows and SQL solution development and design, including: analysis; GUI (and Web site) design; data modeling; table, screen/form and script development; SQL (and remote stored procedures and triggers) development and testing; test preparation and management; and training of programming staff. Other experience includes development of ETL infrastructure, such as data transfer automation between mainframe (DB2, Lawson, Great Plains, etc.) systems and client/server SQL Server and Web-based applications, and integration of enterprise applications and data sources.

In addition, Mr. Miller has acted as Internet Applications Development Manager, responsible for the design, development, QA and delivery of multiple Web sites, including online trading applications, warehouse process control and scheduling systems, and administrative and control applications. He was also responsible for the design, development and administration of a Web-based financial reporting system for a 450 million dollar organization, reporting directly to the CFO and his executive team. Mr. Miller has also been responsible for managing and directing multiple resources in various management roles, including project and team leader, lead developer and applications development director.

Specialties include: Cognos/TM1 Design and Development, Cognos Planning, IBM SPSS and Modeler, OLAP, Visual Basic, SQL Server, Forecasting and Planning; International Application Development, Business Intelligence, Project Development. IBM Certified Developer - Cognos TM1 (perfect score 100% on exam). IBM Certified Business Analyst - Cognos TM1.
