

Leveraging Existing Cognos TM1 Components

Abstract

Eventually, you will (hopefully) have developed and delivered something in TM1 that is unique enough (and adds significant value) that you’ll want to leverage it in other models.

Leverage Considerations

Before leveraging your component, the following must be considered:

  • General Business Requirements (do your business requirements match the delivered component's business specifications?)
  • Component Mechanics (can the model provide the component its minimal input parameters?)
  • Architectural Position (does the model adhere to the architectural principles required to support the component in its current state?)

General Business Requirements

Most importantly, you’ll need to determine whether the destination model’s business specifications are defined so as to adhere to, or take advantage of, the specifications of the component you want to leverage. Does the component actually do what you need? To answer this question, you need to examine the inner workings, or “basic mechanics,” of the component. The first step? Examine the component’s documentation. No documentation? Then you shouldn’t be considering leveraging it (it’s not mature enough).

Architectural Position

To effectively leverage your component, the destination model should adhere to basic industry architectural best practices. Although the basic logic of the component can be replicated as new TM1 objects (i.e., copy and paste), to produce a scalable, sustainable application, the following principles should be evaluated:

  • Architectural Purity
  • Data Absorption and Staging
  • Configuration of Assumptions and Adjustments
  • Results Consumption

 

Architectural Purity


All models are made up of four distinct components: Absorption, Configuration, Calculation, and Consumption. These components should be kept separate from your component logic. Although the leveraged component can be implemented as part of any of these existing components, it is not recommended if you want a sustainable, scalable application.

If the component is to be implemented in an existing model, dedicated cubes should be used to keep the component’s logic encapsulated and decoupled from the existing model’s logic. For new models, the component’s logic can be “designed in” to the overall architecture.
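As a minimal sketch of this encapsulation (all cube, dimension, and element names here are hypothetical), a dedicated component cube can pull its inputs from the host model via a DB() rule, so no component logic is written into the host model’s own rule files:

```
SKIPCHECK;

# Rule on a hypothetical, dedicated 'Component Input' cube. It pulls
# its input from the host model's 'Sales Plan' cube via DB(), keeping
# the component's logic encapsulated in its own cube.
['Input Amount'] = N:
    DB('Sales Plan', !Version, !Period, !Product, 'Amount');

FEEDERS;
# Note: the matching feeder belongs in the 'Sales Plan' cube's rule
# file, feeding 'Component Input' so consolidations remain correct.
```

The bang (!) notation assumes the two cubes share the Version, Period, and Product dimensions; mismatched dimensions would need an explicit mapping instead.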

Data Absorption and Staging

In an architecturally pure state, all data “absorption” would be kept separate from the model’s actual “business purpose.” This is accomplished through “data staging” cubes and TM1’s ETL tool (TurboIntegrator) and/or TM1 rules. If this is not the case (typically when implementing the component into a previously deployed model), it is recommended that the data be staged and transformed before the leveraged component absorbs it.
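A TurboIntegrator sketch of that staging step might look like the following (every cube, dimension, measure, and attribute name is illustrative, not from any actual model):

```
# TurboIntegrator Data tab sketch. The process's data source is a view
# on a hypothetical 'Stg Data Load' staging cube, so vPeriod, vAccount,
# and vValue arrive as source variables.

# Transform: map the source account code to the component's naming
# convention via an (illustrative) element attribute lookup.
sAccount = AttrS('Account', vAccount, 'Component Alias');

# Absorb: write the staged, transformed value into the cube the
# leveraged component reads from.
CellPutN(vValue, 'Component Input', vPeriod, sAccount, 'Input Amount');
```

Keeping the transformation in the TI process (rather than in the component’s rules) preserves the separation between absorption and the component’s business logic.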

Configuration of Assumptions and Adjustments

Typically, all models will require certain assumptions to be configured (placed in context). Models also allow users to make adjustments (to support various scenarios). The leveraged component will require certain assumptions and adjustments to be made, and this processing must be kept external to the component’s actual logic.

It is recommended that the assumption and adjustment method already used by the destination model be leveraged to capture the assumption parameters and data adjustments the leveraged component requires.

Ideally, the leveraged component will use a series of simple lookup (LK) cubes to store assumption configurations and track user adjustments; it is recommended that these cubes be reused as-is in the destination model.
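For example, an assumption such as a growth rate could live in a hypothetical ‘LK Assumptions’ lookup cube and be referenced from the component’s calculation rules rather than hard-coded (again, all names are illustrative):

```
# Rule on the component's calculation cube: apply a growth-rate
# assumption held in the 'LK Assumptions' lookup cube, so users can
# change the assumption without touching the component's logic.
['Projected Amount'] = N:
    ['Input Amount'] *
    (1 + DB('LK Assumptions', !Version, 'Growth Rate'));
```

Because the lookup cube is the only place the assumption is stored, the destination model can expose it through its existing assumption-entry screens.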

Results Consumption

The summarizing and compiling of the results of complex model calculations is a generic TM1 operation, not a key value-add of any particular model. Therefore, the method and supporting logic used to consume this information should be kept separate from the model’s “value-add” logic. Consuming the results of the leveraged component should likewise be kept separate from other components by using supporting reporting or consumption cubes (or another appropriate method).

 

Leverage On!

 

 


Jim Miller

Mr. Miller is an IBM certified and accomplished Senior Project Leader and Application/System Architect-Developer with over 30 years of extensive applications and system design and development experience. His current role is National FPM Practice Leader. His experience includes BI, Web architecture & design, systems analysis, GUI design and testing, Database modeling and systems analysis, design, and development of Client/Server, Web and Mainframe applications and systems utilizing: Applix TM1 (including TM1 rules, TI, TM1Web and Planning Manager), dynaSight - ArcPlan, ASP, DHTML, XML, IIS, MS Visual Basic and VBA, Visual Studio, PERL, Websuite, MS SQL Server, ORACLE, SYBASE SQL Server, etc. His responsibilities have included all aspects of Windows and SQL solution development and design including: analysis; GUI (and Web site) design; data modeling; table, screen/form and script development; SQL (and remote stored procedures and triggers) development and testing; test preparation and management and training of programming staff. Other experience includes development of ETL infrastructure such as data transfer automation between mainframe (DB2, Lawson, Great Plains, etc.) systems and client/server SQL server and Web based applications and integration of enterprise applications and data sources. In addition, Mr. Miller has acted as Internet Applications Development Manager responsible for the design, development, QA and delivery of multiple Web Sites including online trading applications, warehouse process control and scheduling systems and administrative and control applications. Mr. Miller also was responsible for the design, development and administration of a Web based financial reporting system for a 450 million dollar organization, reporting directly to the CFO and his executive team. Mr. Miller has also been responsible for managing and directing multiple resources in various management roles including project and team leader, lead developer and applications development director. Specialties include: Cognos/TM1 Design and Development, Cognos Planning, IBM SPSS and Modeler, OLAP, Visual Basic, SQL Server, Forecasting and Planning; International Application Development, Business Intelligence, Project Development. IBM Certified Developer - Cognos TM1 (perfect score 100% on exam). IBM Certified Business Analyst - Cognos TM1.
