Data & Intelligence

IBM Cognos TM1 – Version Release Migration Testing

Here are several general considerations for testing functional areas of a new release of IBM Cognos TM1. Keep in mind that this is not intended to be an exhaustive or “end-to-end” testing plan, nor does it offer remediation options should testing uncover issues.

Functional areas of emphasis covered are:

  • Server Statistics
  • Messaging and Logging
  • Performance (overall)
  • Client Tools
  • Web Interface
  • Business Application Logic
  • (Generic) Processing
  • Contention
  • Security

Preparation

It is advisable that the new release of TM1 Server be installed in an environment similar to the existing TM1 environment so that environmental statistics (such as memory consumption) and general (overall) performance can be observed and evaluated. It is also advised that the client tools installed (and to be tested) do not coexist on the same user machine with client tools from a previous release.

As in all testing, appropriate test plan templates should be distributed to and used by each of the testers. Tests cannot be considered valid without supporting documentation. A test plan template will include:

  • Tester Name
  • Test Name
  • Test Description (test goals, objectives, background, and assumptions)
  • Date and Time of the test
  • Acceptance Criteria (criteria to measure quality, objective of the testing process)
  • Test Preparation steps
  • Expected Results
  • Actual Results
  • Pass/Fail Status
  • Section for additional comments or notes
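The template fields above can be captured in a simple structure so every tester records results consistently. A minimal Python sketch (the field names follow the list above; the sample values are illustrative):

```python
from dataclasses import dataclass

@dataclass
class TestPlanRecord:
    """One row of the test plan template described above."""
    tester_name: str
    test_name: str
    description: str          # goals, objectives, background, assumptions
    date_time: str
    acceptance_criteria: str
    preparation_steps: str
    expected_results: str
    actual_results: str = ""
    status: str = "Not Run"   # Pass / Fail / Not Run
    notes: str = ""

# Example entry (hypothetical test):
record = TestPlanRecord(
    tester_name="J. Tester",
    test_name="Cube View Open",
    description="Open each cube view in Cube Viewer",
    date_time="2024-01-15 09:00",
    acceptance_criteria="View opens without error in under 5 seconds",
    preparation_steps="Log into the TM1 server",
    expected_results="View renders correctly",
)
record.actual_results = "View rendered correctly"
record.status = "Pass"
print(record.status)
```

Keeping records in a structured form like this makes it easy to export the plan to a spreadsheet for review.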

Server Statistics

After installation of an existing TM1 application in the new release of TM1 server, you should review and compare the memory statistics for the server and for each TM1 cube. In general, you should not see any significant change in these statistics compared to the previous release statistics.

The following should be reviewed and compared:

  • StatsForServer –
    • Memory Used
    • Memory In Garbage
  • StatsByCube –
    • Memory Used for Views
    • Number of Stored Views
    • Number of Stored Calculated Cells
    • Number of Populated String Cells
    • Number of Populated Numeric Cells
    • Number of Fed Cells
    • Memory Used for Calculations
    • Memory Used for Feeders
    • Memory Used for Input Data
    • Total Memory Used
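A side-by-side comparison of these statistics can be automated once the figures are captured from both environments. The sketch below is illustrative: the statistic names follow the StatsByCube list above, but the values and the 10% tolerance are assumptions to be replaced with your own baselines and acceptance threshold:

```python
# Compare per-cube memory statistics between the existing and new releases.
# Values and the 10% tolerance are illustrative assumptions, not IBM guidance.
baseline = {"Memory Used for Feeders": 512_000_000,
            "Memory Used for Input Data": 1_200_000_000,
            "Total Memory Used": 2_000_000_000}
new_release = {"Memory Used for Feeders": 530_000_000,
               "Memory Used for Input Data": 1_210_000_000,
               "Total Memory Used": 2_600_000_000}

TOLERANCE = 0.10  # flag any statistic that moved more than 10%

def significant_changes(old, new, tol=TOLERANCE):
    flagged = {}
    for stat, old_val in old.items():
        delta = (new[stat] - old_val) / old_val
        if abs(delta) > tol:
            flagged[stat] = round(delta, 3)
    return flagged

print(significant_changes(baseline, new_release))
# "Total Memory Used" grew by 30%, so it is flagged for investigation
```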

Messaging and Logging

During testing, machine event logs should be captured and reviewed. In addition, all TM1 Server logs should be reviewed both at server startup and periodically throughout the testing process:

  • Admin Server Log
  • Transactions Log
  • Server Message Log
  • Audit Log
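Periodic log review can be partly automated by scanning each log for error or warning entries. A rough sketch, where the severity keywords and the sample log content are assumptions to adapt to your environment:

```python
# Scan a TM1 server log file for lines containing severity keywords.
# The keywords and the fabricated sample content below are illustrative.
import os
import tempfile

SEVERITIES = ("ERROR", "WARN", "FATAL")  # assumed keywords of interest

def scan_log(path, keywords=SEVERITIES):
    hits = []
    with open(path, encoding="utf-8", errors="replace") as fh:
        for lineno, line in enumerate(fh, 1):
            if any(k in line for k in keywords):
                hits.append((lineno, line.rstrip()))
    return hits

# Demonstrate on a fabricated log fragment:
sample = "INFO  Server started\nERROR Failed to load cube view\nINFO  Login ok\n"
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as tmp:
    tmp.write(sample)
hits = scan_log(tmp.name)
print(hits)
os.unlink(tmp.name)
```

Running such a scan after startup and again at intervals during testing gives a consistent record of anything the logs surfaced.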

Performance (Overall)

Because changes may have been made to the internals of TM1 Server in a new software release, it is important to identify any possible change to overall performance. General server and application processes should be timed and compared (to the existing TM1 environment baselines).

This might include:

  • TM1 server start up time
  • Cube view open and recalculate times
  • SaveDataAll time
  • Sampling of TurboIntegrator script execution times
  • Etc.

Should performance deviate from previously established acceptable levels, it may be necessary to adjust the configuration parameters of the TM1 server or, in extreme situations, modify portions of the TM1 application architecture. In most cases, you should expect minimal negative performance impact.
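A simple timing harness makes these comparisons repeatable. In the sketch below, the workload function is a placeholder standing in for a real operation such as a view recalculation or a TurboIntegrator run, and the baseline figure is a hypothetical value taken from the existing environment:

```python
# Time an operation and compare it with the baseline from the existing
# environment. The workload and the baseline value are placeholders.
import time

def timed(label, fn, *args):
    """Run fn(*args) and return (label, elapsed_seconds)."""
    start = time.perf_counter()
    fn(*args)
    return label, time.perf_counter() - start

def recalculate_view():
    # Placeholder stand-in for a real TM1 operation (server startup,
    # view recalculate, SaveDataAll, TurboIntegrator process, etc.)
    sum(i * i for i in range(100_000))

baseline_seconds = {"recalculate_view": 0.5}   # hypothetical baseline
label, elapsed = timed("recalculate_view", recalculate_view)
ratio = elapsed / baseline_seconds[label]
print(f"{label}: {elapsed:.3f}s ({ratio:.1%} of baseline)")
```

Collecting these label/elapsed pairs across both environments gives the comparison data this section calls for.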

Client Tools

All of the available client tools (TM1 Client, Architect, Perspectives, etc.) should be exercised. You should perform at least a cursory test of each tool's features and functionality. For example, for TM1 Architect:

Sample Test Script:

  • Open server explorer and successfully log into the TM1 server (using the same authentication method used in the existing version of TM1)
  • Open each cube view (using cube viewer)
  • Perform typical cube viewer tasks (move/stack dimensions, select from drop-down/pick lists, recalculate views, drill-down, suppress zeros, update/save new and existing views, open and update dimension and subset editors, etc.)
  • Exercise dimensional hierarchies – collapse and expand, validate consolidation correctness (do the numbers rollup correctly?)
  • Open, update and save a rules file using the rules editor
  • Open, update and save multiple TurboIntegrator processes
  • Create and save a new TurboIntegrator process
  • Update an existing Chore
  • Etc.

Web Interface

The TM1 Web interface needs to be tested. Testers should access each and every TM1 object that is available in the current TM1 environment. Make sure to test both Websheets and cube views.

  • Does it render correctly?
  • Do all objects (buttons, pick lists, etc.) operate as expected?
  • Does data refresh?
  • Etc.

Business Application Logic

All business logic implemented in the TM1 application should be reviewed and tested. Optimally, the original system acceptance test plans should be used to drive the regression testing of the application (running on the new TM1 release). Allow sufficient time for this exercise.

Generic Processing

All “generic application processing” should be tested. Since you should have completed testing the client tools and Web Interface, the application processing to be reviewed in this phase of the testing should be limited to the TurboIntegrator processing within the application. Key tasks might include:

  • View creation, update and destroy (including subsets)
  • Data loading from files and external data sources
  • View zero outs
  • ODBC connectivity
  • Typical hierarchy maintenance
  • SaveDataAll processing
  • ExecuteProcess

Contention

Whenever possible, testing should include a reasonable sampling of the actual TM1 user community. You cannot realistically test the new version of TM1 if all testers have TM1 Administrator access. The testing community should parallel the existing community in both user "types" and concurrent user count.

All users can be categorized as members of one of three basic groups: "power users", "planners" and "reporters". Each group carries a different "weight" when calculating the expected total concurrent users. Best-practice testing recommendations are based on the assumption of peak concurrent user counts for each user type.
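The weighted calculation can be sketched as follows; the peak counts and weights here are hypothetical and should be replaced with the figures agreed for your site:

```python
# Estimate expected total concurrent users from peak counts per user type.
# Peak counts and per-type weights below are hypothetical placeholders.
peak_users = {"power users": 5, "planners": 40, "reporters": 120}
weights    = {"power users": 1.0, "planners": 0.5, "reporters": 0.2}

expected_concurrent = sum(peak_users[t] * weights[t] for t in peak_users)
print(expected_concurrent)  # 5*1.0 + 40*0.5 + 120*0.2 = 49.0
```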

Testers should create scenarios that most often occur within the existing version of TM1; for example, have some testers perform planning work while others create normal reporting queries, and observe the results using the TM1 Top utility. (TM1 Top is a stand-alone utility that dynamically monitors the threads and processes running in a single TM1 server instance; it runs within a console (command) window on a Microsoft Windows computer.) If chores are routinely executed during these times, those chores should be executed during the testing sessions as well. Any change in locking behavior should be recorded and investigated.
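The mixed-workload scenario above can be roughed out in code to reason about lock contention before the live sessions. The sketch below is only a stand-in: it simulates "planner" and "reporter" sessions competing for a single shared lock, not real TM1 clients or TM1's actual locking model:

```python
# Simulate planner (write) and reporter (read) sessions contending for a
# shared lock. A toy model of contention, NOT TM1's real locking behavior.
import random
import threading
import time

cube_lock = threading.Lock()      # stand-in for a contended TM1 object
wait_lock = threading.Lock()
wait_times = []                   # (role, seconds waited for the lock)

def user_session(role, iterations=5):
    for _ in range(iterations):
        requested = time.perf_counter()
        with cube_lock:
            waited = time.perf_counter() - requested
            # planners hold the lock longer than reporters
            time.sleep(0.005 if role == "planner" else 0.001)
        with wait_lock:
            wait_times.append((role, waited))
        time.sleep(random.uniform(0, 0.002))  # think time between actions

threads = [threading.Thread(target=user_session, args=(role,))
           for role in ["planner"] * 2 + ["reporter"] * 5]
for t in threads:
    t.start()
for t in threads:
    t.join()

worst = max(w for _, w in wait_times)
print(f"sessions recorded: {len(wait_times)}, worst lock wait: {worst:.4f}s")
```

In the real test, the equivalent "worst wait" observations come from watching thread states in TM1 Top while the mixed workload runs.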

Security

It is usually difficult (and against best practice) to leverage the actual client and group relationships that exist in the current TM1 environment, so time will need to be taken to establish and maintain a sufficient "test community" (test versions of client IDs and their respective group relationships). This can be a time-consuming task, but it is worth the effort. Note: an experienced development or support team should be able to create a script for test user community generation and maintenance (for each TM1 application being supported).
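Such a generation script can be very simple. The sketch below produces a list of test client IDs with group assignments and writes them as CSV; the group names, counts, and naming convention are illustrative assumptions to be matched to your application's security model:

```python
# Generate a test user community mirroring the production group mix.
# Group names, counts, and the naming convention are illustrative.
import csv
import io

group_plan = {"Power": 2, "Planner": 5, "Reporter": 10}

def build_test_users(plan, prefix="test"):
    users = []
    for group, count in plan.items():
        for n in range(1, count + 1):
            users.append({"client": f"{prefix}_{group.lower()}_{n:02d}",
                          "group": group})
    return users

users = build_test_users(group_plan)

# Export to CSV for loading into the test environment by whatever
# mechanism your team uses (e.g. a TurboIntegrator import process).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["client", "group"])
writer.writeheader()
writer.writerows(users)
print(len(users))  # 17 test client IDs
```

Regenerating the community from a script like this keeps the test IDs reproducible across test cycles.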

About the Author

Mr. Miller is an IBM certified and accomplished Senior Project Leader and Application/System Architect-Developer with over 30 years of extensive applications and system design and development experience. His current role is National FPM Practice Leader. His experience includes BI, Web architecture & design, systems analysis, GUI design and testing, Database modeling and systems analysis, design, and development of Client/Server, Web and Mainframe applications and systems utilizing: Applix TM1 (including TM1 rules, TI, TM1Web and Planning Manager), dynaSight - ArcPlan, ASP, DHTML, XML, IIS, MS Visual Basic and VBA, Visual Studio, PERL, Websuite, MS SQL Server, ORACLE, SYBASE SQL Server, etc. His Responsibilities have included all aspects of Windows and SQL solution development and design including: analysis; GUI (and Web site) design; data modeling; table, screen/form and script development; SQL (and remote stored procedures and triggers) development and testing; test preparation and management and training of programming staff. Other experience includes development of ETL infrastructure such as data transfer automation between mainframe (DB2, Lawson, Great Plains, etc.) systems and client/server SQL server and Web based applications and integration of enterprise applications and data sources. In addition, Mr. Miller has acted as Internet Applications Development Manager responsible for the design, development, QA and delivery of multiple Web Sites including online trading applications, warehouse process control and scheduling systems and administrative and control applications. Mr. Miller also was responsible for the design, development and administration of a Web based financial reporting system for a 450 million dollar organization, reporting directly to the CFO and his executive team. Mr. 
Miller has also been responsible for managing and directing multiple resources in various management roles including project and team leader, lead developer and applications development director. Specialties Include: Cognos/TM1 Design and Development, Cognos Planning, IBM SPSS and Modeler, OLAP, Visual Basic, SQL Server, Forecasting and Planning; International Application Development, Business Intelligence, Project Development. IBM Certified Developer - Cognos TM1 (perfect score 100% on exam) IBM Certified Business Analyst - Cognos TM1
