
Modifying Cognos TM1 Configuration Parameters


The Tm1s.cfg Server Configuration File

The Tm1s.cfg file is an ASCII file that specifies environment information for a TM1 server. A default Tm1s.cfg file is created in the TM1 server data directory when you install a copy of the TM1 server. You can edit the Tm1s.cfg file to reflect the environment of the associated remote server.
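For reference, a minimal Tm1s.cfg might look something like the sketch below. The server name, directories, admin host, and port shown are purely illustrative and should be replaced with values appropriate to your own installation; parameters that are not listed simply take their default values.

  [TM1S]
  ServerName=Planning
  DataBaseDirectory=E:\TM1\Planning\Data
  LoggingDirectory=E:\TM1\Planning\Logs
  AdminHost=tm1adminhost
  PortNumber=12345
  UseSSL=T
  IntegratedSecurityMode=1

Note that in most releases a change to Tm1s.cfg only takes effect after the TM1 server is restarted, which is why the testing procedure described later in this article shuts the server down before each modification.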

General Best Practice

The following are best practice suggestions for modifying parameters in the TM1 configuration file:

What are you solving for?

The first step is to determine exactly what the objective is for modifying the default TM1 configuration. There are many parameters and settings available, but each is designed to address specific environmental or model conditions. Without knowing what you are “solving for,” it will be almost impossible to determine a modification and testing strategy.

Know the Hardware

Many of the available configuration parameters will directly affect, or be affected by, the hardware and environment. Before making changes to the default settings of Cognos TM1, it is important to verify that your server hardware is correctly sized and configured for the TM1 version, the size of the model, the number of concurrent users, and the typical user activity.

Environment vs. Environment


It goes without saying that any testing – and especially modification of configuration settings – should be conducted first in a non-production environment. Additionally, you should use an environment that is “as close as possible” to the target environment to be sure that any advantages you might see during testing will be relevant in the production environment.

Baseline

To determine the outcome of your testing, you need to be able to measure the results. To do this you must first establish a baseline. You can use formal tools to do this, or simply record observations manually (e.g. “when I open QuarterlyROI.xls it takes 14 seconds”). But remember: the more exhaustive the documented baseline, the better your chances of achieving an acceptable outcome.
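As an illustration only (apart from the QuarterlyROI.xls figure above, the activities and timings below are hypothetical), a simple manually recorded baseline might look like:

  Activity                                    Baseline observation
  TM1 server startup (cold)                   3 minutes 40 seconds
  Open and recalculate QuarterlyROI.xls       14 seconds
  Submit a planning write-back                6 seconds
  Nightly data load (TurboIntegrator)         25 minutes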

Good Enough

Once a baseline is established, the next step is to agree on what is “acceptable” or “good enough.” For example, if it originally took users 21 seconds to refresh or recalculate a report and after the change it takes 19 seconds, is that acceptable? Conversely, if the refresh now takes 4 seconds but updates take twice as long to process, is that acceptable?

Take your Time

It is very important to make changes to the configuration file one at a time and in an ordered manner, because the settings of some parameters may affect the behavior of others. Additionally, some settings may have no real net effect and therefore should not be modified or added to the CFG file. If a configuration setting does not have a documented positive effect on your TM1 application, remove it from the file or leave the default value unchanged (this keeps future maintenance and support simpler).

The procedure should be as follows (an illustrative single-change example appears after the list):

  • Verify the baselines
  • Shut down TM1
  • Make a single CFG modification
  • Start up TM1 (observe the effect on startup compared to the baseline)
  • Run tests (and compare results to baselines)
  • Determine “nailed or failed?”
  • Repeat
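As an illustration, a single test pass might add exactly one line to Tm1s.cfg, restart the server, and record the results before anything else is touched. The parameter and value below are hypothetical choices (MTQ, which controls multi-threaded query processing, only exists in more recent TM1 releases):

  MTQ=4

If the tests show no documented improvement over the baseline, the line is removed again before the next candidate parameter is tried.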

Observe Appropriately

Make sure that the effect of a configuration modification is observed both at TM1 server startup and over an acceptable period of time. Some parameter settings have different effects at startup versus at run time (when TM1 is up and running).

It is equally important to observe results after the initial startup of TM1 as well as periodically over time. The behavior and performance of TM1 change over time (as memory is consumed and released), and therefore some initial positive effects of a configuration change may fade over the course of normal application uptime.
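For example (assuming a TM1 release that supports it), a parameter that mainly influences how long the server takes to start, while having little effect during normal daytime use, is:

  PersistentFeeders=T

The reverse can be true of threading or locking related parameters, whose impact only shows up under sustained concurrent load.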

Be Typical

To ensure you are observing the true anticipated effect of a configuration modification, it is critical to simulate typical activity during testing. That includes:

  • Concurrent user counts (if the typical concurrent user count is 150, a test of 2 concurrent users is not valid)
  • User types (if 85 percent of your users are “heavy planners” performing numerous write-backs and your testing is with only “read-only reporters” – the test is invalid)
  • User activity (make sure that your test mimics the activity that is occurring most often when issues are reported. Is it during a planning session? During data loading?)

Consider Using Tools and Getting Help

Whenever possible, the effects of configuration modifications should be evaluated using formal performance monitoring tools, rather than simply “watching your wrist watch”. Additionally, it is advisable to have experienced support involved in the testing to decrease the risk of missing possible collateral effects of a change that seems to help.
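One built-in option (available in most recent TM1 releases, though you should confirm the details for your version) is the TM1 performance monitor, which can be enabled from the configuration file and records statistics into the }Stats control cubes (for example }StatsByClient, }StatsByCube and }StatsForServer):

  PerformanceMonitorOn=T

These statistics, together with the tm1server.log file written to the logging directory, give a more objective record of the effect of a change than manual timing alone.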


