

Performance Testing Your Cognos TM1 Application

Performance testing a TM1 model begins with understanding performance testing in general.

Performance testing is an iterative process, and the expected outcome (acceptable performance) depends on both the TM1 application and the environment it is deployed in. The goal of performance testing is to identify the response times, throughput, and resource utilization of the overall application. (In general, response time is a user concern, throughput is a business concern, and resource utilization is a system concern.)
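As a concrete illustration of those three measures, the sketch below times a batch of queries to derive response time and throughput, and samples client-side memory as a rough resource-utilization figure. The run_query function and its simulated delay are placeholders, not part of the original post; substitute whatever TM1 workload you want to profile (for example, an MDX query issued through the TM1 REST API).

```python
import time
import statistics
import resource  # Unix-only; use psutil for a cross-platform memory reading


def run_query(query_id: int) -> None:
    """Placeholder for a real TM1 workload (e.g. an MDX query via the
    REST API). Replace with the operation you actually want to profile."""
    time.sleep(0.05)  # simulate server work


def profile(num_queries: int = 100) -> None:
    latencies = []
    start = time.perf_counter()
    for i in range(num_queries):
        t0 = time.perf_counter()
        run_query(i)
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start

    # Response time: a user concern
    print(f"median response time : {statistics.median(latencies):.3f}s")
    print(f"95th percentile      : {sorted(latencies)[int(0.95 * len(latencies))]:.3f}s")
    # Throughput: a business concern
    print(f"throughput           : {num_queries / elapsed:.1f} queries/sec")
    # Resource utilization: a system concern (client-side proxy only;
    # ru_maxrss is reported in KB on Linux, bytes on macOS)
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    print(f"peak client memory   : {peak_kb / 1024:.1f} MB")


if __name__ == "__main__":
    profile()
```

Server-side utilization (TM1 server memory and CPU) would be captured separately from the TM1 host itself; the sketch only shows how the three kinds of numbers fall out of one profiling run.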

The following image outlines the flow of the performance testing process:

[Figure: performance testing process flow]

Performance & Performance Testing

The first step in performance testing is to formulate a clear understanding of exactly what performance and performance testing really are. An application's performance plays a key role in its overall value, and a performance testing strategy is critical to establishing and maintaining acceptable application performance levels.

Performance

  • Is specific and measurable (multiple performance KPIs should therefore be established for the application, and these KPIs should reflect the effect of time zones, the time of day or day of the month, the type of user, etc.; a minimal KPI record is sketched after this list).
  • Includes intangibles (such as the level of effort required to maintain the application being tested and the general usability of the application).
  • Will change over time due to system maturity and other factors (and therefore requires regular (re)profiling and review of all established performance KPIs).
  • Is comparable (to similar applications or to older versions of itself, which is key to ensuring that performance expectations continue to be met over time).
  • Is multidimensional (application performance can be measured as the speed, in specific units, of completing specific application tasks, as overall application memory usage during normal operations, and so on).
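One way to keep KPIs specific, measurable, and comparable is to record every observation with the same structure. The record below is only a sketch; the field names and sample values are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import datetime


@dataclass
class PerformanceKpi:
    """One measurable, comparable KPI observation.

    The fields mirror the dimensions called out above: what was measured,
    in which unit, when, and under which conditions (type of user,
    application task, time of day / day of month)."""
    application: str        # e.g. "TM1 Planning Model"
    task: str               # e.g. "open budget input view"
    metric: str             # e.g. "response_time", "memory_mb"
    value: float
    unit: str               # e.g. "seconds", "MB"
    user_type: str          # e.g. "contributor", "reviewer"
    measured_at: datetime   # captures time-of-day / day-of-month effects


# An example observation for one task; re-measuring the same task over
# time is what makes the KPI comparable.
sample = PerformanceKpi(
    application="TM1 Planning Model",
    task="open budget input view",
    metric="response_time",
    value=2.4,
    unit="seconds",
    user_type="contributor",
    measured_at=datetime(2015, 6, 30, 9, 15),
)
```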

Performance Testing

  • Is an iterative process (profile, evaluate, adjust, repeat!) and an ongoing process (to be re-measured over time),
  • Requires controlled, consistent, and meaningful profiling methods (the same explicit process must be used for each profiling session),
  • Is dependent upon established baselines (baselines are used to determine the degree of success or failure),
  • Must be recorded and kept in a consistent manner (always use a standardized collection method and save everything in the same, safe repository, preferably a searchable database; a minimal sketch follows this list), and
  • Must present test results in a consistent, readable format (the format should summarize the collected data to enable its analysis and should allow a knowledgeable reader to understand what was reported and easily draw a conclusion).
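A minimal sketch of a searchable repository and a baseline comparison follows, assuming SQLite and a 10% tolerance threshold; the schema, run identifiers, and threshold are my own illustrative choices, not anything prescribed by the post.

```python
import sqlite3

# A minimal, searchable repository for test results (schema is illustrative).
conn = sqlite3.connect("perf_results.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS results (
        run_id      TEXT,
        task        TEXT,
        metric      TEXT,
        value       REAL,
        unit        TEXT,
        recorded_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")


def record(run_id: str, task: str, metric: str, value: float, unit: str) -> None:
    """Store every profiling session with the same structure."""
    conn.execute(
        "INSERT INTO results (run_id, task, metric, value, unit) VALUES (?, ?, ?, ?, ?)",
        (run_id, task, metric, value, unit),
    )
    conn.commit()


def within_baseline(task: str, metric: str, baseline_run: str, current_run: str,
                    tolerance: float = 0.10) -> bool:
    """Pass if the current value is within `tolerance` (an assumed 10%
    threshold here) of the recorded baseline."""
    query = "SELECT value FROM results WHERE run_id = ? AND task = ? AND metric = ?"
    baseline = conn.execute(query, (baseline_run, task, metric)).fetchone()[0]
    current = conn.execute(query, (current_run, task, metric)).fetchone()[0]
    return current <= baseline * (1 + tolerance)


record("baseline-2015-06", "open budget input view", "response_time", 2.4, "seconds")
record("release-2015-07", "open budget input view", "response_time", 2.6, "seconds")
print(within_baseline("open budget input view", "response_time",
                      "baseline-2015-06", "release-2015-07"))
```

Because every run lands in the same table with the same columns, results stay comparable across sessions and can be summarized into a consistent report format for readers.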

Now that we have aligned on what is really meant by performance (and performance testing), we can proceed with establishing our baselines (which I will explain in my next post).

Until next time.

 


