

Autonomous Performance Tuning

I was recently asked by a former colleague to help conduct an appraisal of a large-scale Cognos TM1 development project that plans to source all kinds of data from a large Oracle RDBMS. In addition to the “normal” review objectives (is the design appropriate? are best practices being followed? will the application scale?), the review covered performance tuning. Over the course of the several evenings we spent together on this project, we compared notes on performance tuning strategies. The following are some thoughts from my Oracle friend that make very good sense regardless of the technologies being used.

Types of Performance Tuning (Act or React)

There are really only two “types” of performance tuning: proactive and reactive. Most (some?) TM1 applications go through some sort of performance testing and tuning before being deployed into a production environment. More often, though, testing and tuning only begin when application users start to complain of “unacceptable” performance. Unfortunately, by then it may be too late to use some of the most effective application tuning strategies.

By far the most effective approach to tuning any application (including a TM1 application) is the proactive one. As I’ve mentioned in some of my earlier blog posts, it is important to establish performance targets and set realistic performance expectations during the design and development stages of the project, NOT after deployment when things “just don’t seem right”. My friend pointed out that during design and development, the application designers can (based upon business requirements) determine which combination of system resources and available Oracle features will best meet those needs. A similar approach should be used when modeling an organization’s data for TM1.

Prioritized Tuning

Of course, as all projects do, this one came to an end and it was time to move on. Before he left, my friend was kind enough to provide some “additional reading” for my next plane ride. The information was “spot on”. I won’t recite all of it here (that would be way too “ochocinco”), but I’d like to comment on some specific points:

Oracle breaks tuning into “recommended steps for optimal results”, prioritized in order of diminishing returns:

Step 1: Tune the Business Rules

Step 2: Tune the Data Design

Step 3: Tune the Application Design

Step 4: Tune the Logical Structure of the Database

Step 5: Tune Database Operations

Step 6: Tune the Access Paths

Step 7: Tune Memory Allocation


Step 8: Tune I/O and Physical Structure

Step 9: Tune Resource Contention

Step 10: Tune the Underlying Platform(s)


Rather than go through each step, I’d like to call out a few of them.


Step 1: Tune the Business Rules

Here is my first “enlightenment”: as any project begins, business requirements and rules are gathered, understood, and documented. What is usually overlooked is that, for optimal performance of any application, you may have to adapt those business rules. In other words, when gathering requirements, do NOT document an implementation; document the function that must be performed. If business executives effectively distill business functions and requirements from the implementation, then the application designer (you!) will have more freedom when selecting an appropriate implementation (or TM1 model design).


Step 2: Tune (Transform) the Data Design

In the data design phase, you must determine what data is needed by your application. You need to consider what relations are important, and what their attributes are. Finally you need to structure the information to best meet application performance goals.


Most organizations retain data in a normalized state (to eliminate data redundancy and improve accuracy). After the data is normalized, however, you may need to denormalize it. You might, for example, decide to retain frequently used summary values (don’t force your application to recalculate the total price of all the lines in a given order each time it is accessed), and so on.
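As a sketch of what retaining summary values can look like on the Oracle side (the ORDER_LINES table and its columns below are hypothetical, purely for illustration), a materialized view can hold pre-computed order totals so they are calculated once rather than on every read:

-- Hypothetical ORDER_LINES table: pre-compute each order's total once
-- so nothing has to re-sum every line at query time.
CREATE MATERIALIZED VIEW order_totals_mv
  BUILD IMMEDIATE
  REFRESH ON DEMAND
AS
SELECT ol.order_id,
       COUNT(*)                         AS line_count,
       SUM(ol.quantity * ol.unit_price) AS order_total
FROM   order_lines ol
GROUP  BY ol.order_id;

Downstream queries (or a TM1 load) can then read ORDER_TOTALS_MV directly, and the refresh schedule becomes a deliberate design decision instead of a cost paid on every access.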


The point is, it is entirely appropriate to “pre-transform” data into a form that your TM1 model can easily and effectively absorb and use. Don’t design data-transformation logic into the model; let the model focus on its “value add”.
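One simple way to do that pre-transformation (again, the table and column names are made up for illustration) is a database view that already presents the data in the row shape a TM1 TurboIntegrator load expects: dimension elements first, the value last.

-- Hypothetical GL_TRANSACTIONS table, reshaped for a straight TI load:
-- Period, Account, Cost Center, then the measure value.
CREATE OR REPLACE VIEW tm1_gl_actuals_v AS
SELECT TO_CHAR(gl.posting_date, 'YYYY-MM') AS period,
       gl.account_code                     AS account,
       gl.cost_center_code                 AS cost_center,
       SUM(gl.amount)                      AS gl_amount
FROM   gl_transactions gl
GROUP  BY TO_CHAR(gl.posting_date, 'YYYY-MM'),
          gl.account_code,
          gl.cost_center_code;

The TurboIntegrator process can then simply map columns to dimensions and write the value (with CellPutN, for example); all of the reshaping stays in the database, where it is cheap and easy to test.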

Step 3: Tune the Application Design

Translate the goals of the business into an effective and reasonable system design. What comes to mind here is the organization that wants to forecast at the SKU level using TM1 (for all 25 thousand SKUs!) or “demands” real-time (as opposed to near-real-time) reporting during a planning cycle. These are examples of business “goals” that can produce application designs destined for failure.
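The arithmetic shows why (the non-SKU dimension sizes here are assumed purely for illustration): a cube’s potential cell count is the product of its dimension sizes, so 25,000 SKUs × 12 months × 500 cost centers × 20 measures is 3,000,000,000 potential cells before a version or scenario dimension is even added. The model’s size and calculation effort grow multiplicatively, which is how a reasonable-sounding goal becomes a design that is destined for failure.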


Conclusion

I believe that the project we assessed is on target and will be a success. I also enjoyed working with my friend and comparing “cross technology” notes. Lesson learned: best practices are almost always technology autonomous.

jm


