
Autonomous Performance Tuning

I was recently asked by a former colleague to help conduct an appraisal of a large-scale Cognos TM1 development project that plans to source all kinds of data from a large Oracle RDBMS. In addition to the “normal” review objectives (is the design appropriate? are best practices being followed? will the application scale?) was performance tuning. Over the several evenings we spent together on this project, we compared notes on performance tuning strategies. The following are some thoughts from my Oracle friend that make very good sense regardless of the technologies being used.

Types of Performance Tuning (Act or React)

There are really only two “types” of performance tuning – proactive and reactive. Most (some?) TM1 applications go through some sort of performance testing and tuning before being deployed into a production environment. More often, though, testing and tuning only begin when application users start to complain of “unacceptable” performance. Unfortunately, by then it may be too late to use some of the most effective application tuning strategies.

By far the most effective approach to tuning any application (including a TM1 application) is the proactive one. As I’ve mentioned in some of my earlier blog posts, it is important to establish performance targets and set realistic performance expectations during the design and development stages of the project, NOT after deployment, when things “just don’t seem right”. My friend pointed out that during design and development, the application designers can (based upon business requirements) determine which combination of system resources and available Oracle features will best meet those needs. A similar approach should be used when modeling an organization’s data for TM1.

Prioritized Tuning

Of course, as all projects do, this one came to an end and it was time to move on. Before he left, my friend was kind enough to provide some “additional reading” for my next plane ride. The information was “spot on”. I won’t recite all of it here (that would be way too “ochocinco”), but I’d like to comment on some specific points:

Oracle breaks tuning into “recommended steps for optimal results” and these steps are prioritized in order of diminishing returns. These steps are:

Step 1: Tune the Business Rules

Step 2: Tune the Data Design

Step 3: Tune the Application Design

Step 4: Tune the Logical Structure of the Database

Step 5: Tune Database Operations

Step 6: Tune the Access Paths

Step 7: Tune Memory Allocation

Step 8: Tune I/O and Physical Structure

Step 9: Tune Resource Contention

Step 10: Tune the Underlying Platform(s)


Rather than go through each step, I’d like to call out a few.


Step 1: Tune the Business Rules

Here is my first “enlightenment” – as any project begins, business requirements and rules are gathered, understood, and documented. What is usually overlooked is that, for optimal performance of any application, you may have to adapt those business rules. In other words, when gathering requirements, do NOT document an implementation; document the function that must be performed. If business executives effectively distill business functions or requirements from the implementation, then the application designer (you!) will have more freedom when selecting an appropriate implementation (or TM1 model design).


Step 2: Tune (Transform) the Data Design

In the data design phase, you must determine what data is needed by your application. You need to consider which relations are important and what their attributes are. Finally, you need to structure the information to best meet application performance goals.


Most organizations retain data in a normalized state (to eliminate data redundancy and improve accuracy). After the data is normalized, however, you may need to denormalize it for performance. You might, for example, decide to retain frequently used summary values (don’t force your application to recalculate the total price of all the lines in a given order each time it is accessed), and so on.
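As a minimal sketch of that order-total example (the record layout and field names here are invented purely for illustration), a summary value can be computed once at load time instead of being re-derived on every access:

```python
from collections import defaultdict

# Hypothetical order-line records; the field names are illustrative only.
order_lines = [
    {"order_id": 1001, "qty": 2, "unit_price": 9.50},
    {"order_id": 1001, "qty": 1, "unit_price": 4.00},
    {"order_id": 1002, "qty": 3, "unit_price": 2.25},
]

def summarize_order_totals(lines):
    """Pre-compute one total per order so consumers never re-sum the lines."""
    totals = defaultdict(float)
    for line in lines:
        totals[line["order_id"]] += line["qty"] * line["unit_price"]
    return dict(totals)

# Computed once during the load, then stored alongside the detail rows.
order_totals = summarize_order_totals(order_lines)
```

The denormalized `order_totals` can then be retained with the data, trading a little redundancy for much cheaper reads.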


The point is, it is very appropriate to “pre-transform” data into a form that your TM1 model can easily and effectively absorb and use. Don’t “design into” the model logic to transform data – let the model focus on its “value add”.
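To make the “pre-transform” idea concrete, here is a hedged sketch (all table and field names are hypothetical) that resolves lookups up front and emits flat, load-ready records, so the model itself does no joins:

```python
# Hypothetical normalized lookups and fact rows; names are illustrative only.
products = {10: "Widget", 20: "Gadget"}   # product_id -> product name
regions = {1: "East", 2: "West"}          # region_id  -> region name

sales = [
    {"product_id": 10, "region_id": 1, "amount": 100.0},
    {"product_id": 20, "region_id": 2, "amount": 250.0},
]

def pre_transform(rows):
    """Resolve foreign keys before the load; emit flat, cube-ready records."""
    return [
        {"product": products[r["product_id"]],
         "region": regions[r["region_id"]],
         "amount": r["amount"]}
        for r in rows
    ]

flat = pre_transform(sales)
```

In practice this shaping would live in the source extract (a view or ETL step), leaving the model free to do only its “value add”.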

Step 3: Tune the Application Design

Translate the goals of the business into an effective and reasonable system design. What comes to mind here is the organization that wants to forecast at the SKU level using TM1 (for all 25 thousand SKUs!) or “demands” real-time (as opposed to near-real-time) reporting during a planning cycle. These are examples of business “goals” that can produce application designs destined for failure.
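One way to negotiate a goal like the 25-thousand-SKU forecast is to plan at a coarser level and roll detail up before it ever reaches the model. A hypothetical sketch (SKU and family names invented for illustration):

```python
from collections import defaultdict

# Hypothetical SKU-level history; (sku, product_family, value) tuples.
sku_history = [
    ("SKU-0001", "FamilyA", 120.0),
    ("SKU-0002", "FamilyA", 80.0),
    ("SKU-0003", "FamilyB", 45.0),
]

def roll_up(rows):
    """Aggregate SKU-level values to the family level the model plans at."""
    totals = defaultdict(float)
    for _sku, family, value in rows:
        totals[family] += value
    return dict(totals)

# The planning model then sees a handful of families, not thousands of SKUs.
family_totals = roll_up(sku_history)
```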



I believe that the project we assessed is on target and will be a success. I also enjoyed working with my friend and comparing “cross-technology” notes. Lesson learned – best practices are almost always technology autonomous.



Perficient Data & Analytics