If you’ve ever attempted to performance test a TM1 application, you know that there is no practical way to manually create sufficient load on a model. Getting “real users” to execute application operations – over and over again – is nearly impossible. Thankfully, an automated testing product can fill this gap.
Automated testing refers to creating and running test “scripts” mechanically, concurrently, for as long as you need to. There are many tools today that can be used for automating your performance testing. One of the best is HP LoadRunner.
What is LoadRunner?
HP LoadRunner is a performance and test automation product from Hewlett-Packard that can be used for examining system behavior and performance, while generating actual load on your TM1 application.
HP’s description of the product is:
“HP LoadRunner is the industry-standard performance testing product for predicting system behavior and performance. Using limited hardware resources, LoadRunner emulates hundreds or thousands of concurrent users to put the application through the rigors of real-life user loads.”
LoadRunner is made up of three main components:
- Virtual User Generator or “VUGen” (where you create the “scripts” that define virtual users for your application; a minimal script skeleton is sketched after this list)
- The Controller (the “user script runner”)
- Analysis Engine (shows the detailed results)
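To make the VUGen component more concrete, the sketch below shows roughly how a new Vuser script is organized. The default section names (vuser_init, Action, vuser_end) are LoadRunner’s own; the comments and any logic you would place in them are purely illustrative.

```c
// Rough skeleton of a LoadRunner Vuser script as generated by VUGen.
// VUGen compiles these C sections itself, so no explicit #includes are shown.

vuser_init()
{
    // Runs once per Vuser: log in, establish the session, set credentials.
    return 0;
}

Action()
{
    // Runs once per iteration: the business process you want to load test.
    return 0;
}

vuser_end()
{
    // Runs once per Vuser at the end: log out and release resources.
    return 0;
}
```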
LoadRunner Objectives
The LoadRunner website goes on to outline typical performance testing objectives as:
- Emulate conditions of controlled load and maximum load on the application environment
- Measure application performance under load (response time, memory, throughput, etc.)
- Check where performance delays occur: network or client delays, CPU performance, I/O delays, database locking, or other issues at the server
- Monitor the network and server resources under load
Plan
A clearly defined routine (a plan) will make sure that your performance test is effective. Just because you can automate activities doesn’t guarantee that those activities are meaningful, based upon your objectives. It is advisable to clearly define and document even the most obvious information. A “best practice” recommendation is to develop a “performance testing questionnaire template” that is filled out before testing begins. The questionnaire will ensure that all of the “ingredients” required are available at the appropriate time during the testing routine.
The following are examples of detail provided in a typical questionnaire:
Generalizations
- Define “the team” – Identify application owners and their contact information
- Are there any SLAs in place?
- What are the start and (expected) end dates for the testing?
- Name the environment to perform the test (QA? Production?) – including platform, servers, network, database, Web services, etc.
- What about network specifics – is the application accessible to outside users via the internet? Is there a specific bandwidth allocated to the application? Dedicated or burstable? Is the application hosted internally or externally? Is any load balancing occurring?
- Describe the details of all components of the infrastructure required to support this application (IP Address, Server Name, Software/version, OS (version/SP), CPU, Memory, Disk Sizes, etc.)
- From a client perspective, how does a user access the application? A web browser or something else (Citrix, thick client, etc.)? If a browser is used, define the supported browsers and any required technologies (Java, ActiveX)
- Identify the conceptual goals of the performance tests – for example “to support expected user concurrency with acceptable response times.” And then provide quantitative goals – such as “simulate 75 concurrent users due to the nature of the business process which has the worldwide Affiliates (largest group of users) performing updates and reporting in a period of less than 24 hours”.
- Document test requirements such as response time, load, memory, CPU utilization, etc., and provide any existing performance metrics
Application, User and Use-Case Overviews
- What is the application to be tested? What is the purpose of the application (an overview)? (It is recommended that a high-level architectural diagram be provided)
- Name the critical vs. non-critical transactions within the application under test
- Describe user behavior, user locations, and the timeframes during which the application under test is used
- List the business use cases that the application addresses, with a brief description of each, then define the business processes within your application that will be included in the scope of the performance test. For example, “Sales Planning contributors enter, review and adjust a current sales plan.”
- Define the categories of users for your application; identify which business processes are used by these users and state the percentage of the overall system activity this category of user represents.
- Estimate the average and peak number of simultaneous users for each business process (as seen in a normal day of business). Indicate which business processes may experience heavy throughput or are considered mission critical.
- For each business process selected to be included in the performance test, document the specific user steps required to complete it and determine whether each step requires input data.
- For each business process selected to be included in the performance test, document the number of concurrent users and the percentage of total concurrent users that will be assigned to that process.
- Finally, for each remote location to be simulated, document the number of concurrent users and the percentage of total concurrent users that will be assigned to each process for that site.
Preparation
Once the questionnaire is completed, the performance test team will use the information to create virtual user, or “Vuser,” scripts within LoadRunner. Vusers emulate human users interacting with your application. A Vuser script contains the actions that each Vuser performs during scenario execution.
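As an example, the following is a minimal sketch of an Action() section for a web-protocol Vuser hitting a TM1 Web page. The LoadRunner functions shown (lr_start_transaction, web_url, lr_think_time) are standard, but the server URL, port, transaction name, and think time are hypothetical placeholders that would come from your own questionnaire.

```c
// Hedged sketch: one business step ("open a sales plan view") scripted as a
// web Vuser action. The TM1 Web URL below is a placeholder, not a real endpoint.

Action()
{
    // Time the step as a named transaction so it appears in the Analysis results
    lr_start_transaction("open_sales_plan_view");

    web_url("open_view",
        "URL=http://tm1server:9510/tm1web/Workflow.jsp",  // hypothetical server/port
        "Resource=0",
        "Mode=HTML",
        LAST);

    lr_end_transaction("open_sales_plan_view", LR_AUTO);

    // Pause to mimic a real user's think time between steps (seconds)
    lr_think_time(5);

    return 0;
}
```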
Test
Once the Vuser scripts are constructed, you can emulate load by instructing multiple Vusers to perform tasks simultaneously (load level is set by increasing or decreasing the number of Vusers that perform tasks at the same time).
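One way to force that simultaneous behavior inside a script is a rendezvous point, sketched below. The rendezvous and transaction names, the URL, and the submitted value are illustrative only; the number of Vusers released together is set in the Controller scenario, not in the script.

```c
// Hedged sketch: Vusers accumulate at the rendezvous, then the Controller
// releases them together so the update hits the server at the same moment.

Action()
{
    // All Vusers wait here until the Controller's rendezvous policy releases them
    lr_rendezvous("submit_plan_update");

    lr_start_transaction("submit_plan_update");

    web_submit_data("submit_update",
        "Action=http://tm1server:9510/tm1web/UpdateCell.jsp",  // hypothetical URL
        "Method=POST",
        ITEMDATA,
        "Name=cellValue", "Value=125000", ENDITEM,  // placeholder form field
        LAST);

    lr_end_transaction("submit_plan_update", LR_AUTO);

    return 0;
}
```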
Presentation
The LoadRunner analysis engine helps you “slice and dice” generated test data in many ways to determine which transactions passed or failed (your test objectives defined in the questionnaire), as well as some potential causes of failure. You can then generate custom reports to present to stakeholders.
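Pass/fail status is more meaningful in Analysis when the script verifies response content rather than only timing the request. The sketch below uses a text check to mark the transaction LR_PASS or LR_FAIL; the text searched for and the URL are assumptions for illustration.

```c
// Hedged sketch: register a content check before the request, then end the
// transaction with an explicit pass/fail status based on whether it was found.

Action()
{
    // Count occurrences of the expected text in the response
    web_reg_find("Text=Sales Plan", "SaveCount=view_found", LAST);

    lr_start_transaction("open_sales_plan_view");

    web_url("open_view",
        "URL=http://tm1server:9510/tm1web/Workflow.jsp",  // hypothetical URL
        "Mode=HTML",
        LAST);

    // Mark the transaction passed only if the expected text was present
    if (atoi(lr_eval_string("{view_found}")) > 0)
        lr_end_transaction("open_sales_plan_view", LR_PASS);
    else
        lr_end_transaction("open_sales_plan_view", LR_FAIL);

    return 0;
}
```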
Conclusion
Any enterprise-level application should be performance tested before production delivery and periodically during its life. In practice, performance testing is typically conducted prior to delivery of the application but is not executed thereafter; it is recommended that a schedule be established for re-executing the performance tests, especially for applications that are evolving (new features being added) or experiencing a widening user base.