
Agile Performance Testing - Testing Experience
by Sowmya Karunakaran

To accomplish performance testing, we need to engage in a careful, controlled process of measurement and analysis. It is critical to understand that the aim of performance testing is not to find bugs, but rather to identify bottlenecks. Ideally, the software under test should be stable enough for this process to proceed smoothly.

Most large corporations today have performance testing and engineering groups, and performance testing is a mandatory step for getting a system into production. In most cases, however, it is pre-production performance validation only, so there is a clear need for a more efficient way to do performance testing. It is often believed that agile introduces chaos and is therefore unsuitable for performance testing, which is a careful, controlled process. This article shows that doing performance testing in a more agile way can increase its efficiency significantly.

Teams practicing XP always have working software, thanks to their continuous integration practices, and they also maintain a suite of automated tests. XP, coupled with agile planning and automated performance regression tests, is a big step towards agile performance testing.
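An automated performance regression test of the kind mentioned above can be as simple as a timing assertion around an operation. The following is a minimal sketch; `search_catalog` and the 0.5-second budget are illustrative assumptions, not taken from the article.

```python
import time

# Hypothetical system under test: any callable whose latency we want to guard.
def search_catalog(query):
    time.sleep(0.01)  # stand-in for real work
    return [item for item in ("apple", "apricot", "banana") if query in item]

BUDGET_SECONDS = 0.5  # assumed service-level target

def assert_within_budget(fn, *args, budget=BUDGET_SECONDS):
    """Run fn and fail if it exceeds the agreed latency budget.

    In a CI pipeline this check would run on every build, turning the
    budget into an automated performance regression test.
    """
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    assert elapsed <= budget, f"{fn.__name__} took {elapsed:.3f}s, budget {budget}s"
    return result

print(assert_within_budget(search_catalog, "ap"))
```

Run as part of continuous integration, such a check fails the build as soon as a change pushes the operation past its budget.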

Let’s look at how some of the agile principles, such as test early, continuous improvement and collaboration, can be applied to performance testing.

Test Early – Early Performance Testing

The main obstacle here is that many systems are colossal; even where there are separable parts, they do not make much sense on their own. However, there can be significant advantages to test-driven development. If we can decompose the system into components in such a way that we can actually test them separately for performance, then all we need to do is fix integration issues, if any, when we put the system together. Another problem is that large corporations use a lot of third-party products, where the system is usually a "black box" that cannot be easily understood, which makes it more difficult to test effectively.
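Testing decomposed components separately for performance can be as lightweight as timing each one in isolation before integration. The sketch below assumes two made-up components, `parse_component` and `transform_component`; only the timing technique is the point.

```python
import statistics
import time

# Hypothetical components of a decomposed system, exercised in isolation.
def parse_component(payload):
    return payload.split(",")

def transform_component(fields):
    return [f.strip().upper() for f in fields]

def measure(fn, arg, runs=100):
    """Time one component over several runs; return the median latency."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(arg)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

parse_ms = measure(parse_component, "a, b, c") * 1000
transform_ms = measure(transform_component, ["a ", " b"]) * 1000
print(f"parse: {parse_ms:.4f} ms, transform: {transform_ms:.4f} ms")
```

With per-component baselines like these in hand, integration testing only has to confirm that assembling the components introduces no new bottleneck.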

Though it is well understood that it is better to build performance in up front, i.e. during the design stage, and to pursue performance-related activities throughout the whole software lifecycle, quite often performance testing happens just before going live, within a very short timeframe allocated to it. Still,


approaching performance testing formally, with a rigid, step-by-step process and narrow specialization, often leads to missing performance problems altogether, or to a prolonged agony of performance troubleshooting. With a little extra effort, making the process more agile increases the efficiency of performance testing significantly, and this extra effort usually pays off many times over even before performance testing ends.

Iterative Performance Testing

Iteration is crucial for success in performance testing, and an agile environment is certainly conducive to it. For example, a tester runs a test and gets a lot of information about the system. To be efficient, he needs to analyze the feedback he got from the system, make modifications to the system and adjust the plans if necessary. This cycle, repeated until the system under test achieves the expected levels of performance, yields better results and also serves as a baseline for regression tests.
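The run-analyze-tune cycle can be sketched as a loop that stops once the target is met. The `FakeSystem` below is purely illustrative: its "latency" improves with each tuning step, standing in for real load tests and real fixes.

```python
TARGET_SECONDS = 0.2  # assumed performance target

class FakeSystem:
    """Illustrative stand-in: latency halves with each tuning step."""

    def __init__(self):
        self.latency = 1.0  # pretend-measured response time

    def run_load_test(self):
        # In reality: drive load against the system and collect measurements.
        return self.latency

    def tune(self):
        # In reality: analyze the feedback, fix a bottleneck, adjust plans.
        self.latency /= 2

system = FakeSystem()
iterations = 0
while system.run_load_test() > TARGET_SECONDS:
    system.tune()
    iterations += 1
print(f"met {TARGET_SECONDS}s target after {iterations} tuning iterations")
```

Each pass through the loop corresponds to one iteration of the cycle: run the test, analyze the result, modify the system, and re-test.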

Performance Optimization (Continuous Improvement)

Performance testing becomes meaningless if it is not complemented by performance optimization. Usually, when the performance of a system needs to be improved, there are one or two methods in which all the time is spent. Agile optimization works by garnishing existing unit tests with a timing element.
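One way to garnish an existing unit test with a timing element is a decorator that asserts a runtime budget around the unchanged test body. This is a sketch; the budget value and the `test_lookup` example are assumptions for illustration.

```python
import functools
import time

def timed(budget_seconds):
    """Decorator that adds a timing element to an existing test function."""
    def wrap(test_fn):
        @functools.wraps(test_fn)
        def run(*args, **kwargs):
            start = time.perf_counter()
            result = test_fn(*args, **kwargs)  # original test logic, untouched
            elapsed = time.perf_counter() - start
            assert elapsed <= budget_seconds, (
                f"{test_fn.__name__}: {elapsed:.3f}s exceeded {budget_seconds}s")
            return result
        return run
    return wrap

@timed(budget_seconds=0.25)  # assumed budget, derived from a baseline run
def test_lookup():
    table = {i: i * i for i in range(1000)}
    assert table[999] == 998001

test_lookup()  # must pass both functionally and within the time budget
```

The existing test still verifies correctness; the decorator simply makes it fail as well when the method under test slows down past the baseline-derived budget.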

The process involves six easy steps:

1. Profile the application.
2. Identify the bottlenecks.
3. Find the average runtime of the methods and create a baseline.
4. Embellish the test with the desired runtime.
5. Refactor / optimize the method.
6. Repeat until finished.
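Steps 1 through 4 can be sketched with the standard-library profiler. Here `slow_method` is a made-up bottleneck, and the allowed factor over the baseline is an assumption, not a figure from the article.

```python
import cProfile
import io
import pstats
import time

def slow_method():
    # Deliberately heavy loop standing in for a real bottleneck.
    return sum(i * i for i in range(50_000))

# Steps 1-2: profile the application and see where the time is spent.
profiler = cProfile.Profile()
profiler.enable()
slow_method()
profiler.disable()
report = io.StringIO()
pstats.Stats(profiler, stream=report).sort_stats("cumulative").print_stats(5)

# Step 3: find the average runtime of the hot method and create a baseline.
runs = 5
start = time.perf_counter()
for _ in range(runs):
    slow_method()
baseline = (time.perf_counter() - start) / runs

# Step 4: embellish a test with the desired runtime (here: no worse than
# 3x the baseline; the factor is an assumed tolerance).
def test_slow_method_runtime():
    start = time.perf_counter()
    slow_method()
    assert time.perf_counter() - start <= baseline * 3

test_slow_method_runtime()
print(f"baseline: {baseline * 1000:.2f} ms")
```

Steps 5 and 6 then close the loop: optimize the method, re-run the embellished test, and repeat until it passes with room to spare.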

However, care must be taken to ensure that the data used for testing …

64 The Magazine for Professional Testers www.testingexperience.com
