One of the main reasons we undertake performance testing is to mitigate risk. When defining performance objectives, consideration should be given to the risks to be covered. Because risk usually needs to be balanced against constraints such as time and cost, it is unlikely that all risks can be covered, so it is important to focus on the key risks, i.e. potential customer risks such as the following:
- System is too slow
- System may break at peak times
- System may break under ad hoc pressure, i.e. a sudden spike in traffic
- System may slow down over several months due to high data volumes
- System behaviour may degrade over time
Performance Test Objectives
The first step is to establish and detail the objectives of the performance test.
Note: performance objectives are normally set out to address these risks.
Whilst setting objectives it may be useful to use the SMART criteria (Specific, Measurable, Achievable, Realistic and Time-bound):
- Specific – Clearly identify and define what the objective is.
- Measurable – Is the objective quantifiable? (Response time is one example of a measurement that can be used.)
- Achievable – Make sure the objective can be met.
- Realistic – Ensure the objective reflects a practical scenario which is true to life.
- Time-bound – Specify the timeframe for the objective, e.g. the “peak hour” of system load.
See the following examples:
- Prove that system performance meets the target response times detailed in the Service Level Agreement (SLA), for example:
  i. Normal – 100 concurrent users sustained over the period of an hour.
  ii. Peak – 200 concurrent users sustained over the period of an hour.
- Prove that the system can handle a sudden surge in traffic of 200 concurrent users within a two-minute timeframe while still meeting the target response times.
- Prove that system performance remains stable under an extended period of high load (200 concurrent users) and that target response times are still met over a period of 48 hours.
Once your performance objectives are defined, the next step is to define the performance test requirements, i.e. to detail the testing required to prove that your objectives are met.
Performance Test Requirements
Performance test requirements should never be an afterthought. They should be defined early in the application lifecycle model as high-priority requirements that are testable.
The following are examples of performance test requirements to meet the objectives outlined above.
- Execute a ‘Load’ test to determine whether the system will perform with acceptable response times, at loads of up to 200 concurrent users, running a scenario of typical business processes.
“Load Testing” is performed to evaluate the performance of the system under a given load. The load is normally the expected number of concurrent users on the application performing a specific number of transactions within a set duration. A load test will normally determine the response time, throughput, CPU utilisation and error rate during the period of the test. Monitoring elements such as CPU, memory, network, database and application while the test runs can assist with the identification of bottlenecks.
An example scenario: a new banking web application is expected to have a peak load of 200 concurrent users. A load test could be configured with 200 virtual users and run for a period of an hour. Once the test has run, the results can be analysed to determine how the application behaves at its expected peak load.
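The mechanics of such a load test can be sketched in a few lines of Python. This is a minimal, self-contained illustration, not a production tool (dedicated tools such as JMeter or LoadRunner would normally be used): `make_request` is a hypothetical stand-in that simulates a call to the system under test, and the virtual-user count is kept small so the sketch runs quickly.

```python
import concurrent.futures
import random
import statistics
import time

def make_request():
    """Stand-in for a real HTTP call to the system under test; here the
    response time (50-200 ms) and a ~1% error rate are simulated."""
    time.sleep(random.uniform(0.05, 0.2))
    return random.random() > 0.01  # True = success

def user_session(requests_per_user=3):
    """One virtual user issuing a fixed number of requests."""
    timings, errors = [], 0
    for _ in range(requests_per_user):
        start = time.perf_counter()
        ok = make_request()
        timings.append(time.perf_counter() - start)
        if not ok:
            errors += 1
    return timings, errors

def load_test(concurrent_users=50, requests_per_user=3):
    """Run the virtual users in parallel and aggregate response times
    and the error rate, as a load test report would."""
    all_timings, total_errors = [], 0
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(user_session, requests_per_user)
                   for _ in range(concurrent_users)]
        for future in concurrent.futures.as_completed(futures):
            timings, errors = future.result()
            all_timings.extend(timings)
            total_errors += errors
    return {
        "avg_response_s": statistics.mean(all_timings),
        "p95_response_s": statistics.quantiles(all_timings, n=20)[-1],
        "error_rate": total_errors / len(all_timings),
    }

results = load_test(concurrent_users=50, requests_per_user=3)
print(results)
```

In a real test the virtual-user count would be set to the expected peak (e.g. 200) and the averages and percentiles compared against the SLA targets.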
- Execute a ‘Spike Test’ to evaluate the behaviour of the system when there is a sudden, sharp increase in the number of users. The goal is to determine whether:
- performance will suffer
- the system will fail
- the system can cope with dramatic changes in load
For example, on a sports betting web application, the number of users can increase suddenly within a very short time frame just before a football match kicks off, as customers place bets on the match.
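The spike-test idea can be sketched as two waves of simulated users: a small baseline wave, then a sudden surge where all users arrive at once. This is a hypothetical model, not a real betting system; `make_request` simply assumes that latency grows with the number of concurrently active users so the degradation under a spike is visible.

```python
import statistics
import threading
import time

active_users = 0
counter_lock = threading.Lock()

def make_request():
    """Stand-in for a real call to the system under test; latency is
    modelled (an assumption) as growing with concurrent active users."""
    global active_users
    with counter_lock:
        active_users += 1
        load = active_users
    time.sleep(0.01 + 0.001 * load)  # more concurrent load, slower response
    with counter_lock:
        active_users -= 1

def run_wave(n_users):
    """Launch n_users simultaneously and return the median response time."""
    timings = []
    timings_lock = threading.Lock()

    def user():
        start = time.perf_counter()
        make_request()
        elapsed = time.perf_counter() - start
        with timings_lock:
            timings.append(elapsed)

    threads = [threading.Thread(target=user) for _ in range(n_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return statistics.median(timings)

baseline_median = run_wave(5)    # normal traffic
spike_median = run_wave(100)     # sudden surge: all users arrive at once
print(f"baseline: {baseline_median:.3f}s  spike: {spike_median:.3f}s")
```

Comparing the two medians shows how sharply response times degrade when the surge hits, which is exactly what a spike test sets out to measure.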
- Execute a ‘Soak Test’ to determine if the system can sustain a continuous expected load over a significant period.
During a soak test, memory utilisation is monitored to detect any potential memory leaks. Performance degradation can also be checked to ensure that the throughput/response times after a long period of sustained activity have not been affected.
Keeping the sports betting application as our example, where the application is used continuously by different users placing bets, say in the run-up to the start of the Grand National horse race, a test can be run with a predicted load for a duration of between 24 and 48 hours, with memory utilisation monitored throughout.
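The memory-monitoring side of a soak test can be illustrated with Python's built-in `tracemalloc` module. This sketch deliberately plants a leak (a cache that is never pruned, a hypothetical stand-in for application work) and samples memory usage at intervals; a real soak test would sample over 24-48 hours rather than a few loop iterations.

```python
import tracemalloc

def process_bet(cache):
    """Stand-in for one unit of application work; appending to a cache
    that is never pruned is a deliberately planted leak for illustration."""
    cache.append("bet-record" * 100)

tracemalloc.start()
cache = []
snapshots = []
# A real soak test would sample periodically over 24-48 hours; here a
# few short iterations stand in for those sampling intervals.
for interval in range(5):
    for _ in range(1000):
        process_bet(cache)
    current, peak = tracemalloc.get_traced_memory()
    snapshots.append(current)
    print(f"interval {interval}: {current / 1024:.0f} KiB in use")
tracemalloc.stop()

# Memory that rises steadily across every interval, rather than
# plateauing, suggests a leak worth investigating.
leak_suspected = all(b > a for a, b in zip(snapshots, snapshots[1:]))
print("leak suspected:", leak_suspected)
```

In practice the same trend analysis would be applied to memory metrics collected by the monitoring tooling attached to the system under test.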
In conclusion, prior to performance test execution it is paramount to detail the objectives and requirements that will guide the testing.
To see how SQA Consulting may assist your company in performance testing your applications, please contact us.