If you’re a software tester, sooner or later someone will ask you to be involved in performance testing. If you don’t know much about performance testing, it can be pretty daunting. Even if you do know a lot about performance testing, it can still be pretty daunting, especially if there are strict time and budget restrictions…and I’m inclined to guess that there usually are. A little while ago, Anne-Marie Charrett invited me to contribute to her article “Ways to approach application performance testing on a tight budget”, which is definitely worth reading for anyone involved in performance testing.
My advice here comes in the form of a story, which I think we can all learn from. Once upon a time, there was a big web app project. The project was almost complete and was due within just a couple of weeks. Around this time, a tester called Mish joined the project, and the other testers told her that she was to take over the task of performance testing from them. Well! Mish had never done performance testing before, and told them so. One of the other testers admitted he hadn't done it before either, but they were on a tight budget and they just had to make do with what they had. He explained to Mish that he had been using JMeter to record benchmarks against an older version of the application, and that he was going to compare those times against the new version of the application.
Immediately Mish realised it was never going to work. The new version of the application was running on a complicated 6-box cluster environment, whereas the old version of the application was running on an old abandoned workstation that someone had found under a desk somewhere, with a big sign on it saying “DO NOT TOUCH”. Mish tried to point this out, but was told it was the best they could do and she would just have to deal with it. So Mish abandoned benchmarking and instead just measured the loading times of some operations in the application, presented the results and hoped for the best.
Management said that the results were fine and that they were ready to deploy. So then they added the production data to the application’s database. Hundreds of thousands of rows of production data. Well, didn’t that slow down the performance considerably! The project was delayed another month as a result.
What can we learn from this story?
- Always perform performance testing on an environment that resembles the production environment in as many ways as possible. This includes the hardware, the software, the other software running on the server, the test data, and so much more!
- If not done correctly, performance testing can produce very misleading results.
- There is MUCH more to performance testing than just page load times. Research this skill thoroughly before just trying to wing it.
- Performance problems are often not trivial to fix. Do not leave performance testing to the end of the project lifecycle. Start early!
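To give a feel for why a single page-load time tells you so little, here is a minimal sketch in Python (standard library only) of timing requests under concurrent "virtual users" and reporting percentiles rather than one number. The tiny local server, the URL, and the user count are all placeholders for illustration, not anything from the story or a substitute for a proper tool like JMeter:

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from statistics import median

# Stand-in for the application under test: a tiny local HTTP server.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

def timed_request(_):
    """Return the wall-clock latency of one GET request, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Simulate 20 virtual users each making one request at the same time.
with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = sorted(pool.map(timed_request, range(20)))

# Percentiles show the spread of response times, not just one lucky sample.
print(f"median: {median(latencies) * 1000:.1f} ms")
print(f"p95:    {latencies[int(len(latencies) * 0.95)] * 1000:.1f} ms")
server.shutdown()
```

Even this toy version makes the point: under concurrency, the slowest responses can look nothing like the median, which is exactly the kind of behaviour a single measurement on an idle machine will never reveal.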
Another way to protect your application is to monitor it after it goes live. Even if you do not have your own test lab for generating enough virtual users, you can easily use public cloud services, such as the JMeter-based http://blazemeter.com. This lets you keep track of how the application performs under real production usage.