Saturday, June 20, 2009

Creating a QA process from out of thin air

This week I had the opportunity to present the work I have been doing over the last year to a group of testers at a Testing Professionals Network (TPN) meeting. I enjoyed sharing the changes we have made and how far we have come in a short time.

Agile Acceptance testing with Fitnesse

A little more about what I've been up to...

About a year ago, I swapped jobs within my company, moving from developer to the challenging and (at first) highly stressful role of QA team lead. This meant I was tasked with building a QA team to test a high-load, high-availability ad server.

This all happened about a year after the company started when we suddenly found we had enough customers that we had to be very careful about what we released. However, the product had been created with such speed that we had no automatic system tests and it was incredibly hard to get any test to run successfully on our test environment. We had also just hired our first tester, who was struggling to get anything working. So it was a big problem and it needed solving, quickly.

My response was to start building an automated testing system, starting with the area of biggest ROI (return on investment). I chose budgeting: I guessed that not overrunning our customers' promised budgets was pretty important to them, it was almost impossible to test by hand, and it was reasonably simple to automate.
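At its core, a budget test like that boils down to a single invariant an automated check can assert. Here is a minimal sketch of that invariant; the function name and data shape are hypothetical illustrations, not our actual fixture code:

```python
def spend_stays_within_budget(impression_costs, budget):
    """Return True if cumulative spend never exceeds the promised budget.

    impression_costs: iterable of per-impression costs (e.g. in cents)
    budget: the customer's promised budget, in the same units
    """
    total = 0
    for cost in impression_costs:
        total += cost
        if total > budget:
            # The ad server should have stopped serving before this point.
            return False
    return True
```

In a FitNesse-style acceptance test, a table of served impressions and promised budgets would feed a check like this, so the pass/fail criterion is readable by testers, developers, and managers alike.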

From those small beginnings the suite slowly grew. As things started to take shape, more and more developers could see the power of the system and were happy to contribute to it. Others required the boss to tell them that they must do it. Now, twelve-ish months later, thanks partly to a rewrite and thus the need to re-test everything, we probably have about 90% functionality coverage with our automated tests. Not only that, but we also have a team of developers who are enthusiastic about getting the tests running.

Having seen several other companies take a long time to develop automated tests, I am truly amazed at our progress. I believe our success is due to getting everyone involved in the testing process. Making the software developers run the tests really helped: that way they got to see how their system design affects its testability.

So one of the main themes I hoped to share in this presentation was that building a good, reliable automated system-test suite generally requires improving the testability of the system under test. Improving testability generally benefits both testers and developers, as the tools developed along the way also help during development and debugging. For example, we created an installer, tools for checking system status, and ways to easily create test data.

So creating a quality test system quickly, I believe, requires the cooperation of both testers and developers. However, putting such a system in place is tricky because it generally requires a simultaneous culture and technology change.

An example of a culture change required is that developers often believe testers are solely responsible for testing. However, developers can make a huge difference to the testability of the system depending on how they design it. So to get everyone focused on the end goal, tested and releasable software, it is important that both managers and developers understand that developers and testers share the responsibility for testing the system.

However, if a developer has not worked on a project with a good automated system test suite, they may not be able to see how they can help the tester. Then, if they see tests running erratically, they are more likely to question the value of such tests than to question the way that they themselves have designed the system.

Thus the first culture change requires the developer to be able to picture how they can assist the testing process. But getting them to make the first changes requires them to believe that the tests are worth building in the first place.

My approach to tackling this chicken-and-egg problem is to be agile. Start with a working test, then ask for small improvements that make the tests more reliable, quicker, or easier to write. Give some value, ask for an improvement; give value, ask again; give value, ask again...

So my second message to all test leads out there is: don't just keep your head down and suffer with unpredictable automated tests. Think about what you need changed in the system, and make sure your developers listen to you. It may seem like a huge effort for small gains in the beginning, but the gains snowball and save you time in the long run.

Please, take a look at the presentation and send me any comments you have. In particular if there are slides that don't make sense without the audio then write a comment and I'll explain them in more detail.

What small change can you make that will make your system more testable?


  1. Great post, thanks for sharing it! I totally agree that testability and the team working together are the key factors for successful test automation. Testers and developers working together is probably the single most important factor, as developers are sure to add all the needed testability if they are also responsible for getting the tests to pass.

  2. Here's a question I received via twitter:
    "I wanted to know how management responded to agile and change. thanks!"

    On this project I was really lucky to have a manager (also the CEO) who calls himself an agile evangelist. For us, the hardest job was to get dedicated testers on the project.

    However, if you are trying to get support from your manager to move to agile, remember to focus on issues that affect the bottom line.

    For instance, if you need programming resource to solve some of the testability issues of your automated tests, then add up how much time (=money) you are wasting re-running your tests or working out whether the failures are genuine issues or timing problems. Talk about how much time (=lost opportunity) testing takes after development is finished, and how you could reduce it to just a few hours.
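    One way to make that bottom-line argument concrete is a quick back-of-envelope calculation. All of the figures below are hypothetical placeholders; plug in your own team's numbers:

```python
# Back-of-envelope cost of flaky tests; every figure here is made up
reruns_per_week = 10      # suite re-runs caused by flaky failures
minutes_per_rerun = 30    # runtime plus triaging whether the failure is real
hourly_rate = 60          # fully loaded hourly cost of a tester

wasted_hours_per_week = reruns_per_week * minutes_per_rerun / 60
weekly_cost = wasted_hours_per_week * hourly_rate
print(f"~{wasted_hours_per_week:.0f} hours (~{weekly_cost:.0f} per week) lost to flakiness")
```

    A number like that, multiplied out over a quarter, is usually far more persuasive to a manager than "the tests are flaky".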

    If you need dedicated programming resources to do your job to the best of your ability, then it is in both your interest and your manager's to speak up about it.

    As for my company, we are now able to do more regression and functionality testing in 2 hours than we could have done in 2 weeks a year ago. What software manager's interest wouldn't be piqued by that?

  3. Great article and presentation! In the company I work for we are in the midst of trying to change culture and introduce more automated testing from the development side.

    Our constraints are that we are a very small team and we have a huge backlog. So, to balance pleasing the product managers with moving forward on testability, we often follow one development iteration with a debt iteration that is largely dedicated to paying off testing and engineering debt.

    This way everyone sees progress and the system becomes more stable, reliable, testable, etc. Everyone says they want better quality, but when push comes to shove they often ask for features over debt. Being able to satisfy the demand for features while still moving quality forward is key to the long-term viability of the project.