The purpose of testing is to help us get our work done quickly; however, in many organisations, testing is seen as something that slows work down.
A common misconception holds that quality and delivery speed are opposing forces that must be traded off against each other. This mindset leads to the idea that automation speeds up delivery simply by making the act of testing a system go faster. These misconceptions can easily lead to expensive, failed test automation initiatives.
In fact, quality is an enabler of delivery speed. The goal of automated testing is to help teams keep the quality of their system high through fast feedback. Combined with a good team culture and a discipline that prioritises quality, automated tooling helps a team find quality issues fast, so it can respond and fix them quickly.
This, in turn, keeps the system in a state where changes can be made quickly, easily and confidently. Faster delivery is a side effect of focusing on quality, and automated test tooling is an aid to keeping quality at the forefront of the team's mind.
Shortening the feedback loop
Agile processes encourage teams to integrate testing with implementation in order to shorten the feedback loop. Testing takes place continuously as changes are made, with testers and developers working closely together, supported by automated testing.
The most useful goal for test automation isn't to make a test phase run faster, but to make testing and fixing activities a core part of the workflow. As someone works on changes to the system, whether to application code or infrastructure definitions, they are continuously testing. People test so they can fix each problem as it is discovered, while they're still working on their changes and everything is fresh in their mind. When the scope of changes is very small, problems are quick to find and easy to fix.
Automating tests for fast feedback
Teams whose testing process is based around separate implementation and test phases often attempt to adopt automation by automating their test phase. This is often a project owned by the QA team, which aims to create a comprehensive regression test suite. In my experience, automated test suites built by a separate testing team tend to focus on high-level testing, which often results in an unbalanced test suite.
The key to designing and implementing a well-balanced automated test suite is for the entire team, especially the implementers, to be involved in its planning, design and implementation. Big-bang test automation initiatives often bite off more than they can chew, and struggle to keep up with ongoing development. The system is a constantly moving target: before the massive test suite is complete, the system has changed and shifted multiple times. Even if the test suite is ever completed, the system changes again immediately, so tests are constantly broken and the nirvana of a complete test suite is never reached.
It is rarely effective to aim for a complete, finished test suite; the goal of an automation initiative should be to embed the habit of continuously writing tests as part of routine changes and implementation work. The outcome of an automated testing initiative is not a completed test suite, but a set of working habits and routines. When a team has successfully adopted automated testing, tests are written or updated whenever a change is made to the system, CI and CD pipelines run the relevant tests for every change, and the team responds immediately by fixing failing tests.
Organically building a test suite
The best way to embed these kinds of testing habits is to write tests for each new change as it comes up. When a bug is found, write a test that exposes the bug, then fix it. When a new feature or capability is needed, implement tests as you go, possibly even using TDD (test-driven development). Building the test suite organically as part of making routine changes forces everyone to learn the habits and skills of sustainable, continuous testing.
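As a minimal sketch of the bug-first habit, consider a hypothetical provisioning helper that built invalid DNS names from mixed-case inputs (the function and scenario here are invented for illustration). The test is written first, to expose the bug, and the fix makes it pass:

```python
# Hypothetical example of the "write a test that exposes the bug,
# then fix it" habit. Names and the bug itself are invented.

def build_hostname(app_name, environment):
    # Buggy original returned f"{app_name}-{environment}", which
    # produced invalid DNS names for mixed-case inputs.
    # Fixed version normalises to lowercase.
    return f"{app_name}-{environment}".lower()

def test_hostname_is_lowercase():
    # Written while the bug was fresh: this failed against the buggy
    # implementation, and passes after the fix. It now guards against
    # the bug ever being reintroduced.
    assert build_hostname("Billing", "Prod") == "billing-prod"

test_hostname_is_lowercase()
```

Each bug fixed this way leaves behind a regression test, so the suite grows exactly where the system has actually broken.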
The outcome to aim for is not a "finished" test suite but the routine of testing each change; a comprehensive test suite will emerge from this approach. Usefully, the suite that emerges will be focused on the areas of the system that most urgently need tests: the ones that change or break the most.
Implementing automated infrastructure testing
There is a variety of tooling available for automated infrastructure testing, and in many cases tools designed for software testing can be adopted and applied directly to infrastructure. Some of these tools have been extended with infrastructure-specific functionality; Serverspec, for example, extends the RSpec Ruby-based testing tool with features for checking server configuration. It's important not to get hung up on the tooling, however; avoid choosing a tool and then basing your entire testing strategy around it.
Instead, analyse the systems and components at hand to decide how you need to approach testing them, and then find tools to carry out your approach. As with any part of your infrastructure, you should assume that you will continuously change parts of your test tooling over time.
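To make the idea concrete, here is a tool-agnostic sketch of the kind of check a server-configuration test performs, in plain Python. In practice you would more likely use a purpose-built tool such as Serverspec; the config fragment and the required settings below are illustrative assumptions, not a real policy.

```python
# A minimal, hand-rolled server configuration check: assert that a
# config file contains the hardening settings we expect. This is a
# sketch of what tools like Serverspec do in a more structured way.

def missing_settings(config_text, required_settings):
    """Return the required settings not present in the config text."""
    active_lines = {
        line.strip() for line in config_text.splitlines()
        if line.strip() and not line.strip().startswith("#")
    }
    return [s for s in required_settings if s not in active_lines]

# Illustrative sshd_config fragment (would normally be read from disk).
sshd_config = """
# Example sshd_config fragment
PermitRootLogin no
PasswordAuthentication no
"""

missing = missing_settings(
    sshd_config,
    ["PermitRootLogin no", "PasswordAuthentication no"],
)
assert missing == [], f"Missing hardening settings: {missing}"
```

A check like this, run automatically after every provisioning change, gives the fast feedback discussed above without committing the team to any particular tool.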
Roles and workflow for testing
Infrastructure teams tend to find testing a challenge. The typical systems administrator's QA process is: 1) make a change, 2) do some ad hoc testing (if there's time), 3) keep an eye on it for a while afterwards. On the flip side, some testers don't understand infrastructure very well, so most testing in IT operations tends to happen at a fairly high level. One of the big wins of agile software development is breaking down the silos between developers and testers: rather than making quality the responsibility of a separate team, developers and testers share ownership. Similarly, rather than allocating a large block of time to test the system when it is almost done, agile teams begin testing when they start coding.
There is still some disagreement over what the role of a QA (Quality Analyst) or tester should entail, even within an agile team. Some teams decide that, since developers write their own automated tests, there is no need for a separate role. Personally, I find that even in a highly functioning team, QAs bring a valuable perspective and level of expertise for discovering the gaps and holes in what I build.
Automated testing is arguably the most challenging aspect of infrastructure as code, whilst also being the most important for supporting a reliable and adaptable infrastructure.
Teams should build the aforementioned habits and processes to routinely incorporate testing as a core part of their infrastructure, but should recognise that this will require the highest degree of openness to change.
Kief Morris, ThoughtWorker and author of Infrastructure as Code
Image Credit: watcharakun /Shutterstock