I recently stumbled upon the following tweet by Ben Dowen (a.k.a. The Full Snack Tester, the person behind the Tester Of The Day website):

Usually, I don't pay much attention to informal Twitter polls with small sample sizes since the results aren't the most trustworthy. But this poll about test plans surprised me a bit. Out of a little over 100 people who voted, over 70% said test plans are either essential or useful, while barely over 15% said they're a waste of time.

These results surprised me because, in my experience, it's always been the opposite. Most of the places I've worked and most of the people I've interacted with professionally would have said that test plans are a complete waste of time. Maybe I run in different circles than most testers because of my everyday work. Most of my career has focused on software engineering and coding, so I admittedly haven't been deeply involved in creating or using test plans at work.

Almost none of the organizations I've worked for has ever used a formal test plan for their projects. Based on many of the tweet replies, it seems most testers don't use one either. It raised a question in my mind: do teams really need a test plan?

The problems with test plans

On the surface, test plans sound like an excellent idea. For instance, here's the definition of a test plan taken from Wikipedia:

A test plan is a document detailing the objectives, resources, and processes for a specific test for a software or hardware product. The plan typically contains a detailed understanding of the eventual workflow.

This definition covers what most testers and QA departments would want for their organization. Everyone wants a single source of truth documenting the project's goals and the necessary steps to get there. We all want a comprehensive understanding of the work that needs to happen to get results, and the sense that we're all on the same page.

Unfortunately, the real world doesn't work that way. Despite our best intentions, most test plans are a burden on the team at best and an utter waste of effort and resources at worst. Of course, plenty of organizations are successful with their test plans, but I'm willing to bet they're the exception, not the rule. Creating a test plan requires lots of work and long-term thinking to keep it useful throughout the project's lifespan.

Possibly the biggest issue with test plans is how easily they fall out of date with the project - sometimes as soon as the team finishes creating them. It's especially evident in agile teams, where the project shifts and changes every week. Unless you have someone on staff who can spend most of their day keeping the test plan updated, you'll end up with documents that don't reflect the reality of the project.

Another problem with test plans is that most organizations underestimate how much effort is required to build and maintain one. The responsibility for a test plan shouldn't fall on a single person, meaning that creating one pulls several people away from other high-value tasks. Meetings alone can eat up countless hours of work. Requiring others to sign off on the plan consumes an incredible amount of energy. Depending on your team's size, creating a test plan can turn into a multi-week process.

The worst mistake any team can make with test plans is creating them solely to appease management. You'd be surprised at how many QA managers and other higher-ups request these kinds of documents to give themselves a feeling of control. It doesn't matter that it takes valuable time and focus away from the team. Worse yet, these managers use the plans as a crutch for managing people and estimating timelines - often because they don't pay close attention to the work their team does daily. Since test plans become outdated quickly, using them to manage and estimate can steer everyone in the wrong direction.

Who could benefit from a test plan?

The issues mentioned above - and countless others you might come up with - don't mean test plans are utterly useless. Test plans still have their place in some work environments.

Some industries require test plans because they're heavily regulated and high-risk. Some fields that come to mind are aerospace organizations, pharmaceutical companies, and nuclear power plants. It's understandable why these industries would require test plans - I'm sure you don't want to fly on a new plane with half-tested software or take new medicine that hasn't been studied extensively. In these scenarios, test plans provide accountability for their business and traceability in case something goes wrong.

Another area where test plans can work well is in large organizations with thousands of employees, different departments, and dozens of projects. In these places, a test plan can serve multiple purposes. It can become a way to sync the testing efforts of different groups in a consistent manner. A good test plan can also help spread information and knowledge across teams by documenting best practices and highlighting what's working well in similar projects.

Besides specific work environments, some projects can benefit from test plans, particularly complex ones with many moving parts. For example, hardware devices often ship with a software component as part of the package, and testers need to ensure both elements work well together. Another example is a large web application spanning multiple microservices that have to work in conjunction. A test plan helps keep track of these complicated testing scenarios.

Who shouldn't bother with a test plan?

If your project or organization doesn't fall into any of the previously mentioned scenarios - working in a regulated industry or a large organization, or dealing with complex projects - you likely don't need a formal test plan. You can spend the time and effort to create one if you want, but your resources are better used elsewhere.

For small teams and startups, I would argue that test plans are entirely unnecessary. With just a few employees or groups, communication barriers are low, so it's easy to discuss how everyone can pitch in with testing efforts. Your project's scope likely isn't large enough to require a formal plan, either. In an agile environment, a test plan can also hinder the team's velocity.

Medium-sized teams might be more inclined to have a test plan since there are more people to manage and a more complex workflow than in a smaller organization. But as I mentioned earlier, most people severely underestimate the effort needed to get a functional test plan up and running, not to mention maintaining it for the long haul. In these cases, the time required to create and keep a reliable plan isn't worth it.

You still need to keep track of your testing

Even with all the issues around test plans, it doesn't mean you should have no plan whatsoever. All testers need to communicate the work that needs to get done to ensure a successful outcome. Otherwise, the team's efforts will be misaligned with the bigger picture, and your organization will waste a lot of time on a low-quality product.

My recommendation for most small or medium teams with simpler projects is to keep it casual and informal. Instead of investing time and money in test case management tools like TestRail or Zephyr, use a spreadsheet or checklist shared among the team. These tools don't involve a lot of effort to use, making it dead-simple for anyone to contribute to creating and maintaining them. It doesn't sound sexy, but it's incredibly useful.
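To make it concrete, here's a rough sketch of what such a shared checklist might look like. The areas, scenarios, and names below are made up purely for illustration, and the exact columns don't matter - the point is that anyone on the team can read and update it in seconds:

```
Release testing checklist (example - all entries are hypothetical)

Area      | Scenario                              | Owner | Status
----------|---------------------------------------|-------|------------
Login     | Sign in with valid credentials        | Dana  | Done
Login     | Password reset email arrives          | Dana  | In progress
Checkout  | Declined payment shows a clear error  | Marco | Not started
Search    | Results paginate past 100 items       | open  | Not started
```

Whether this lives in a shared spreadsheet, a wiki page, or a plain text file in the repository is up to the team - the format matters far less than keeping it visible and easy to update.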

Whenever I recommend this approach, one of the main arguments I hear is that these tools don't provide reporting or traceability. If your team requires fancy graphs or needs to see exactly who tested what, these tools won't work, and you should seek something more suitable for your needs. However, something most teams aren't willing to admit is that they don't need reporting or traceability when testing their product.

You've likely worked someplace where management wanted you to compile a report that no one really wanted. For example, I was once asked to build a dashboard to aggregate test automation data from the organization's different projects (test coverage, success rate, and so on). A few months after I released the dashboard to the team, I checked the site's analytics and discovered only three people had visited it after the first day. Eventually, the project became an abandoned wasteland.

The same happens with test plans and the data that comes from them. Some managers want a test plan but never use it for anything other than their peace of mind. The main takeaway from this article is not to create a formal test plan just for the sake of having one. If all you want is shared alignment on what to test, a simple checklist or spreadsheet has you covered. Anything else is overkill.

Summary

Test plans are one of those things that most testers and QA departments talk about needing to work effectively. However, in most cases, there's not much use for a team or organization to spend days, weeks, or months coming up with a test plan. While test plans sound good on paper, the reality is that they're a burden to create and maintain.

Test plans become outdated exceptionally quickly and require a tremendous amount of effort to keep up to date. They also take up much more time than expected. It's not uncommon for multiple people to spend hours in meetings coming up with their organization's test plans. Worse yet, some managers want their team to come up with a test plan solely to have one, without using it to guide anyone's work.

Some teams do need to spend the time required to put a reliable test plan in place. Highly regulated industries need accountability for their testing protocols and traceability of who tested what. Large organizations can benefit from the shared knowledge that a formal test plan provides. Test plans also help keep track of testing projects with lots of moving pieces.

However, most organizations and projects don't fall into any of the previously mentioned categories and likely don't need a formal test plan in place. Small teams commonly have smaller projects and tighter communication, making it easy to discuss what needs testing. Medium-sized teams also won't benefit from spending time on test plans, since the effort will likely outweigh the benefits for them.

In most cases, informal ways to track your testing will work well, like keeping a spreadsheet or checklist of what you test. If you argue that these simple tools can't help with reporting or traceability, take some time to consider whether your organization truly needs those things. Don't be surprised if you discover that you don't really need reports or a record of who tested what on your project.

Regardless of your organization or project, keep in mind that the goal for most businesses is to deliver high-quality products fast. A test plan can push you towards that ultimate goal, or it can drive you farther away from it. Think about your scenario, and choose wisely.

Have test plans helped you and your team, or have they created more problems than they're worth? Share your stories in the comments section below!