Are Test Plans a Waste of Time?

[Image: Acceptance Test Plan Template, Test Methodology. Image by IvanWalsh.com via Flickr]

If you’re like many testers I know, you hate test plans. Most testers fall into three groups: Group A doesn’t like writing test plans, Group B thinks test plans are a waste of time, and Group C thinks groups A and B are both right.

Let’s start with Group A. Some testers don’t like writing test plans because they simply don’t enjoy writing. Their dislike has less to do with what they are writing than with the fact that they are writing. Perhaps writing does not come naturally to them. One tester recently told me, “I write like a third grader.” It’s no surprise he doesn’t like writing test plans–we like to do things we’re good at.

Some testers don’t enjoy writing test plans because they are not really testers at all–they are developers working in Test. They would much rather have someone else analyze the functional specifications, write the test plan, and tell them exactly what they need to automate.

Like many testers, I came to testing from development–all of my previous jobs had been development roles. I thought writing a test plan would be easy–until I saw my first test plan template. Performance testing? Stress testing? Long-haul testing? I quickly learned that I had no idea how to write a test plan, and I dreaded writing one. But I started reading plans from testers I respected, and I started reading books on testing. I slowly made the transformation from developer to tester. Now I enjoy writing test plans because I know it makes my testing better.

Perhaps the most interesting reason testers dislike test plans is because they don’t think the plans are useful. They point to the fact that most test plans are never updated. As the product evolves, test plans become out of sync with the current functionality. I’ve found this to be true. Of all the test plans I’ve written in my career, I know exactly how many I updated after the product changed: zero.

This is a problem for two reasons. First, it would be one thing if test plans were quick and easy to write; they are not. Depending on the feature, it can take me a week to write a solid detailed test plan. Some would argue this time could be better spent automating test cases or performing exploratory testing.

Even worse, test plans that are out of sync with product functionality give inaccurate information to the reader. Recently, I worked on a product that was being updated for its next release, and I was assigned to test a feature I was completely unfamiliar with. The first thing I did was review the original test plan to learn how the feature worked and how it was first tested. I assumed, incorrectly, that the document was up to date. As a result, I reported bugs even though the feature was working properly.

James Whittaker, Test Director at Google, recently debated the value of creating test plans on the Google Testing blog:

As to whether it is worth doing, well, that is another story entirely. Every time I look at any of the dozens of test plans my teams have written, I see dead test plans. Plans written, reviewed, referred to a few times and then cast aside as the project moves in directions not documented in the plan. This begs the question: if a plan isn’t worth bothering to update, is it worth creating in the first place?

Other times a plan is discarded because it went into too much detail or too little; still others because it provided value only in starting a test effort and not in the ongoing work. Again, if this is the case, was the plan worth the cost of creating it given its limited and diminishing value?

Some test plans document simple truths that likely didn’t really need documenting at all or provide detailed information that isn’t relevant to the day to day job of a software tester. In all these cases we are wasting effort.

I agree that there may be some wasted effort in writing a test plan. For instance, I’m guilty of spending too much time tweaking the wording of my documents. It’s a test plan, not a novel. Bullet points and sentence fragments are fine–you shouldn’t be spending time using a thesaurus to find synonyms for the word “feature”.

But that doesn’t mean it’s all wasted effort. In fact, I believe the benefit far outweighs the effort. This is true even if the test plan quickly becomes obsolete.

Consider feature specification documents: much like test plans, feature specifications often become outdated as the product evolves. This doesn’t mean we shouldn’t write feature specifications. Potential document “staleness” is not a valid argument against writing spec documents–or test plans. Just don’t make the mistake of assuming old specifications are still accurate.

One of the most important reasons for creating a test plan is to get your thoughts about testing the feature onto paper and out of your head. This unclutters your mind and helps you think clearly. It also documents every idea you’ve had about testing the feature, so none are forgotten.

The writing process often leads you to think of both more and better ways to test the feature. Your first pass at writing a test plan may include mostly positive functional test cases, as well as a handful of negative functional tests. But the process of refining your document leads you to consider more negative tests, more boundary tests, more concerns around scalability and security. The more time you spend planning your testing, the more complete your testing will be.
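To make that refinement concrete, here is a small sketch in Python. The validator, its 3–20 character rule, and every test case are invented for illustration–a first pass covers the happy path, and the later planning passes add the negative and boundary cases:

```python
# Hypothetical example: a first pass of positive tests for an invented
# validate_username(), followed by the negative and boundary cases that
# a refinement pass over the test plan tends to surface.
import re

def validate_username(name):
    """Return True if name is 3-20 alphanumeric characters (invented rule)."""
    return bool(re.fullmatch(r"[A-Za-z0-9]{3,20}", name))

# First pass: mostly positive functional cases.
assert validate_username("alice")
assert validate_username("Bob42")

# Refinement pass: the negative and boundary cases planning uncovers.
assert not validate_username("")           # empty input
assert not validate_username("ab")         # just below the lower bound
assert validate_username("abc")            # exactly at the lower bound
assert validate_username("a" * 20)         # exactly at the upper bound
assert not validate_username("a" * 21)     # just above the upper bound
assert not validate_username("bad name!")  # illegal characters

print("all cases pass")
```

The point is not the validator itself but the shape of the list: the boundary and negative cases outnumber the happy-path cases, and most of them only show up once you sit down and plan.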

Detailed test plans can also help you find bugs in the feature before the code has even been implemented, when they are much less costly to address. Two techniques that are excellent for finding design holes during the test planning phase are Decision Tables and State Transition Diagrams. I remember creating a large Decision Table as part of a test plan for a security feature which uncovered that nearly 10% of the possible input combinations didn’t have an expected result in the design specification document.
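The coverage check a Decision Table enables can be sketched in a few lines. Everything below–the condition names and the deliberately partial spec–is invented for illustration; the idea is simply to enumerate every input combination and flag the ones the design document never assigns an expected result:

```python
# Hypothetical sketch: use a decision table to find input combinations
# that lack an expected result in the spec. Conditions and the partial
# spec mapping are invented for illustration.
from itertools import product

conditions = ["authenticated", "admin_role", "resource_locked"]

# Partial spec: (authenticated, admin_role, resource_locked) -> expected result.
spec = {
    (True,  True,  False): "allow",
    (True,  False, False): "allow_read_only",
    (False, False, False): "deny",
    (False, False, True):  "deny",
}

# Enumerate every combination and flag the ones the spec leaves undefined.
all_combos = list(product([True, False], repeat=len(conditions)))
missing = [c for c in all_combos if c not in spec]

print(f"{len(missing)} of {len(all_combos)} combinations lack an expected result")
for combo in missing:
    print(dict(zip(conditions, combo)))
# → 4 of 8 combinations lack an expected result
```

With three boolean conditions there are only eight rows, but the table scales the same way for larger features, which is exactly how gaps like the ~10% of undefined combinations mentioned above get caught before any code exists.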

Test documents are also valuable for conducting test plan reviews. After creating your test plan, it’s important to have it reviewed by other testers–no single tester will ever think of all possible test cases. It’s also valuable to have your plan reviewed by both the developers and program managers. In my most recent test plan review, a program manager told me that one of the features I planned on testing had been cut. In the same review, a developer informed me about existing test hooks that saved me hours of development time.

When testers say they don’t want to write a test plan, I can sympathize. Most of us got into this business either because we like to program or because we like to break things, not because we like to write documents. But when testers say they don’t want to write a test plan because it’s not valuable, I have to disagree. Good test plans make your product better. So what if they become obsolete? As both Dwight D. Eisenhower and Wile E. Coyote once said, “plans are useless, but planning is indispensable.”


14 Responses

  1. I’ve certainly seen test plans that were a waste of time. I’ve also seen some that were very valuable. Maybe a better title would be “tell about some test plans that were not valuable”. Andrew does mention some good examples of test plan values and flaws.

    Usually bad test plans are bad because they don’t help the people they should. A good plan should start with an understanding of who the plan is for and what they will do with it. There should also be a clear understanding of how long the plan will be relevant.

    The idea of test plans does seem to be one that causes emotional reactions in testers. Most people seem to have strong opinions. And yet, there are many different types of test plans that can be very different.

    I’m interested in hearing more comments.

    Ralph Case

  2. Hi, most testers I know absolutely do not fall into one of your three groups! But those testers are not developers–they are professional testers, and writing test plans is a core part of our job: we either do it correctly or we change jobs… Let everyone do his job: a developer develops and a tester tests…
    Gauthier

    • Sorry, but this is the old way of thinking about how things can be done. I’m not saying that test plans don’t give any value to anyone; I’m saying the old notion of a test plan is outdated, and in today’s “real-time” world the whole process–writing, reviewing, and so on–takes too much time.
      First of all, developers test and testers develop; if you can’t read what developers write, you are a bad tester, because you don’t know the product you are testing.
      Nowadays the test plan should mostly be replaced with flow diagrams, Decision Tables, and State Transition Diagrams. Diagrams can link to requirements. Every application we build has requirements (how we keep and manage them is another story), and requirements are input for every software engineer–which today means both developer and tester. Say the application has grown from 2 features to 20, and you have been a diligent diagrammer with a flow diagram for each of them. Now it turns out that 2 of the 20 features have been updated and need to be tested. You take the flow diagrams for those two features, plus their dependencies (related services, features, and so on), and that is your test plan–no need to write the same things over and over, since every diagram item links to a requirement, a development issue, or some other resource. And why do I like this way of doing things? It gives a quick overview of what the changes can affect, without the filler talk.

  3. Many people confuse test plans with test cases. The example pic shows the approach a test plan should take.

    I believe that it is very easy for a test plan to get out of control. A test plan should be just that: a high-level plan that outlines what we can and cannot do; identifies the key teams that will perform certain functions; and covers entrance/exit criteria, dependencies and areas of risk, resources, a basic timeline, special tools and expertise needed, and possibly a capture of the requirements and a test case list, with functional test categories for each. Depending on complexity and test case flux, it may be appropriate to refer to a separate requirements/test case matrix document or tool. Also, as a general rule, I do not do major rewrites of the test plan after the project starts. My position is: if a major change has occurred that would require another test plan, then the project as a whole should be reset, or re-scoped, to properly capture the new requirements, test plan, etc.

    Bottom line… the test plan must be AGILE. HA! That goes against agile philosophy… I digress. What I mean is the test plan must be high-level enough that it can be very easily changed as the project changes and key milestones are introduced, dropped, etc. For example, if a test plan takes a week to write, you should be able to modify it within a day or so; if you can’t, it’s too low-level. Consider breaking it up into smaller plans or taking out low-level detail that changes often. A recent example: at a small startup I was working at, I was including the test schedule as a good indicator of “the plan”. The schedule was changing so often that, after my seventh rewrite of the test plan within three months, I stopped including it. The next project resulted in only two rewrites of the test plan as the project progressed, although the schedule was in a constant state of flux… and I was getting more sleep at night.

  4. I find Gauthier’s comment interesting. I myself am a tester, not a developer, and my experience is that developers doing test cling to test plans, while ‘real testers’ know they are often a waste of time. I find most of the arguments for test plans above inconclusive, and I think the final sentence tells you why: plans are useless, but planning is indispensable. Whatever insights you get while writing the test plan, you will get much more strongly and easily while testing. Of course, you may feel that in some specific context it is helpful to take out a piece of paper and write something (e.g. a decision table), either because it helps your test effort or because there is nothing else to do, as the feature is not implemented for hands-on testing yet–but this is part of testing, not a separate ‘test plan’ that you produce independently of the context of testing.

  5. Typically, test plans are created by the QA leads and/or QA manager. Testers develop low-level test cases to test the product features as well as satisfy the test plan objectives.

    I agree that testers should be given the flexibility and the proper support framework to do what they do best… which is to test. Testers often need to chase down problems that don’t adhere to the original plan during ad hoc or exploratory testing. The best testers I’ve known are also the ones most resistant to plans.

    However, it would be unwise not to focus the efforts of the QA team with a plan of some sort, so you know your objectives… and, especially, when you are done. This is important in order to contain cost and specify minimum criteria for product launch. It doesn’t mean that you compromise the testing, but it does mean you have clear objectives. The saying “he who fails to plan is planning to fail” applies, although it’s a bit harsh in this context. As testers, we often create “our plans” on the fly by observing responses to actions, re-evaluating next actions based on new data, etc., to ferret out problems. For some companies using Agile or Scrum and the like, the plan is a living, changing objective managed daily that may, or may not, be loosely documented. In these cases the important stakeholders are creating the plan on the fly as new information is discovered… needless to say, testers thrive in these environments.

    • Reading Paul McCoy’s comments made me aware that my comments above may be influenced by the fact that I am the only tester on my team. Planning may of course be a good way of getting the necessary collaboration in a team, but I still find that in many cases a formal written test plan is a waste of time, and so should not be part of your default test process. I find that ‘knowing when you are done’ can be very dangerous. When you have done your testing, you should ask yourself ‘Am I done? Have I covered all the risks that testing the system has made me aware of?’–and not base your decision to call it a day on some list jotted down before you had the necessary insight into the risks.

      • Soren,
        I understand where you are coming from… I have 30 years of test experience and have never compromised my tests based on some test plan.
        I’m presenting the perspective that planning can help avoid misunderstandings in non-Agile/Scrum environments, where management typically wants to know at the beginning what criteria are being followed: when will the test start, when will it be done, how much will it cost, etc. They don’t understand and can’t “manage” an “it will be done when it’s done” approach (although in reality this is what actually takes place).
        With a plan (an email to your boss, etc.) you can ease management’s concerns, while at the same time protecting QA’s need and desire to ferret out problems that may take longer than planned.
        For example, criteria such as:
        • X features must be tested (i.e. no limit on the number of tests)
        • It is expected to take X time (for one release test cycle, if no major problems are found; i.e. if problems are found, you are off the hook)
        • No Urgent or Very High defects can exist in the product when it is released

        A plan is a tool. It can be very useful if done correctly, or counterproductive if not done correctly or if it’s applied in the wrong environment.

  6. I think we agree on these things. As I see it, test plans have two purposes: as a tool, to make it easier to execute the test, and as a communication medium between test and upper management–and possibly between test management and the ‘bread-testers’ in a larger test setup (and possibly also as a repository of test knowledge: communication between the same person at different times). A lot of parameters play into whether it is a good tool; some people may get more out of it than I do. When it comes to the communication side, I have experienced times when management demanded an information level and degree of control higher than what was good for the project. I do not question the right of management to make decisions, but sometimes a lot of effort is put into communicating things for management to macro-manage that, for the sake of the project, they should leave to more specialized personnel to micro-manage.

  7. I almost never leave a response; however, I did some searching and wound up here at Are Test Plans a Waste of Time? | Expert Testers. And I actually do have a couple of questions for you, if you don’t mind. Is it just me, or do some of these remarks come across as if they were written by brain-dead visitors? 😛 And, if you are writing in other places, I’d like to follow everything fresh you have to post. Could you make a list of all of your social pages, like your Facebook page, Twitter feed, or LinkedIn profile?

  8. Thank you – I think 🙂 . I write all of my original articles here on Expert Testers. You can also follow me on Facebook at https://www.facebook.com/ExpertTesters and on Twitter at https://twitter.com/experttesters.

  9. Test plans are not a waste of time…
    But test plan DOCUMENTS may be a waste of time… Planning can be done without a test plan document…

    Read an article I wrote a while ago:
    Difference between Test Plan and Test Strategy | Do we really need Test Plan documents?

    http://www.softwaretestingtimes.com/2010/04/difference-between-test-plan-and-test.html
