Performance Test Documents

After reading Andrew’s article on test plans, I started thinking about my own experiences with writing test documents. In this article I’ll describe the different types of performance test documents my team creates.

Test Strategy Documents

The first type is high-level test strategy documents. One of my team’s responsibilities is to provide guidance on performance tools, techniques, infrastructure, and goals. We document this guidance and share it with our peer feature teams. These teams use the guidance when planning out specific test cases for their features.

Strategy documents provide value in a number of ways. They help my team gain clarity on our overall approach, they act as documentation that feature teams can use to learn about performance testing, and they help us obtain sign-off from stakeholders.

I wrote one strategy document this year and it provided all of the above. I still occasionally refer to it when I’m asked about the performance testing strategy for our organization.

Test Plan Documents

Besides creating strategy documents, we also write more traditional test plan documents. These documents define a set of tests that we intend to execute for a project milestone. They include details of the tests, features that will be covered, the expected performance goals, and the hardware and infrastructure that we will use.

Similar to strategy documents, test plans help us gain clarity on a project and act as a springboard for stakeholder review. They seem to have a shorter shelf life though — I don’t find myself reviewing old performance test plans. My approach has been to “write it, review it, and then forget about it.”

Interestingly enough, I do find myself reviewing old test plans authored by other teams. Occasionally we need to write performance tests for a feature that we’re unfamiliar with. The first thing I do is review old functional test plans to understand how the feature works and what the feature team thought were the most important test scenarios. These test plans are invaluable in getting us ramped up quickly.

Result Reports

When my team completes a milestone I like to write a report that details the results and conclusions of the performance testing. These reports contain performance results, information about bugs found, general observations, and anything else I think might be useful to document. I send the final report to stakeholders to help them understand the results of testing.

One thing I really like about these reports is that they help me figure out which types of testing provided the most value and where we can improve. When I start planning a new milestone, I first go through the old reports to get ideas.

Wrapping Up

Documentation isn’t always fun but I do find that it provides value for me, my team, and the organization. I’d like to pose a question to readers — what types of test documents do you create, and how do they provide value?

Thanks for reading!

– Rob

3 Responses

  1. Your approach to test documentation is sound. I really like the formal “Test Strategy” document approach. Typically I include a scaled-down version of the test strategy in the test plan, which is more or less the end result. Your approach would allow for more elaboration. Kudos!

    Early in my career I thought testing was the most important endeavor, the only thing that mattered, and that reporting was a barely tolerable necessary evil. Now that I have 30+ years under my belt, I realize testing is nothing without reporting the results. Managers need information early, and often incrementally, to adjust and make key decisions during the project’s lifecycle. In the last 10 years I’ve used tools that report test progress, defects, etc. in real time. This allows me to manage many resources, both local and offshore, keep hundreds of tests in play at the same time, and shift resources and test load as often as needed.
    Depending on project complexity, setting mini milestones may be helpful for gauging incremental progress, and it can also help when a decision is made to drop a feature.

    Result reports are great. Your format seems to contain a wealth of important information, but I have found that this level of detail is hardly read by senior management. I have adopted a Requirements Coverage Report as an excellent way to summarize the 10 or 20 most important “need to know” areas for senior management to actually read, digest, and use to make key go/no-go launch decisions. Usually they aren’t interested in all of the detailed data; they just want to know what requirements were tested, which ones weren’t (and why not), major defects, and so on. If they need more data, it is available in other reports, documents, tools, and formats that can be called up as required. As soon as I adopted this approach, the endless debates and discussions typical of engineering during test report meetings were greatly minimized, so the team could concentrate on the most important data.

    A separate “Lessons Learned” doc is also helpful: a candid look at the team, what worked and what didn’t, observations, and recommendations. Just as you review past documentation before starting another project, the team should review lessons learned, then factor that into the test strategy doc and ultimately into the test plan. In fact, I try to make it a project prerequisite: before a project can begin, the lessons-learned doc from the previous project(s) must be reviewed first.

    Great site / article, I’m glad I came across it.

  2. Thanks Paul! I appreciate the comments.

    I still struggle with reporting. Sometimes I get so wrapped up in the actual testing phase of a milestone that I forget that people need to know what we’re doing. 🙂 And sometimes I’m just plain lazy but I force myself to do it because I know it provides value.

    I try to keep the audience in mind when writing the result reports. I agree that a “wall of data” will be ignored by most; people are interested only in the highlights.

    Lessons learned are definitely a critical thing to document. I didn’t highlight it in this article, but I try to provide opportunities for improvement as a section of the result report. On larger features or projects we usually have a post-mortem meeting across the disciplines (Dev/Test/PM/Ops) and someone creates a separate post-mortem document. If I remember correctly, I also provided some lessons learned in the introduction of a test plan document, to help set context around strategy. I think it’s key to learn from prior successes and failures.

