The first time testers can impact a feature is often during functional and design reviews. This is also when we make our first impression on our co-workers. If you want to make a great initial impact on both your product and peers, you have to be an effective reviewer.
In my seven years in testing, I’ve noticed that many testers don’t take advantage of this opportunity. Most of us fall into one of four categories:
- Testers who pay attention during reviews without providing feedback. This used to be me. I learned the feature, which is one of the goals of a review meeting. A more important goal, however, is to give feedback that exposes defects as early as possible.
- Testers who push back (argue) on seemingly every minor point. Their goal is to increase their visibility and prove their worth as much as it is to improve the product. They learn the feature and can give valuable feedback. However, while they think they’re impressing their teammates, they’re actually frustrating them.
- Testers who attend reviews with their laptops open, not paying attention. If this is you, please stop; no one’s impressed with how busy you’re pretending to be.
- Testers who pay attention and learn the feature, while also providing constructive feedback. Not only do they understand and improve the feature, but they look good doing it. This can be you!
How do you do this, you ask? With a simple recipe that took me only four years to learn, I answer.
1. Read the Spec
Before attending any functional or design review, make sure you read the documentation. This is common sense, but most of us are so busy we don’t have time to read the specs. Instead, we use the review itself to learn the feature.
This was always a problem for me because although I learned the feature during the review, I didn’t have enough time to absorb the details and give valuable feedback during the meeting. It was only afterwards that I understood the changes we needed to make. By then it was too late: decisions had already been made and it was hard to change people’s minds. Or coding had begun, and accepting changes meant wasted development time.
A great idea Bruce Cronquist suggested is to block out the half hour before the review meeting to read the spec. Put this time on your calendar to make sure you don’t get interrupted.
2. Commit to Contributing
Come to every review with the goal of contributing at least one idea. Once I committed to this, I immediately made a bigger impact on both my product and peers. This strategy works for two reasons.
First, it forces you to pay closer attention than you otherwise might. Knowing you’ll be speaking during the meeting keeps you focused.
Second, it forces you to speak up about ideas you might otherwise have kept to yourself. I used to keep quiet in reviews if I wasn’t 100% sure I was right. Even if I was almost positive, I would still investigate further after the meeting. The result was that someone else would often mention the idea first.
3. Have an Agenda
It’s easy to say you’ll give a good idea during every review, but how can you make sure you’ll always have a good idea to give? For me, the answer was a simple checklist.
The first review checklist I made was to make sure features are testable. Not only are testers uniquely qualified to enforce testability, but if we don’t do it, no one will. Bringing up testability concerns as early as possible will also make your job of testing the feature later on much easier. My worksheet listed the key tenets of testability, had a checklist of items for each tenet, and room for notes.
At the time, I thought the concept of a review checklist was revolutionary. So much so, in fact, that I emailed Alan Page about it no less than five times. I’m now sure Alan must have thought I was some kind of stalker or mental patient. However, he was very encouraging and was kind enough to give the checklist a nice review on Toolbox, a Microsoft internal engineering website. If you work at Microsoft, you can download my testability checklist here.
I now know that not only are checklists the exact opposite of revolutionary, but there are plenty of other qualities to look for than just testability.
Test is the one discipline that knows about most (or all) of the product features. It’s easy for us to find and identify inconsistencies between specs, such as when one PM says the product should do X, while another PM says it should do Y. It’s also our job to be a customer advocate. And we need to enforce software qualities such as performance, security, and usability. So I decided to expand my checklist.
My new checklist includes 50 attributes to look for in functional and design reviews. It’s in Excel format, so you can easily filter the items based on Review Type (Feature, Design, or Test) and Subtype (Testability, Usability, Performance, Security, etc.).
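If you prefer keeping a checklist in code rather than a spreadsheet, the same filter-by-type idea is easy to sketch. The snippet below is a minimal, hypothetical Python version: the field names and item text are illustrative, not taken from the actual Excel checklist.

```python
# Hypothetical sketch of a filterable review checklist, mirroring the
# Review Type / Subtype columns described above. All item text is made up
# for illustration; substitute your own checklist items.
checklist = [
    {"type": "Feature", "subtype": "Testability", "item": "Can each requirement be verified in isolation?"},
    {"type": "Design",  "subtype": "Performance", "item": "Are response-time goals stated for key operations?"},
    {"type": "Feature", "subtype": "Usability",   "item": "Are error messages actionable for the customer?"},
    {"type": "Design",  "subtype": "Security",    "item": "Is untrusted input validated at every boundary?"},
]

def filter_items(items, review_type=None, subtype=None):
    """Return items matching the given Review Type and/or Subtype."""
    return [i for i in items
            if (review_type is None or i["type"] == review_type)
            and (subtype is None or i["subtype"] == subtype)]

# Before a feature review, pull only the Feature-review questions:
for entry in filter_items(checklist, review_type="Feature"):
    print(f'[{entry["subtype"]}] {entry["item"]}')
```

The point is the same as with the spreadsheet: walk into the review with only the questions relevant to that review type in front of you.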
If there are any other items you would like added to the checklist, please list them in the comments section below. Enjoy!