There's an art to reporting QA metrics to stakeholders. It's a people skill. View it as an opportunity to build consensus, not a way to appear clever, confuse stakeholders, or disguise issues.
Whether you're developing and testing software for your own product or you're a digital agency testing for your client, it makes no difference: product owners want to know how the Minimum Viable Product (MVP) is shaping up, and the same goes for each major release. Today, we're looking at how you can decide which QA metrics to surface to your stakeholders, with some suggestions about the numbers they'll find most relevant.
How to decide what matters
There are a few basic questions to start with:
- Does this metric impact core client concerns like budget, timelines or requested features?
- Can one metric represent general trends in place of multiple metrics?
- Is there anything you definitely need to include, or anything that is lower priority?
Once you've done some initial triaging with these questions in mind, look at the QA metrics you have left.
First of all, don't base your decisions on emotions or ego. Just because your velocity metric shows that there are, err... opportunities to improve testing or development efficiency, that doesn't mean you should hide it. In fact, it might mean you need to have a conversation with the client about more resourcing or perhaps including some test automation. Give them the context they need to make decisions. For example:
'This epic has 25 issues/tickets with a tolerance of five and will probably need five to ten bug fixes. Our velocity is 40 issues per sprint, so we anticipate we can complete this epic in one sprint, including in-house and third-party testing. Our sprints last two weeks.'
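That's the kind of statement you should be able to back with arithmetic rather than instinct. As a minimal sketch (in Python, using only the illustrative numbers from the quote above, not real project data), the estimate works out like this:

```python
# Minimal sketch of the sprint estimate above. All figures (25 issues,
# tolerance of 5, up to 10 bug fixes, velocity of 40) come from the
# illustrative example, not a real project.
import math

epic_issues = 25          # issues/tickets in the epic
tolerance = 5             # allowance for scope creep
expected_bug_fixes = 10   # upper end of the five-to-ten estimate
velocity = 40             # issues completed per two-week sprint

total_work = epic_issues + tolerance + expected_bug_fixes
sprints_needed = math.ceil(total_work / velocity)

print(f"{total_work} issues at a velocity of {velocity} -> {sprints_needed} sprint(s)")
# 40 issues at a velocity of 40 -> 1 sprint(s)
```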
Next up: include any speed or performance metrics, plus the testing processes that may be less obvious (but that absolutely must happen). These foundational elements often need to be right before you can move on to feature requests.
Finally, try not to include anything that requires detailed technical knowledge to understand. Avoid jargon and acronyms. We're not saying be vague; just look for what you can report on in a clear and concise way. When in doubt, give context:
'We found opportunities to improve application speed by testing top wait times for retrieving data from memory, which currently stand at x.xx seconds.'
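Where does a figure like that come from? One hypothetical way to get 'top wait times' is simply to time the retrieval call repeatedly and report the slowest runs. This is only a sketch; fetch_from_cache is a stand-in for whatever call you're actually testing:

```python
# Hypothetical sketch: measure the slowest wait times for a retrieval call.
# fetch_from_cache is a placeholder for the real operation under test.
import time

def fetch_from_cache(key):
    """Stand-in for the data retrieval being measured."""
    time.sleep(0.05)

timings = []
for _ in range(100):
    start = time.perf_counter()
    fetch_from_cache("user:42")
    timings.append(time.perf_counter() - start)

top_waits = sorted(timings, reverse=True)[:5]
print("Top wait times:", ", ".join(f"{t:.2f}s" for t in top_waits))
```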
Which QA metrics to surface
Here are some QA project management metrics we suggest bringing to the table:
- Your position on the top-level roadmap, and recent or upcoming testing milestones.
- If you run an Agile testing process, your progress at a story or epic level (not tasks!).
- Cumulative flow, showing testing velocity, backlog and current activity.
- Number of bugs found, in progress and solved, highlighting blockers and high severity issues (see the sketch after this list).
- Some detail in your client's priority areas, such as performance, security, or adaptability.
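Most of these come straight out of your issue tracker. For the bug counts in particular, here's a hypothetical sketch of how you might summarise an export for a stakeholder update, assuming each issue is a simple record with a status and a severity (the sample data below is made up):

```python
# Hypothetical sketch: summarise bug counts by status and call out blockers,
# assuming issues are dicts with "status" and "severity" keys (e.g. rows
# from a CSV export of your tracker).
from collections import Counter

issues = [
    {"status": "open", "severity": "blocker"},
    {"status": "in progress", "severity": "high"},
    {"status": "in progress", "severity": "low"},
    {"status": "solved", "severity": "medium"},
]

by_status = Counter(issue["status"] for issue in issues)
call_outs = [i for i in issues if i["severity"] in ("blocker", "high")]

print(f"Found: {len(issues)}, in progress: {by_status['in progress']}, "
      f"solved: {by_status['solved']}")
print(f"Blockers / high severity to highlight: {len(call_outs)}")
```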
A final note on how to "speak stakeholder"
When communicating updates to product owners, follow three key tenets:
- Check your math. Check it again.
- Have a relationship manager run the meeting.
- Always provide context for metrics, ideally with a visual aid.
You'll notice we've assumed you will be communicating these QA metrics in a meeting. Do not be tempted to send some kind of automated data aggregation in a weekly email (looking at you, developers and managers).
The point is that you've tailored your QA metrics so they're precisely what's important to the stakeholder. So, relay that information with the care and attention it deserves, and you will reap the rewards.