Test Automation Reporting To Test And Project Management


Imagine that you live in a perfect test automation world. You have automated all of the desired test cases. Your solution runs smoothly with every single build. You find issues with your Subject Under Test (SUT) but not with your test case design. You don't have any flaky tests. Everything runs smoothly, reliably and repeatably. If that is the case, then please leave a comment below and share how you managed to achieve all of that 🤩.

Even if your complete test automation solution works flawlessly, it means nothing if you cannot present the results in a meaningful way. Sending emails with statistics, Excel files or XML files is not an efficient way to communicate the current state of your test automation efforts. You need to let both Project and Test Management know what that state is.

Keeping everyone up to date

It is extremely important that you provide transparency and a constant flow of information to the people in management positions. Why? Because only with all of the data at hand can they take a pro-active approach and react to possible delays, problems and challenges. At a minimum you should provide the following data to the managers:
1. Progress of Test Automation – how many of the test cases selected for automation are finished
2. Execution Results – Passed/Failed/Aborted/Canceled/Error for a particular configuration set. That means:
– you need to provide an overview of the executed test cases for a particular SW Version
– you need to provide an overview of the executed test cases for a particular Test Environment
What should this look like? Well, you need to make sure that everybody understands the data you are trying to show and, naturally, that everyone involved can access it.

Showing that you have executed 200/300 test cases doesn't really help if you don't know the context. However, stating that you executed 200 test cases on SW Version 1.0.2.13 on Windows 10 IoT 10.0.17134.648, out of which 120 passed and 80 failed, is already more helpful.
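To make this concrete, here is a minimal sketch of what such a report record could carry. The `TestRunReport` class and its field names (`sw_version`, `environment`, `results`) are hypothetical and not tied to any particular tool; the numbers simply repeat the example above.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class TestRunReport:
    """Minimal summary of one automated test run, for management reporting."""
    sw_version: str    # SUT build under test, e.g. "1.0.2.13"
    environment: str   # test environment, e.g. "Windows 10 IoT 10.0.17134.648"
    automated: int     # number of test cases automated and executed
    planned: int       # number of test cases planned for automation
    results: Dict[str, int] = field(default_factory=dict)  # status -> count

    @property
    def automation_progress(self) -> float:
        """Share of the planned test cases that are already automated."""
        return self.automated / self.planned if self.planned else 0.0

# Illustrative numbers taken from the example above.
report = TestRunReport(
    sw_version="1.0.2.13",
    environment="Windows 10 IoT 10.0.17134.648",
    automated=200,
    planned=300,
    results={"Passed": 120, "Failed": 80},
)
print(f"{report.automation_progress:.0%} automated | "
      f"{report.results.get('Passed', 0)} passed / {report.results.get('Failed', 0)} failed "
      f"on SW {report.sw_version} ({report.environment})")
```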

If you want to see how the data is presented with TestResults.io check out our documentation or schedule a short demo meeting.

Again, context is the key here. If, for example, 100 of the 200 test cases you executed only cover functions that are rarely used, the raw numbers tell the managers nothing.
Grouping the results into meaningful sets will let your managers know what the current condition of the software is. Let's look at a few examples of how this could be done.
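As a rough illustration, the grouping can be as simple as the following sketch. The module names and the flat (module, status) result list are made up for the example; any test runner's output could be mapped into this shape.

```python
from collections import defaultdict

# Hypothetical flat result list, as it might be extracted from any test
# runner's report: one (module, status) pair per executed test case.
raw_results = [
    ("Login", "Passed"), ("Login", "Failed"),
    ("Reporting", "Passed"), ("Reporting", "Passed"),
    ("Export", "Aborted"),
]

# Group by module so the report shows which part of the SUT is (not) working,
# instead of a single undifferentiated pass/fail total.
by_module = defaultdict(lambda: defaultdict(int))
for module, status in raw_results:
    by_module[module][status] += 1

for module, counts in sorted(by_module.items()):
    print(f"{module}: {dict(counts)}")
```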

[Figure: the information about the current SW state is meaningful to both Project and Test Management]
[Figure: a breakdown into different sets (modules being tested) provides context on what is and is not working]

You can see that the more information you provide, the easier it is to determine the current state (quality) of your software.

Providing even more context

Both graphs show information which is definitely readable by Project Management; however, based on those alone she/he cannot tell how meaningful the failed or aborted test cases are. It might be that the functionality behind the failed test cases is essential to the user, or is the most commonly used functionality in the SUT. To make sure that your managers make the right call, you should group the functionalities into meaningful sets. Instead of breaking down the sets to show only the tested functionality, you can group them even further so that they provide more context. Following our example, let's try to divide the sets based on the importance of the tested functionality.
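One possible way to do that, continuing the sketch from above, is to roll the module-level results up into importance tiers. The CRITICALITY mapping and the tier names below are assumptions for illustration, not part of any real project setup.

```python
# Hypothetical mapping of tested modules to how important they are for the release.
CRITICALITY = {
    "Login": "Critical",
    "Reporting": "Important",
    "Export": "Nice-to-have",
}

def summarize_by_criticality(results):
    """Roll (module, status) results up into importance tiers."""
    summary = {}
    for module, status in results:
        tier = CRITICALITY.get(module, "Unclassified")
        summary.setdefault(tier, {}).setdefault(status, 0)
        summary[tier][status] += 1
    return summary

# Same illustrative result list as in the previous sketch.
raw_results = [
    ("Login", "Passed"), ("Login", "Failed"),
    ("Reporting", "Passed"), ("Reporting", "Passed"),
    ("Export", "Aborted"),
]
print(summarize_by_criticality(raw_results))
# A "Failed" entry in the "Critical" tier is the one that demands immediate action.
```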

You can already see that this gives a better indication of where you currently stand with your software quality. In our example, the Critical functionality, which is essential for your release, has a failed test case. This requires your immediate action.

It is no surprise that you need to present data to your managers; how you present it is the key factor. On top of that, make sure that every relevant stakeholder can view (and understand) your testing efforts. It will save you time and a lot of unnecessary discussions.
