Performing Test Gap Analysis
This article explains how to use Test Gap Analytics to improve the efficiency of your QA/Dev teams. By guiding them to close the test gaps identified at the end of a sprint, you enable them to plan their testing activities for the next sprint.
Steps to Take Before You Start Using Test Gap Analytics
Scope the scanned code to what matters to you
"Clean" the irrelevant data - see How to Ignore Irrelevant Code
Create a report scheduler
Align with your Sprint Schedules (or schedule a Monthly Report)
Add your App & Branch to the Test Gap Analytics view - You can pick any configured app and branch and create a time frame for the Test Gap report - see Adding an application to the Test Gap Analytics view
Install the SeaLights Code Viewer Chrome Extension on your personal computer with these instructions
Test Gaps Analytics Main Use Cases
There are three main use cases for working with Test Gap Analytics, described below.
Definition of Done for Sprint Quality
Validate that the code which was added in the last sprint is tested
TGA Range - Sprint schedule
Focus - Modified Code
Monthly Quality Report
Analyze the quality performance of your apps/teams
TGA Range - Calendar Month
Focus - Modified Code
Test Development
Identify the high-risk code areas and create a test plan based on concrete data (e.g. focus on Features, Classes or Functional areas)
TGA Range - Month / Quarter
Focus - Overall Untested Code
Tutorial: Definition of Done for Sprint Quality
Go to the TGA Report by clicking on the options menu in the upper right corner of the screen, and then click on "TGA Report"
Level 1 - Application
Identify your relevant app/branch
Review the test gaps summary of your app (high level)
The columns present the number of UNTESTED METHODS per category:
Modified + Used in Production
Modified
Used in Production
Overall untested (not affected by the reporting period)
The number in each column is the total count of untested methods in that category
The percentage in each column is the share of untested methods out of all methods in that category
Note: The "Used in Production" columns are optional and are displayed only when the production agent is installed and reporting data
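The count-and-percentage semantics of the summary columns can be sketched as follows. This is an illustrative model, not the SeaLights implementation: each method is tagged with whether it was modified in the reporting period, used in production, and covered by any test, and each column counts the untested methods in its category.

```python
from dataclasses import dataclass

@dataclass
class Method:
    name: str
    modified: bool
    used_in_production: bool
    tested: bool

def tga_summary(methods):
    """Return {column name: (untested count, untested percent)}."""
    categories = {
        "Modified + Used in Production": lambda m: m.modified and m.used_in_production,
        "Modified": lambda m: m.modified,
        "Used in Production": lambda m: m.used_in_production,
        "Overall untested": lambda m: True,
    }
    summary = {}
    for name, selector in categories.items():
        selected = [m for m in methods if selector(m)]
        untested = [m for m in selected if not m.tested]
        pct = round(100 * len(untested) / len(selected), 1) if selected else 0.0
        summary[name] = (len(untested), pct)
    return summary

# Hypothetical method names for illustration only.
methods = [
    Method("login", modified=True, used_in_production=True, tested=False),
    Method("logout", modified=True, used_in_production=False, tested=True),
    Method("report", modified=False, used_in_production=True, tested=False),
    Method("legacy_util", modified=False, used_in_production=False, tested=False),
]
print(tga_summary(methods))
```

In this toy data set, the "Modified" column would show 1 untested method out of 2 (50%), while "Overall untested" counts every untested method regardless of the period.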
Adjust the reporting period
Click the refresh icon to the right of the relevant app's time range, then pick a new date range to analyze
Drill down by clicking on the app name
Level 2 - Files
Review whether test gaps fall in important areas of the code (e.g. Logic, Calculation, New Code, Modified Code, etc.) and focus on those areas
Sort and filter to drill down into the specific areas of code you should focus on:
Use the "Search" bar to focus on important classes/files (Logic, Calculation, etc…)
Use the "Test Stage" drop-down button to review the untested files/methods per specific test stage. Focus on areas which are sensitive to Integration Tests:
Priority 0 - Untested by all test stages
Priority 1 - Untested by a specific test stage, i.e. Integration/Automation/Regression Tests (the rationale is to exclude unit tests)
For a legacy app (low code-change volume), focus on the modified areas
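The Priority 0 / Priority 1 distinction above can be expressed as a small classifier. This is a hedged sketch of the rationale only (stage names and the function are illustrative, not a SeaLights API): a method is Priority 0 if no test stage covered it at all, and Priority 1 for a given stage if that stage missed it even though another stage (e.g. unit tests) covered it.

```python
def priority(covered_by, stage):
    """Classify a method's gap priority for one test stage.

    covered_by: set of test-stage names that executed the method.
    stage: the test stage currently being reviewed.
    """
    if not covered_by:
        return 0       # Priority 0: untested by all test stages
    if stage not in covered_by:
        return 1       # Priority 1: untested by this specific stage
    return None        # Tested by this stage, so not a gap for it

print(priority(set(), "Integration Tests"))           # Priority 0
print(priority({"Unit Tests"}, "Integration Tests"))  # Priority 1
print(priority({"Integration Tests"}, "Integration Tests"))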
Level 3 - Methods
Drill down into the relevant file by clicking the file name in the TGA report
Click the SCM link for the file you want to explore further
You will be taken to your code repository, directly to the specified file in your SCM.
You can see the TGA insights in line with your code using the SeaLights "Code Viewer", part of the SeaLights Chrome Extension Suite (see the section Steps to Take Before You Start Using Test Gap Analytics)
SCMs supported by the SeaLights Code Viewer: GitHub, Bitbucket, GitLab, TFS/TFVC
Use the test stage drop-down button to focus on a specific test stage
e.g. to focus on areas not tested by Manual Tests, pick "Manual Tests" from the drop-down
Review the file and its quality risks, and decide which of these should be addressed within the current/upcoming sprint
Control and Feedback Loop
Repeat the process in future sprints to verify that the gaps have actually been closed by your Dev/QA team, and to identify any new test gaps that have been created
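The feedback loop above amounts to comparing the untested-method sets of two consecutive sprint reports. A minimal sketch, assuming you have exported the gap lists from each report (the method identifiers below are hypothetical):

```python
def gap_delta(previous_gaps, current_gaps):
    """Compare two sets of untested-method identifiers."""
    return {
        "closed": sorted(previous_gaps - current_gaps),      # fixed since last sprint
        "new": sorted(current_gaps - previous_gaps),         # introduced this sprint
        "still_open": sorted(previous_gaps & current_gaps),  # carried over
    }

sprint_12 = {"OrderService.cancel", "OrderService.refund", "Auth.login"}
sprint_13 = {"OrderService.refund", "Billing.invoice"}
print(gap_delta(sprint_12, sprint_13))
```

Gaps in "still_open" for several sprints in a row are good candidates for the next sprint's test plan.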
Create a monthly report which includes the relevant sprint
References