
This article explains how to use Test Gap Analytics to improve your QA/Dev teams' efficiency: by guiding them to the test gaps identified at the end of a sprint, it helps them plan their testing activities for the next sprint.

  • Steps to Take Before You Start Using Test Gap Analytics:

    1. Scope the scanned code based on what matters to you (conceptually a pattern filter; see the sketch after this list and Appendix A - Cleaning and Ignoring irrelevant code)

    2. Create a report scheduler

      • Align with your Sprint Schedules (or schedule a Monthly Report)

    3. Add your App & Branch to the Test Gap Analytics view - You can pick any configured app and branch and create a time frame for the Test Gap report - see Adding an application to the Test Gap Analytics view

    4. Install the SeaLights Code Viewer Chrome Extension on your personal computer following these instructions
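
Conceptually, scoping the scanned code (step 1 above) is a pattern filter over your source tree. The actual include/exclude settings belong in your SeaLights agent configuration (see Appendix A - Cleaning and Ignoring irrelevant code); the Python sketch below only illustrates the idea, and every pattern and path in it is hypothetical.

    # Conceptual illustration of scoping - the real settings live in your
    # SeaLights agent configuration (see Appendix A). All patterns and
    # paths below are hypothetical.
    from fnmatch import fnmatch

    EXCLUDE_PATTERNS = ["*/generated/*", "*Test.java", "*/vendor/*"]

    def in_scope(path: str) -> bool:
        """True if the file should be part of the scanned code."""
        return not any(fnmatch(path, pattern) for pattern in EXCLUDE_PATTERNS)

    files = [
        "src/main/java/com/acme/OrderService.java",
        "src/main/java/com/acme/generated/ApiStub.java",
        "src/test/java/com/acme/OrderServiceTest.java",
    ]
    print([f for f in files if in_scope(f)])   # only OrderService.java survives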

  • Use Cases
    The three main use cases for working with Test Gap Analytics are listed below (and captured as data in the sketch that follows the list):

    1. Definition of Done for Sprint Quality

      • Validate that the code which was added in the last sprint is tested

        • TGA Range - Sprint schedule

        • Focus - Modified Code

    2. Monthly Quality Report

      • Analyze the quality performance of your apps/teams

        • TGA Range - Calendar Month

        • Focus - Modified Code

    3. Test Development

      • Identify the high-risk code areas and create a test plan based on concrete data (e.g. focus on Features, Classes or Functional areas)

        • TGA Range - Month / Quarter

        • Focus - Overall Untested Code
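
If you want to keep these review settings next to your reporting scripts, the mapping above can be captured as plain data. A minimal Python sketch; the dictionary keys and helper name are made up for illustration and are not part of any SeaLights API:

    # Encodes the three TGA use cases described above (illustrative only).
    TGA_USE_CASES = {
        "sprint_definition_of_done": {"range": "Sprint schedule", "focus": "Modified Code"},
        "monthly_quality_report":    {"range": "Calendar Month",  "focus": "Modified Code"},
        "test_development":          {"range": "Month / Quarter", "focus": "Overall Untested Code"},
    }

    def settings_for(use_case: str) -> dict:
        """Look up the TGA range and focus to configure for a use case."""
        return TGA_USE_CASES[use_case]

    print(settings_for("monthly_quality_report"))
    # {'range': 'Calendar Month', 'focus': 'Modified Code'}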

Tutorial: Definition of Done for Sprint Quality

Go to the TGA Report by clicking the options menu in the upper-right corner of the screen and then clicking "TGA Report".

Level 1 - Application

  1. Identify your relevant app/branch

  2. Review the test gaps summary of your app (high level)

    1. The columns present the number of UNTESTED METHODS per category:

      1. Modified + Used in Production

      2. Modified

      3. Used in Production

      4. Overall untested (not affected by the reporting period)

    2. The numbers in each column represent the total number of untested methods

    3. The percentage in each column represents the share of untested methods within that category (see the worked example after this list)

    4. Note: the "Used in Production" columns are optional and are displayed only when the production agent is installed and reporting data

  3. Adjust the reporting period

    1. Click on the refresh icon to the right of the time range of the relevant app, then pick a new range of dates to be analyzed

  4. Drill down by clicking on the app name
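
To make the column semantics concrete, here is a small worked example, assuming the percentage is computed as untested methods out of all methods in that category. All figures are made up; the Level 1 view computes the real numbers for you.

    # Hypothetical Level 1 figures for one app/branch:
    # category -> (untested methods, total methods in that category)
    categories = {
        "Modified + Used in Production": (12, 40),
        "Modified": (45, 120),
        "Used in Production": (30, 200),
        "Overall untested": (310, 1500),
    }

    for name, (untested, total) in categories.items():
        pct = 100.0 * untested / total   # percentage of untested methods
        print(f"{name}: {untested} untested ({pct:.1f}%)")

    # e.g. "Modified: 45 untested (37.5%)" means roughly a third of the
    # methods touched in the reporting period were never executed by any test.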


Level 2 - Files

  1. Review whether test gaps are found in important areas of the code (e.g. logic, calculations, new code, modified code, etc.) and focus on those areas

  2. Sort and filter in order to drill down into specific areas of your code which will help you identify the areas of code to focus on:

    1. Use the "Search" bar to focus on important classes/files (Logic, Calculation, etc…)

    2. Use the "Test Stage" drop-down button to review the untested files/methods per specific test stage. Focus on areas which are sensitive to Integration Tests:

      1. Priority 0 - Untested by all test stages

      2. Priority 1 - Untested by a specific test stage, e.g. Integration/Automation/Regression Tests (the rationale is to exclude unit tests; see the classification sketch after this list)

    3. In the case of a legacy app (low code-change volume), focus on the modified areas
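
The Priority 0 / Priority 1 distinction above can be stated precisely. A minimal Python sketch, assuming per-method coverage data keyed by test stage; the data structures and method names are hypothetical, not a SeaLights export:

    # For each method, the set of test stages that executed it
    # (hypothetical data; an empty set means untested everywhere).
    coverage = {
        "OrderService.calculateTotal": set(),
        "OrderService.applyDiscount": {"Unit Tests"},
        "InvoiceFormatter.render": {"Unit Tests", "Integration Tests"},
    }

    def priority(stages_covering, focus_stage="Integration Tests"):
        """Classify a method per the drill-down rationale above."""
        if not stages_covering:
            return "Priority 0"   # untested by all test stages
        if focus_stage not in stages_covering:
            return "Priority 1"   # untested by the selected stage (unit tests excluded)
        return None               # covered by the stage you care about

    for method, stages in coverage.items():
        print(method, "->", priority(stages))
    # OrderService.calculateTotal -> Priority 0
    # OrderService.applyDiscount -> Priority 1
    # InvoiceFormatter.render -> None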

Level 3 - Methods

  1. Drill down into the relevant file by clicking the file name in the TGA report

  2. Click on the SCM link for the file you want to explore further

  3. You will be taken to your code repository, where you can view the specified file within your SCM.

    1. You can see the TGA insights inline with your code using the SeaLights "Code Viewer", part of the SeaLights Chrome Extension Suite (see the section Steps to Take Before You Start Using Test Gap Analytics)

    2. SCMs supported by the SeaLights Code Viewer: GitHub, Bitbucket, GitLab, TFS/TFVC

  4. Use the test stage drop-down button to focus on a specific test stage

    1. e.g. to focus on areas which are not tested by Manual Tests, pick the "Manual Tests" value from the drop-down

  5. Review the file and its quality risks, and decide which of these should be addressed within the current or upcoming sprint

  6. Control and Feedback Loop

    1. Repeat the process in future sprints to verify that the gaps have actually been closed by your Dev/QA team and to identify whether new test gaps have been created (see the comparison sketch after this list)

    2. Create a monthly report which includes the relevant sprint
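
One way to run this control loop mechanically is to compare the untested-method lists from two consecutive TGA reports. A minimal Python sketch, assuming you have exported those lists yourself; the method names are hypothetical and this is not a SeaLights export format:

    # Untested methods at the end of two consecutive sprints (hypothetical).
    previous_gaps = {"OrderService.calculateTotal", "OrderService.applyDiscount"}
    current_gaps = {"OrderService.applyDiscount", "InvoiceFormatter.render"}

    closed_gaps = previous_gaps - current_gaps    # gaps the team actually closed
    new_gaps = current_gaps - previous_gaps       # gaps introduced this sprint
    carried_over = previous_gaps & current_gaps   # gaps still open

    print("closed:", sorted(closed_gaps))         # ['OrderService.calculateTotal']
    print("new:", sorted(new_gaps))               # ['InvoiceFormatter.render']
    print("still open:", sorted(carried_over))    # ['OrderService.applyDiscount']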


Appendix A - Cleaning and Ignoring irrelevant code
Appendix B - Adding an application to the Test Gap Analytics view
