This article explains how to use Test Gap Analytics to improve your app quality and close your test gaps.

  • Steps to Take Before You Start Using Test Gap Analytics:

    1. Scope in on what matters to you

      • "Clean" the irrelevant data - see How to Ignore Irrelevant Code.

    2. Create a report scheduler

      • Align with your sprint schedule (or schedule a monthly report).

    3. Add your app & branch to the Test Gap Analytics view - you can pick any configured app and branch and create a time frame for the Test Gap report - see Adding an application to the Test Gap Analytics view.

    4. Install the SeaLights Code Viewer Chrome Extension on your personal computer with these instructions.
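When aligning the report scheduler with your sprint cadence, the reporting window is simply the sprint's date range. As a minimal sketch (the function name and the two-week default are illustrative assumptions, not part of the SeaLights product):

```python
from datetime import date, timedelta

def sprint_window(sprint_end: date, sprint_length_days: int = 14) -> tuple[date, date]:
    """Return the (start, end) dates of the sprint ending on sprint_end.

    Illustrative helper for picking a sprint-aligned TGA reporting window.
    """
    start = sprint_end - timedelta(days=sprint_length_days - 1)
    return start, sprint_end

# A two-week sprint ending 2024-06-14 spans 2024-06-01..2024-06-14
start, end = sprint_window(date(2024, 6, 14))
print(start.isoformat(), end.isoformat())  # 2024-06-01 2024-06-14
```

For a monthly report, you would instead pass the first and last day of the calendar month as the window.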

  • Use Cases
    The three main use cases of working with Test Gap Analytics are:

    1. Definition of Done for Sprint Quality

      • Validate that the code which was added in the last sprint is tested

      1. TGA Range - Sprint schedule

      2. Focus - Modified Code

    2. Monthly Quality Report

      • Analyze the quality performance of your apps/teams

      1. TGA Range - Calendar Month

      2. Focus - Modified Code

    3. Test Development

      • Identify the high-risk code areas and create a test plan based on concrete data (e.g. focus on Features, Classes or Functional areas)

      1. TGA Range - Month / Quarter

      2. Focus - Overall Untested Code
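The three use cases above differ only in their TGA range and focus, so they can be summarized as plain data. A minimal sketch (the dictionary keys and field names are illustrative, not a SeaLights configuration format):

```python
# The three main TGA use cases, each defined by a reporting range and a focus.
TGA_USE_CASES = {
    "sprint_definition_of_done": {"range": "sprint schedule", "focus": "modified code"},
    "monthly_quality_report":    {"range": "calendar month",  "focus": "modified code"},
    "test_development":          {"range": "month / quarter", "focus": "overall untested code"},
}

for name, cfg in TGA_USE_CASES.items():
    print(f"{name}: range={cfg['range']}, focus={cfg['focus']}")
```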

Tutorial: Definition of Done for Sprint Quality

Go to the TGA report by clicking the options menu in the upper-right corner of the screen (below your name), then click "Test Gap Analytics".

Level 1 - Application

...

  1. Identify your relevant app/component

  2. Review the test gaps summary of your app (high level)

    1. The columns present the number of UNTESTED METHODS per category:

      1. Modified + Used in Production

      2. Modified

      3. Used in Production

      4. Overall untested (not affected by the reporting period)

    2. The number in each column represents the total count of untested methods

    3. The percentage in each column represents the percentage of untested methods

    4. Note: The "Used in Production" columns are optional and are displayed only when the production agent is installed.
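Each column's percentage is just the untested count divided by the total number of methods in that category. As a quick sketch of the arithmetic (the function name and rounding are illustrative assumptions):

```python
def untested_percentage(untested: int, total: int) -> float:
    """Percentage of untested methods in a category, as shown in a TGA column."""
    if total == 0:
        return 0.0
    return round(100 * untested / total, 1)

# Example: 30 untested methods out of 120 modified methods
print(untested_percentage(30, 120))  # 25.0
```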

  3. Adjust the reporting period

...

  1. Click the refresh icon to the right of the relevant app's time range, then pick a new date range to analyze.

  2. Drill down by clicking on the app name

...


Level 2 - Files

  1. Review if Test Gaps are found in important areas of the code (e.g. Logic, Calculation, New code, Modified Code, etc…) and focus on those areas

  2. Sort and filter

...

  1. Sort and filter to drill down into specific areas of your code; this will help you identify which areas to focus on

    1. Use the "Search" bar to focus on important classes/files (Logic, Calculation, etc…)

    2. Use the "Test Stage" drop-down button to review the untested files/methods per specific test stage. Focus on areas which are sensitive to Integration Tests.

      1. Priority 0 - Untested by all test stages

      2. Priority 1 - Untested by Integration/Automation/Regression Tests (the rationale is to exclude unit tests)

    3. In case of a Legacy app (low code volume), focus on the modified areas (orange column in the TGA report)


Level 3 - Methods

...

  1. Drill down into the relevant file by clicking the file name in the TGA report

  2. You will be linked to your code repository and can open the specific file within your SCM.

    1. You can see the TGA insights inline with your code using the SeaLights "Code Viewer", part of the SeaLights Chrome Extension Suite (see the section Steps to Take Before You Start Using Test Gap Analytics)

    2. SCMs supported by the SeaLights Code Viewer: GitHub, Bitbucket, GitLab, TFS/TFVC

  3. Use the test stage drop-down button to focus on a specific test stage

    1. e.g. In order to focus on areas which are not tested by Manual Tests → pick the "Manual Tests" value from the drop-down

  4. Review the file and Quality Risks and decide which of these should be addressed within the current/upcoming Sprint

  5. Control and Feedback Loop

    1. Repeat the process in future sprints to verify that the gaps have actually been closed by your Dev/QA team and to identify whether new test gaps have been created.

    2. Create a monthly report which includes the relevant sprint.

...



Appendix A - Cleaning and Ignoring irrelevant code
Appendix B - Adding an application to the Test Gap Analytics view