Quality Analytics - FAQ

This feature is available only to customers who have consented to the use of the Snowflake database. To learn more, please contact Customer Success.

Below you’ll find answers to the questions we get asked the most about Quality Analytics.

In order to benefit from the Quality Analytics feature, you must first consent to the use of the Snowflake database. Please contact Customer Success for more information and to advance the process.

There might be a few reasons for that:

  • When you are viewing a coverage trend report with 'All Builds' and an interval longer than or equal to the date range (e.g., a 1-month interval on a date range of 3 weeks).

  • When you are viewing a group coverage trend report with an interval longer than or equal to the date range (e.g., a 1-month interval on a date range of 3 weeks).

  • When you are viewing a coverage trend report with 'Reference Builds' and only one build is selected.

  • When no builds were found in the selected date range.

Yes. When no builds are found for a specific interval point, that point is still shown on the x-axis but is skipped on the chart line (the line becomes dotted across the gap).
The same applies to the Modified Coverage chart, in both the group trend report and the coverage trend report (whether in 'All Builds' or 'Reference Builds' mode), when there were no code changes in an interval / reference build.
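As an illustration of the behavior described above (a minimal sketch with made-up data, not SeaLights code), an interval with no builds can be modeled as a None value in the chart series — the interval keeps its x-axis position but produces no data point:

```python
# Hypothetical interval data: an interval with no builds has no coverage value.
intervals = ["Week 1", "Week 2", "Week 3", "Week 4"]
coverage_by_interval = {"Week 1": 0.42, "Week 3": 0.45, "Week 4": 0.47}  # Week 2 had no builds

# Every interval stays on the x-axis; intervals with no builds become None
# points, which charting libraries typically render as a gap or dotted segment.
series = [coverage_by_interval.get(i) for i in intervals]
print(series)  # [0.42, None, 0.45, 0.47]
```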

In the coverage trend report's 'All Builds' mode and in the group coverage report, the calculation differs from the Dashboard. Interval X takes all code changes since the last build in interval X-1, and takes all coverage from all related test stages (for the trend report) or from all selected apps/branches (for the group coverage report) that was reported in all builds within interval X. For example, with a 1-month interval, coverage in July means all code changes since the last build in June, together with their coverage from all test-stage executions reported on builds within July (for the trend report) or from all selected apps/branches (for the group trend report).
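The interval calculation above can be sketched as follows. This is a simplified illustration with hypothetical build data: method-name sets stand in for the real coverage model, and the build structure is an assumption, not SeaLights internals.

```python
from datetime import date

# Hypothetical builds: each carries the methods changed since the previous
# build and the methods covered by its reported test-stage executions.
builds = [
    {"date": date(2023, 7, 3),  "changed": {"c"},      "covered": {"b", "c"}},
    {"date": date(2023, 7, 20), "changed": {"d", "e"}, "covered": {"d"}},
]

def interval_coverage(builds, start, end):
    """Coverage for the interval [start, end): code changes accumulate across
    all builds inside the interval (i.e. everything changed since the last
    build of the previous interval); coverage counts if it was reported by
    any build inside the interval."""
    in_interval = [b for b in builds if start <= b["date"] < end]
    changed = set().union(*(b["changed"] for b in in_interval)) if in_interval else set()
    covered = set().union(*(b["covered"] for b in in_interval)) if in_interval else set()
    if not changed:
        return None  # no code changes -> point is skipped on the chart
    return len(changed & covered) / len(changed)

# July: changed = {c, d, e}, covered among those = {c, d} -> 2/3
july = interval_coverage(builds, date(2023, 7, 1), date(2023, 8, 1))
print(round(july, 2))  # 0.67
```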
In the coverage trend report's 'Reference Builds' mode, the calculation is similar to the Dashboard calculation when comparing two builds.

In the coverage trend report's 'Reference Builds' mode, the calculation is similar to the Dashboard calculation when comparing two builds.

In the coverage trend report's 'All Builds' mode, and in the group coverage report, the calculation of each interval is similar to the TGA calculation. The main difference is that the Quality Analytics reports include multiple intervals.

Yes. Deselecting a test stage in a trend report removes the test stage line from the chart and deducts the test stage data from the aggregated coverage line.
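For example (a minimal sketch with hypothetical stage data, not SeaLights internals), the aggregated line can be modeled as the union of methods covered by the currently selected stages, so deselecting a stage deducts exactly its unique contribution:

```python
# Hypothetical per-stage coverage: which of 10 methods each stage covers.
stage_coverage = {
    "unit":        {"m1", "m2", "m3"},
    "integration": {"m2", "m4"},
    "e2e":         {"m5"},
}
TOTAL_METHODS = 10

def aggregated_coverage(selected_stages):
    """Aggregated coverage = fraction of methods covered by any selected stage."""
    covered = set().union(*(stage_coverage[s] for s in selected_stages)) if selected_stages else set()
    return len(covered) / TOTAL_METHODS

print(aggregated_coverage(["unit", "integration", "e2e"]))  # 0.5
print(aggregated_coverage(["unit", "integration"]))         # 0.4  ('e2e' deselected)
```

Note that 'integration' overlaps 'unit' on method m2, so deselecting it would deduct only its unique method m4 from the aggregate.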

If you use the TIA tool when running your test suites, you should expect drops in overall coverage whenever TIA was ON and recommended running only a subset of your tests. These drops mean that your coverage was temporarily lower than usual.

You can easily set a build as a reference build on the Dashboard: when you hover over any build in the list, a flag icon appears on its right side, and clicking the icon sets the build as a reference build. Alternatively, you can use our APIs to do it automatically.

To let you create analytics reports instantly, SeaLights performs massive daily aggregations of your coverage data. To support this, the start date for the data shown in the charts is currently limited to the following Monday or the 1st of the month.

When you select a 1-month interval for your trend or group trend report, the data shown in the chart starts on the 1st of the relevant month. For other interval options, the data starts on the following Monday.
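That start-date snapping can be sketched as follows. This is illustrative only: the exact rounding rule (keeping a date that already sits on the boundary, otherwise rounding forward) is an assumption based on the description above, not confirmed SeaLights behavior.

```python
from datetime import date, timedelta

def chart_start(selected_start: date, interval: str) -> date:
    """Snap the selected start date: a 1-month interval starts on the 1st of
    a month; any other interval starts on a Monday. Dates already on the
    boundary are kept; otherwise we assume rounding forward."""
    if interval == "1 month":
        if selected_start.day == 1:
            return selected_start
        year, month = selected_start.year, selected_start.month
        # First day of the following month (handling the December rollover).
        return date(year + (month == 12), month % 12 + 1, 1)
    # weekday() == 0 is Monday; move forward to the next Monday if needed.
    return selected_start + timedelta(days=(7 - selected_start.weekday()) % 7)

print(chart_start(date(2023, 7, 5), "1 week"))   # 2023-07-10 (the following Monday)
print(chart_start(date(2023, 7, 5), "1 month"))  # 2023-08-01
```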

If your sprints start on a Wednesday, for instance, and you wish to create a report for a specific sprint, you can use the trend report with the reference-builds setting as a workaround: mark the relevant sprint builds as reference builds, then select the 'Reference Builds' option under the Builds section in the trend report's left filter pane.

Quality Analytics reports comply with the ignored-code rules defined by the user under Settings > Data Scope > Ignored Code. So while the Dashboard shows the total number of methods in the code, the report shows only the total number of non-ignored methods. TGA works in a similar way to Quality Analytics, so when comparing the two you should find the same number of methods reported.
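As an illustration of how such rules reduce the reported method count (a sketch using a hypothetical glob-style rule; SeaLights' actual rule syntax and matching may differ):

```python
import fnmatch

# Hypothetical method list and a hypothetical ignored-code rule.
methods = [
    "com.app.Service.run",
    "com.app.generated.Stub.call",
    "com.app.Util.parse",
]
ignored_rules = ["com.app.generated.*"]

def unignored(methods, rules):
    """Keep only methods not matched by any ignored-code rule."""
    return [m for m in methods if not any(fnmatch.fnmatch(m, r) for r in rules)]

# The Dashboard would show all 3 methods; the report counts only the 2
# methods that are not ignored.
print(len(methods), len(unignored(methods, ignored_rules)))  # 3 2
```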