
This feature is available only to customers who have consented to the use of the Snowflake database. To learn more, please contact Customer Success.

Below you’ll find answers to the questions we get asked the most about Quality Analytics.


The start date of the data shown in the chart is different than the report date range. Why?

To let you create analytics reports instantly, SeaLights performs massive daily aggregations of your coverage data. To support this, the start date of the data shown in the charts is currently limited to the following Monday or the 1st of the month.

When you select a 1-month interval for your trend or group trend report, the data shown in the chart starts on the 1st of the relevant month. For all other interval options, the data shown in the chart starts on the following Monday.
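The date-snapping rule described above can be sketched as follows. This is an illustrative assumption, not SeaLights code: the function name, signature, and the reading of "following Monday" as the next Monday on or after the selected date are all hypothetical.

```python
from datetime import date, timedelta

def chart_start_date(selected_start: date, interval: str) -> date:
    """Hypothetical sketch of the chart start-date rule.

    For a 1-month interval the chart starts on the 1st of the selected
    month; for any other interval it starts on the following Monday.
    """
    if interval == "1 month":
        # Snap back to the 1st of the relevant month.
        return selected_start.replace(day=1)
    # Snap forward to the next Monday on or after the selected date
    # (Monday is weekday 0 in Python's datetime module).
    days_until_monday = (7 - selected_start.weekday()) % 7
    return selected_start + timedelta(days=days_until_monday)
```

For example, a report starting Wednesday, May 15, 2024 with a 1-month interval would chart data from May 1, while any other interval would chart data from Monday, May 20.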

If your sprints start on a Wednesday, for instance, and you wish to create a report for a specific sprint, you can use the trend report with the reference-builds setting as a workaround: mark the relevant sprint builds as reference builds, then select the “Reference build” option under the Builds section in the trend report’s Filter pane on the left.

Why are there differences between the number of methods shown in the report vs. the number shown in the Dashboard?

Quality Analytics reports comply with the ignored-code rules defined by the user under Settings > Data Scope > Ignored Code. So while the Dashboard shows the total number of methods in the code, the report shows only the methods that are not ignored. TGA works in a similar way to Quality Analytics, so when comparing the two you should find the same number of methods reported.