QA/Test Report Tool
General Expectations
Here is a list of initial expectations for the Test Reporting Tool (TRT).
- The TRT must be able to receive test reports via a remote communication protocol:
  - from expected, authenticated sources of results
  - into existing test buckets (a logical grouping of tests, for example by test component)
  - using existing test report formats such as XML (see the submission sketch after this list)
- The TRT should present an interface with views organized around the following objectives:
  - Test Buckets (browsing a component's test result history)
  - Test Collections (a milestone, a release, or another defined grouping of runs)
- The TRT should be able to identify the type of result it is consuming:
  - Boolean test results (passed/failed)
  - Numeric test results (measurements)
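To make the expectations above more concrete, here is a minimal sketch of submitting a report to the TRT. Everything specific in it is an assumption rather than a defined interface: the host name, the bucket/collection path scheme, the bearer-token authentication, and the JUnit-style XML payload are all illustrations. The time attribute shows where a numeric measurement, as opposed to a boolean pass/fail result, could be carried.

    import urllib.request

    # Assumed values for illustration only: the TRT host, the bucket/collection
    # path scheme and the token are not part of any defined interface.
    TRT_URL = "https://trt.example.org/api/buckets/oe-selftest/collections/M1/reports"
    AUTH_TOKEN = "example-token"

    # A JUnit-style payload. The pass/fail outcome is a boolean result; the
    # "time" attribute is an example of a numeric measurement.
    REPORT = """\
    <testsuite name="oe-selftest" tests="1" failures="0">
      <testcase classname="buildperf.Timings" name="test_core_image_minimal" time="1843.2"/>
    </testsuite>
    """

    def submit_report(url, token, payload):
        """POST an XML test report to a (hypothetical) TRT endpoint."""
        request = urllib.request.Request(
            url,
            data=payload.encode("utf-8"),
            headers={
                "Content-Type": "application/xml",
                "Authorization": "Bearer " + token,
            },
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.status

    if __name__ == "__main__":
        print(submit_report(TRT_URL, AUTH_TOKEN, REPORT))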
QA Data Sources
There are several sources of quality data we'd like to be able to track in a QA dashboard:
- oe-selftest results
- bitbake-selftest results (like oe-selftest, a unittest-based suite; see the collection sketch after this list)
- build performance metrics
- buildhistory trends
- ???
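Both oe-selftest and bitbake-selftest are built on Python's unittest framework, so one plausible way to feed a dashboard is to run a suite programmatically and turn the outcome into per-test records. The sketch below is only an illustration: the stand-in test case and the record shape are assumptions, not an agreed format.

    import json
    import unittest

    class ExampleTests(unittest.TestCase):
        """Stand-in for a unittest-based suite such as bitbake-selftest."""
        def test_always_passes(self):
            self.assertTrue(True)

        def test_always_fails(self):
            self.assertEqual(1, 2)

    def iter_cases(suite):
        """Recursively yield the individual TestCase objects in a TestSuite."""
        for item in suite:
            if isinstance(item, unittest.TestSuite):
                yield from iter_cases(item)
            else:
                yield item

    def run_and_collect(suite):
        """Run a suite and return a list of per-test outcome records."""
        # Record the test ids up front; TestSuite.run() discards its tests.
        test_ids = [case.id() for case in iter_cases(suite)]
        result = unittest.TestResult()
        suite.run(result)
        failed = {case.id() for case, _ in result.failures + result.errors}
        skipped = {case.id() for case, _ in result.skipped}
        records = []
        for test_id in test_ids:
            if test_id in failed:
                outcome = "FAILED"
            elif test_id in skipped:
                outcome = "SKIPPED"
            else:
                outcome = "PASSED"
            records.append({"test": test_id, "outcome": outcome})
        return records

    if __name__ == "__main__":
        suite = unittest.defaultTestLoader.loadTestsFromTestCase(ExampleTests)
        print(json.dumps(run_and_collect(suite), indent=2))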
Prior Art
- QA maintains QA_sanity_history reports which list sanity test results and compare them to the previous run (anyone know how these are generated?).
- Someone (?) generates graphs of the performance data.
- CustomReports includes visualisations of test run pass rates and graphs changes in success rates over time.
- Jenkins parses and displays JUnit XML output out of the box; How Jenkins Displays JUnit output gives a breakdown of how it does so (see the parsing sketch after this list).
- OpenQA from openSUSE has a visual status overview of the tests run against an installed OS. The code is on GitHub: https://github.com/os-autoinst/openQA and the initial release announcement, openSUSE Announces First Public Release of openQA, provides an overview.
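Since several of the tools above consume JUnit-style XML, a short sketch of what parsing such a report involves may be useful. The sample report below is invented for illustration; only the general JUnit testsuite/testcase structure is assumed.

    import xml.etree.ElementTree as ET

    # A tiny JUnit-style report; the suite and test case names are invented.
    SAMPLE_REPORT = """\
    <testsuite name="oe-selftest" tests="2" failures="1">
      <testcase classname="bblayers.BitbakeLayers" name="test_show_layers" time="12.3"/>
      <testcase classname="bblayers.BitbakeLayers" name="test_add_remove_layer" time="4.5">
        <failure message="AssertionError: layer still present">traceback text here</failure>
      </testcase>
    </testsuite>
    """

    def summarise(junit_xml):
        """Return (passed, failed) counts for a JUnit-style testsuite document."""
        suite = ET.fromstring(junit_xml)
        passed = failed = 0
        for case in suite.iter("testcase"):
            # A testcase containing a <failure> or <error> element did not pass.
            if case.find("failure") is not None or case.find("error") is not None:
                failed += 1
            else:
                passed += 1
        return passed, failed

    if __name__ == "__main__":
        print(summarise(SAMPLE_REPORT))  # prints (1, 1)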