QA/Automated Test Report Template
Introduction
Automated test reporting makes it possible to automate the processes that follow it as well. In particular, results collection should be handled by consumers that need no manual interaction to be fed with results data. As with any other process flow, standardizing the data that flows through it minimizes the number of distinct inputs to be consumed, which translates directly into efficiency.
Test and Performance Standard Report
Performance measurements and software testing share many steps in their processes, but they differ in an important way in the results they provide: performance results are expressed as scalar quantities, while testing output is expressed as one of a predefined set of verdicts.
Under the premise that most of the information in a performance report and a testing report is alike, with the notation of the results being the main difference, it should be feasible to use a single report specification that accounts for both.
JUnit-XML Performance/Testing Report
QA testing has moved towards adopting the JUnit-XML reporting standard, and work is in progress to define the extensions required to use that format for performance measurements as well.
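As an illustration, the sketch below shows how a plain JUnit-XML report could be produced with Python's standard xml.etree.ElementTree module. The suite and case names, values, and output file name are made up for the example; this is not the oeqa tooling, only the baseline format that the extensions described below build on.

# Minimal, hypothetical sketch of emitting a plain JUnit-XML report with the
# Python standard library. Names and values are illustrative only.
import xml.etree.ElementTree as ET

testsuites = ET.Element('testsuites')
suite = ET.SubElement(testsuites, 'testsuite', name='oeqa.example',
                      tests='2', failures='1', errors='0', skipped='0',
                      time='3.5', timestamp='2017-01-13T14:13:00')

# A passing case carries only its identity and duration.
ET.SubElement(suite, 'testcase', classname='oeqa.example.TestFoo',
              name='test_pass', time='1.2')

# A failing case adds a <failure> child with the reason.
failed = ET.SubElement(suite, 'testcase', classname='oeqa.example.TestFoo',
                       name='test_fail', time='2.3')
ET.SubElement(failed, 'failure', type='AssertionError').text = 'expected 1, got 2'

ET.ElementTree(testsuites).write('results.xml', encoding='utf-8',
                                 xml_declaration=True)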
Execution Information Template
There is a data set that is common to every test execution performed on a subject, such as:
- Test subject information
- OS version
- Hardware description
Additional fields will be defined in the Execution Information Template.
Proposal composed from the discussion in bug 9954:
<?xml version="1.0" ?>
<metadata>
  <hostname>mqz-osx-suse64</hostname>
  <distro>
    <pretty_name>poky 2.2</pretty_name>
    <id>poky</id>
    <version_id>2.2</version_id>
  </distro>
  <host_distro>
    <id>opensuse</id>
    <version_id>42.1</version_id>
    <pretty_name>openSUSE Leap 42.1 (x86_64)</pretty_name>
  </host_distro>
  <layers>
    <layer name="meta">
      <commit>c8bd57d7cf127b5ee315597408940870882a10e0</commit>
      <commit_count>34619</commit_count>
      <branch>feature/oeqa-metaxml</branch>
    </layer>
    <layer name="meta-selftest">
      <commit>c8bd57d7cf127b5ee315597408940870882a10e0</commit>
      <commit_count>34619</commit_count>
      <branch>feature/oeqa-metaxml</branch>
    </layer>
    <layer name="meta-foobar"/>
  </layers>
  <bitbake>
    <commit>058f8517c041b80e8b591ad7d34a68281b2d03fc</commit>
    <commit_count>6577</commit_count>
    <branch>master</branch>
  </bitbake>
  <config>
    <variable name="BB_NUMBER_THREADS">2</variable>
    <variable name="MACHINE">qemux86</variable>
    <variable name="PARALLEL_MAKE">-j 2</variable>
  </config>
</metadata>
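As an illustration of how this metadata could be gathered on the build host, the sketch below uses Python's standard library to read the hostname, the host's /etc/os-release, and the git state of a layer checkout. The layer path, the output file name, and the helper function are hypothetical; the actual oeqa implementation may collect this differently.

# Hypothetical sketch of collecting execution metadata; not the oeqa code.
import socket
import subprocess
import xml.etree.ElementTree as ET

def git_info(repo_dir):
    """Return (commit, commit count, branch) of a checked-out repository."""
    def run(*args):
        return subprocess.check_output(('git', '-C', repo_dir) + args,
                                       universal_newlines=True).strip()
    return (run('rev-parse', 'HEAD'),
            run('rev-list', '--count', 'HEAD'),
            run('rev-parse', '--abbrev-ref', 'HEAD'))

metadata = ET.Element('metadata')
ET.SubElement(metadata, 'hostname').text = socket.gethostname()

# Host distro information from /etc/os-release.
host_distro = ET.SubElement(metadata, 'host_distro')
with open('/etc/os-release') as f:
    os_release = dict(line.rstrip().split('=', 1) for line in f if '=' in line)
for key in ('ID', 'VERSION_ID', 'PRETTY_NAME'):
    ET.SubElement(host_distro, key.lower()).text = os_release.get(key, '').strip('"')

# Layer state from the git checkouts; the path here is a placeholder.
layers = ET.SubElement(metadata, 'layers')
for name, path in [('meta', '/path/to/meta')]:
    layer = ET.SubElement(layers, 'layer', name=name)
    commit, commit_count, branch = git_info(path)
    ET.SubElement(layer, 'commit').text = commit
    ET.SubElement(layer, 'commit_count').text = commit_count
    ET.SubElement(layer, 'branch').text = branch

ET.ElementTree(metadata).write('metadata.xml', encoding='utf-8',
                               xml_declaration=True)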
Report Template
See the test report template here.
This template is a work in progress and will be extended to cover performance measurements.
QA Sample report
See a sample report of a QA test here.
Build Performance Sample Report
Proposed XML format for build performance reports. It follows the JUnit-XML specification with the following extensions:
- a description attribute in <testcase>
- <sysres> and <diskusage> elements under <testcase>
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite errors="1" failures="1" hostname="marquiz" name="oeqa.buildperf" skipped="1" tests="6" time="22.347736" timestamp="2016-11-14T11:18:03.419477">
    <testcase classname="oeqa.buildperf.test_basic.Test1P1" description="Measure wall clock of bitbake core-image-sato and size of tmp dir" name="test1" time="4.575692" timestamp="2016-11-14T13:18:03.419602">
      <sysres legend="bitbake quilt-native" name="build">
        <time>2.269468</time>
        <buildstats_file>test1/buildstats.bitbake-quilt-native.json</buildstats_file>
        <iostat cancelled_write_bytes="24576" rchar="33302047" read_bytes="0" syscr="53293" syscw="1717" wchar="4318553" write_bytes="327680"/>
        <rusage ru_inblock="0" ru_majflt="0" ru_maxrss="112232" ru_minflt="101886" ru_nivcsw="84" ru_nvcsw="9542" ru_oublock="640" ru_stime="0.16799999999999998" ru_utime="2.488"/>
      </sysres>
      <diskusage legend="tmpdir" name="tmpdir">
        <size>32044</size>
      </diskusage>
    </testcase>
    <testcase classname="oeqa.buildperf.test_basic.Test1P2" description="Measure bitbake virtual/kernel" name="test12" time="4.430943" timestamp="2016-11-14T13:18:07.995385">
      <sysres legend="bitbake quilt-native" name="build">
        <time>2.219579</time>
        <iostat cancelled_write_bytes="24576" rchar="33302771" read_bytes="0" syscr="52540" syscw="1717" wchar="4319267" write_bytes="327680"/>
        <rusage ru_inblock="0" ru_majflt="0" ru_maxrss="112060" ru_minflt="101833" ru_nivcsw="80" ru_nvcsw="9478" ru_oublock="640" ru_stime="0.208" ru_utime="2.408"/>
      </sysres>
    </testcase>
    <testcase classname="oeqa.buildperf.test_basic.Test1P3" description="" name="test13" time="2.278677" timestamp="2016-11-14T13:18:12.426471">
      <failure message="" type="AssertionError">Traceback (most recent call last):
  File "/home/marquiz/yocto/openembedded-core/meta/lib/oeqa/buildperf/test_basic.py", line 54, in test13
    assert False
AssertionError
      </failure>
    </testcase>
    ...
  </testsuite>
</testsuites>
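To show how a results consumer could process this extended format without manual interaction, the sketch below walks the JUnit-XML tree with Python's standard library and pulls the <sysres> and <diskusage> measurements out as plain values. The input file name is assumed for the example.

# Hypothetical consumer sketch for the extended JUnit-XML format above.
import xml.etree.ElementTree as ET

tree = ET.parse('buildperf-results.xml')
for case in tree.iter('testcase'):
    ident = '{}.{}'.format(case.get('classname'), case.get('name'))
    for sysres in case.findall('sysres'):
        # <time> holds the elapsed time of the measured command.
        print('{}: {} took {}s'.format(ident, sysres.get('name'),
                                       sysres.findtext('time')))
    for du in case.findall('diskusage'):
        # <size> holds the measured disk usage, in the unit the producer used.
        print('{}: {} size {}'.format(ident, du.get('name'),
                                      du.findtext('size')))
    if case.find('failure') is not None:
        print('{}: FAILED'.format(ident))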