Ptest

From Yocto Project
=Introduction=


Ptest (''package test'') is a concept for building, installing and running the test suites that are included in many upstream packages, and producing a consistent output format for the results.


=Adding ptest to your build=


To add package testing to your build, set the <code>DISTRO_FEATURES</code> and <code>EXTRA_IMAGE_FEATURES</code> variables in your <code>conf/local.conf</code> file, which is found in the Build Directory:
 
DISTRO_FEATURES:append = " ptest"
or, on releases older than 3.4 ("honister"), the legacy override syntax:
DISTRO_FEATURES_append = " ptest"
# Adding "ptest" to your DISTRO_FEATURES variable causes -ptest packages to be built.
 
EXTRA_IMAGE_FEATURES += "ptest-pkgs"
# Adding "ptest-pkgs" to IMAGE_FEATURES variable causes all -ptest packages corresponding with other packages installed in your image to be additionally installed.
 
As an alternative to the <code>ptest-pkgs</code> image feature, if you want to be more selective about which tests to include, you can install them individually. For example, to install the tests for e2fsprogs and zlib:
 
CORE_IMAGE_EXTRA_INSTALL += "e2fsprogs-ptest zlib-ptest"
 
This is just one way to add packages to an image; see the [http://www.yoctoproject.org/docs/current/dev-manual/dev-manual.html#usingpoky-extend-customimage Customizing images] section of the Yocto Project development manual for more information.
 
All ptest files are installed in <code>/usr/lib/<package>/ptest</code>.


=Running ptest=


The "ptest-runner" package installs a "ptest-runner" utility which loops through all installed ptest test suites and runs them in sequence. You may therefore want to add "ptest-runner" to your image.
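For example, one way to do this (assuming you are customizing an image through <code>local.conf</code>; an image recipe works equally well) is:

 IMAGE_INSTALL:append = " ptest-runner"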
 
==ptest-runner Usage==
 
The usage for the ptest-runner is as follows:
 
Usage: ptest-runner [-d directory] [-l list] [-t timeout] [-h] [ptest1 ptest2 ...]
 
==Executing ptest-runner==


To execute a full ptest run, run the following command anywhere on the target:
 
$ ptest-runner
 
Remember to redirect the output to a file if you want to keep a log.
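As a sketch of that workflow, the snippet below captures a run to a log file and then summarizes the results. Since <code>ptest-runner</code> only exists on the target, a stand-in function simulates its output here; on a real target you would simply run <code>ptest-runner > ptest.log 2>&1</code>.

```shell
#!/bin/sh
# "simulate_ptest_runner" is a stand-in for the real ptest-runner,
# which is only available on the target device.
simulate_ptest_runner() {
    echo "PASS: test_alpha"
    echo "FAIL: test_beta"
    echo "SKIP: test_gamma"
}

# On the target: ptest-runner > ptest.log 2>&1
simulate_ptest_runner > ptest.log 2>&1

# Summarize the saved log: count results per category
for r in PASS FAIL SKIP; do
    printf '%s: %s\n' "$r" "$(grep -c "^$r:" ptest.log)"
done
```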
 
=Implementing ptest in a package recipe=


==What constitutes a ptest?==


A ptest must at minimum contain two things: <code>run-ptest</code> and the actual test.


<code>run-ptest</code> is a minimal shell script that ''starts'' the test suite. Note: It must not ''contain'' the test suite, only start it!
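As an illustration (not taken from any real recipe), a minimal <code>run-ptest</code> might look like the sketch below. Here <code>check_foo</code> is a hypothetical stand-in for the package's real test entry point; a real <code>run-ptest</code> would only start the installed suite.

```shell
#!/bin/sh
# Minimal run-ptest sketch: start the test and translate its exit
# status into the ptest output format. "check_foo" stands in for the
# package's real test program.
check_foo() { true; }

if check_foo; then
    echo "PASS: check_foo"
else
    echo "FAIL: check_foo"
fi
```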


The test can be anything, from a simple shell script running a binary and checking its output to an elaborate system of test binaries and data files.
One major point of ptest is to consolidate the output format of all tests into a single common format. The format selected is the automake "simple test" format:


 result: testname


Where "result" is one of PASS, FAIL or SKIP and "testname" can be any identifying string (the same format [http://www.gnu.org/software/automake/manual/automake.html#Simple-Tests used by Automake]).


==General recipe preparations==


First, add <code>ptest</code> to the <code>inherit</code> line in the recipe.


If a test adds build-time or run-time dependencies to the package which are not there normally (such as requiring "make" to run the test suite), add those with a -ptest suffix, like this:


 RDEPENDS:${PN}-ptest += "make"


==Building the test suite==
Few packages support cross-compiling their test suites, so this is something we typically have to add.


Many automake-based packages compile and run the test suite in a single command: "make check". This doesn't work when cross-compiling, since we need to build on host and run on target. So we need to split that into two targets: one for building the test and one for running it. Our build of automake comes with a [http://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/meta/recipes-devtools/automake/automake/buildtest.patch patch] which does this, so packages using the plain "make check" arrangement from automake get this automatically.


Now add a do_compile_ptest function to build the test suite:


 do_compile_ptest() {
     oe_runmake buildtest-TESTS
 }
If the package requires special configuration actions prior to compiling the test code, create a do_configure_ptest function to do that.
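For instance, a hypothetical recipe fragment could look like the following; the <code>tests</code> subdirectory and its <code>config</code> target are assumptions for illustration, not from any real package:

 do_configure_ptest() {
     # Regenerate the test suite's configuration before cross-compiling it
     # (the "tests" directory and "config" target are hypothetical)
     oe_runmake -C ${S}/tests config
 }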
 
Note that <code>buildtest-TESTS</code> requires automake's [https://www.gnu.org/software/automake/manual/html_node/Serial-Test-Harness.html serial test harness], whereas the parallel harness is usually enabled by default. In that case, you may have to modify <code>configure.ac</code> by adding the <code>serial-tests</code> option to <code>AM_INIT_AUTOMAKE</code>:
 
AM_INIT_AUTOMAKE([serial-tests])
 
Recursive targets may not work out of the box either; in that case, also try adding the following to <code>configure.ac</code>:
 
AM_EXTRA_RECURSIVE_TARGETS([buildtest-TESTS])


==Installing the test suite==


The <code>ptest</code> class will automatically copy the required <code>run-ptest</code> file and run <code>make install-ptest</code> if there is such a target in the top-level <code>Makefile</code>. For packages where this is enough, you don't need to do anything else.
 
If you need to do something more than the automatic <code>make install-ptest</code>, create a <code>do_install_ptest</code> function within the recipe and put the required actions there. This function will be called after <code>make install-ptest</code> has completed.
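A hypothetical sketch of such a function (the <code>tests/data</code> directory is an assumption for illustration; <code>PTEST_PATH</code> is set by the <code>ptest</code> class):

 do_install_ptest() {
     # Copy extra test data that "make install-ptest" does not handle
     # (the tests/data directory is hypothetical)
     install -d ${D}${PTEST_PATH}/data
     cp -r ${S}/tests/data/* ${D}${PTEST_PATH}/data/
 }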
 
==Recipes with ptest Enabled==
 
You can use the layer index search function to find out which recipes have ptest enabled by searching for recipes that inherit the <code>ptest</code> class:
 
http://layers.openembedded.org/layerindex/branch/master/recipes/?q=inherits%3Aptest
 
Specifically for OE-Core:
 
http://layers.openembedded.org/layerindex/branch/master/recipes/?q=inherits%3Aptest+layer%3Aopenembedded-core
 
When updating a recipe, please consider whether you can also add a ptest package to it.
 
= QA =
 
For every milestone of a release, the QA team should execute ptest on real hardware and compare the results with the previous corresponding execution (for example, 2.3 M2 results should be compared with 2.3 M1 results). To better track all releases and executions, an [[ Ptest/archive | archive ]] page was created to record each execution and show which previous results are available for comparison.


* To check the process used to execute ptest during QA cycles, see [[BSP_Test_Plan#pTest|pTest process]]

Latest revision as of 14:36, 25 December 2023
