Ptest

From Yocto Project

Latest revision as of 14:36, 25 December 2023

Introduction

Ptest (package test) is a concept for building, installing and running the test suites that are included in many upstream packages, and producing a consistent output format for the results.

Adding ptest to your build

To add package testing to your build, set the DISTRO_FEATURES and EXTRA_IMAGE_FEATURES variables in your conf/local.conf file, which is found in the Build Directory:

DISTRO_FEATURES:append = " ptest"
or, on releases older than Honister (3.4), which use the underscore override syntax:
DISTRO_FEATURES_append = " ptest"
# Adding "ptest" to your DISTRO_FEATURES variable causes -ptest packages to be built.
EXTRA_IMAGE_FEATURES += "ptest-pkgs"
# Adding "ptest-pkgs" to your image features causes the -ptest packages corresponding to all other packages installed in your image to be installed as well.

As an alternative to the ptest-pkgs image feature, if you want to be more selective about which tests to include, you can install them individually. For example, to install the tests for e2fsprogs and zlib you would do something like this:

CORE_IMAGE_EXTRA_INSTALL += "e2fsprogs-ptest zlib-ptest"

This is of course just one way to add packages to an image; see the Customizing images section of the Yocto Project development manual for more information.

All ptest files are installed in /usr/lib/<package>/ptest.

Running ptest

The "ptest-runner" package installs a "ptest-runner" utility that loops through all installed ptest test suites and runs them in sequence. You may therefore want to add "ptest-runner" to your image.

ptest-runner Usage

The usage for the ptest-runner is as follows:

Usage: ptest-runner [-d directory] [-l list] [-t timeout] [-h] [ptest1 ptest2 ...]

Executing ptest-runner

To execute a full ptest run, issue the following command anywhere on the target:

$ ptest-runner

Remember to redirect the output to a file if you want to keep a log.

Implementing ptest in a package recipe

What constitutes a ptest?

A ptest must at minimum contain two things: run-ptest and the actual test.

run-ptest is a minimal shell script that starts the test suite. Note: It must not contain the test suite, only start it!

The test can be anything, from a simple shell script running a binary and checking its output to an elaborate system of test binaries and data files.

One major point of ptest is to consolidate the output format of all tests into a single common format. The format selected is the automake "simple test" format:

result: testname

Where "result" is one of PASS, FAIL or SKIP, and "testname" can be any identifying string (this is the same format used by Automake).
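As a concrete illustration, here is a minimal run-ptest sketch. The test name "example-test" is a placeholder, and "true" stands in for the package's real test command; the point is only to show how the exit status of a test is translated into the output format above.

```shell
#!/bin/sh
# Minimal run-ptest sketch. "true" is a stand-in for the package's
# installed test binary; replace it with the real test command.
if true; then
    result="PASS: example-test"
else
    result="FAIL: example-test"
fi
echo "$result"
```

A real run-ptest would typically loop over several test programs, emitting one "result: testname" line per test.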

General recipe preparations

First, add ptest to the inherit line in the recipe.

If a test adds build-time or run-time dependencies to the package which are not there normally (such as requiring "make" to run the test suite), add those with a -ptest suffix, like this:

RDEPENDS:${PN}-ptest += "make"
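Putting these two steps together, the relevant lines of the recipe would look something like this (the dependency on "make" is the example from above; whether your package needs it is package-specific):

```bitbake
inherit ptest

# "make" is only needed on the target when running the test suite,
# so the dependency goes on the -ptest package, not the main package.
RDEPENDS:${PN}-ptest += "make"
```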

Building the test suite

Few packages support cross-compiling their test suites, so this is something we typically have to add.

Many automake-based packages compile and run the test suite with a single command: "make check". This doesn't work when cross-compiling, since we need to build on the host and run on the target. So we need to split that into two targets: one for building the tests and one for running them. Our build of automake comes with a patch which does this automatically, so packages using the plain "make check" arrangement from automake get this for free.
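Conceptually, the split looks like the following Makefile sketch. This is illustrative only: the real targets are generated by automake with OE's patch applied, and the rule bodies below are hypothetical.

```makefile
# Illustrative split of automake's "make check" into a host-side
# build step and a target-side run step (rule bodies are hypothetical,
# not the generated automake code).
check: buildtest-TESTS
	$(MAKE) runtest-TESTS

buildtest-TESTS: $(check_PROGRAMS)   # compile the tests on the build host

runtest-TESTS:                       # later, execute the tests on the target
	./run-tests.sh
```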

Now add a do_compile_ptest function to build the test suite:

do_compile_ptest() {
    oe_runmake buildtest-TESTS
}

If the package requires special configuration actions prior to compiling the test code, create a do_configure_ptest function to do that.
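For example, such a function might look like this sketch (the generated file name and its contents are hypothetical, purely to show the shape of the function; ${S} and ${PTEST_PATH} are standard BitBake variables):

```bitbake
do_configure_ptest() {
    # Hypothetical example: generate a configuration file that the
    # test suite expects to find before it is compiled.
    echo "TESTDIR=${PTEST_PATH}" > ${S}/tests/test.conf
}
```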

Note that buildtest-TESTS requires the serial-tests automake option, whereas parallel tests are usually enabled by default. In this case, you may have to modify the configure.ac script by adding serial-tests to AM_INIT_AUTOMAKE:

AM_INIT_AUTOMAKE([serial-tests])

Recursive targets may not work out of the box either, in that case also try adding the following into configure.ac:

AM_EXTRA_RECURSIVE_TARGETS([buildtest-TESTS])

Installing the test suite

The ptest class will automatically copy the required run-ptest file and run make install-ptest if there is such a target in the top-level Makefile. For packages where this is enough, you don't need to do anything else.

If you need to do something more than the automatic make install-ptest, create a do_install_ptest function within the recipe and put the required actions there. This function will be called after make install-ptest has completed.
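As an illustration, a do_install_ptest function might copy extra test data that the Makefile's install-ptest target does not handle. The paths below are hypothetical; ${D}, ${S} and ${PTEST_PATH} are the standard BitBake variables (${PTEST_PATH} expands to /usr/lib/${PN}/ptest).

```bitbake
do_install_ptest() {
    # Hypothetical: install extra test data files into the ptest
    # directory alongside the run-ptest script.
    install -d ${D}${PTEST_PATH}/data
    install -m 0644 ${S}/tests/data/*.dat ${D}${PTEST_PATH}/data/
}
```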

Recipes with ptest Enabled

You can use the layer index search function to find out which recipes have ptest enabled by searching for recipes that inherit the ptest class:

http://layers.openembedded.org/layerindex/branch/master/recipes/?q=inherits%3Aptest

Specifically for OE-Core:

http://layers.openembedded.org/layerindex/branch/master/recipes/?q=inherits%3Aptest+layer%3Aopenembedded-core

When updating a recipe, please consider whether you can add a ptest package to it as well.

QA

For every milestone of a release, the QA team should execute ptest on real hardware and compare the results with the corresponding previous execution (for example, 2.3 M2 results should be compared with 2.3 M1 results). To better track releases and executions, an archive page was created to record all executions and show which previous results are available for comparison.

  • To check the process used to execute ptest during QA cycles, see the pTest process page.