.. SPDX-License-Identifier: CC-BY-SA-2.0-UK

*****************************************
The Yocto Project Test Environment Manual
*****************************************

Welcome
=======

Welcome to the Yocto Project Test Environment Manual! This manual is a
work in progress. It contains information about the testing environment
used by the Yocto Project to make sure each major and minor release
works as intended. All of the project's testing infrastructure and
processes are publicly visible and available, so that the community can
see what testing is being performed, how it is being done, and the
current status of the tests and the project at any given time. Other
organizations can leverage the process and testing environment used by
the Yocto Project to create their own automated, production test
environments, building upon the foundations from the project core.

This manual is being initially loaded with information from the README
files and notes from key engineers:

-  *yocto-autobuilder2:* This
   :yocto_git:`README.md </yocto-autobuilder2/tree/README.md>`
   is the main README which details how to set up the Yocto Project
   Autobuilder. The ``yocto-autobuilder2`` repository represents the
   Yocto Project's console UI plugin to Buildbot and the configuration
   necessary to configure Buildbot to perform the testing the project
   requires.

-  *yocto-autobuilder-helper:* This :yocto_git:`README </yocto-autobuilder-helper/tree/README/>`
   and repository contain the Yocto Project Autobuilder Helper scripts
   and configuration. The ``yocto-autobuilder-helper`` repository
   contains the "glue" logic that defines which tests to run and how to
   run them. As a result, it can be used by any Continuous Integration
   (CI) system to run builds, fetch the correct code revisions,
   configure builds and layers, and collect results. The code is
   independent of any CI system, which means it can work with
   `Buildbot <https://docs.buildbot.net/current/>`__, Jenkins, or
   others. This repository has a branch per release of the project,
   defining the tests to run on a per-release basis.
Yocto Project Autobuilder Overview
==================================

The Yocto Project Autobuilder collectively refers to the software,
tools, scripts, and procedures used by the Yocto Project to test
released software across supported hardware in an automated and regular
fashion. Basically, during the development of a Yocto Project release,
the Autobuilder tests if things work. The Autobuilder builds all test
targets and runs all the tests.

The Yocto Project now uses standard upstream
Buildbot (`version 3.8 <https://docs.buildbot.net/3.8.0/>`__) to
drive its integration and testing. Buildbot has a plug-in interface
that the Yocto Project customizes using code from the
``yocto-autobuilder2`` repository, adding its own console UI plugin. The
resulting UI plug-in allows you to visualize builds in a way suited to
the project's needs.

A ``helper`` layer provides configuration and job management through
scripts found in the ``yocto-autobuilder-helper`` repository. The
``helper`` layer contains the bulk of the build configuration
information and is release-specific, which makes it highly customizable
on a per-project basis. The layer is CI system-agnostic and contains a
number of helper scripts that can generate build configurations from
simple JSON files.
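
As a sketch of the idea, generating per-target build configurations from
a shared JSON description might look like the following. Note that the
JSON shown here is a made-up illustration, not the actual
``yocto-autobuilder-helper`` schema:

```python
import json

# Hypothetical configuration -- NOT the real yocto-autobuilder-helper
# schema, just an illustration of deriving build configurations from JSON.
CONFIG_JSON = """
{
    "defaults": {"DISTRO": "poky", "IMAGE": "core-image-sato"},
    "targets": {
        "qemux86-64": {},
        "qemuarm64": {"IMAGE": "core-image-minimal"}
    }
}
"""

def generate_configs(raw_json):
    """Merge per-target overrides onto the shared defaults."""
    config = json.loads(raw_json)
    builds = {}
    for machine, overrides in config["targets"].items():
        settings = dict(config["defaults"])  # start from the defaults
        settings.update(overrides)           # apply target-specific overrides
        settings["MACHINE"] = machine
        builds[machine] = settings
    return builds

if __name__ == "__main__":
    for machine, settings in generate_configs(CONFIG_JSON).items():
        print(machine, settings["DISTRO"], settings["IMAGE"])
```

Keeping shared defaults in one place and letting each target override only
what differs is what keeps hundreds of build configurations manageable.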

.. note::

   The project uses Buildbot for historical reasons but also because
   many of the project developers have knowledge of Python. It is
   possible to use the outer layers from another Continuous Integration
   (CI) system such as :wikipedia:`Jenkins <Jenkins_(software)>`
   instead of Buildbot.

The following figure shows the Yocto Project Autobuilder stack with a
topology that includes a controller and a cluster of workers:

.. image:: figures/ab-test-cluster.png
   :align: center
   :width: 70%
Yocto Project Tests --- Types of Testing Overview
=================================================

The Autobuilder tests different elements of the project by using
the following types of tests:

-  *Build Testing:* Tests whether specific configurations build by
   varying :term:`MACHINE`,
   :term:`DISTRO`, other configuration
   options, and the specific target images being built (or ``world``). This is
   used to trigger builds of all the different test configurations on the
   Autobuilder. Builds usually cover many different targets for
   different architectures, machines, and distributions, as well as
   different configurations, such as different init systems. The
   Autobuilder tests literally hundreds of configurations and targets.

   -  *Sanity Checks During the Build Process:* Tests initiated through the
      :ref:`ref-classes-insane` class. These checks ensure the output of the
      builds is correct. For example, does the ELF architecture in the
      generated binaries match the target system? ARM binaries would not work
      on a MIPS system!

-  *Build Performance Testing:* Tests whether commonly used steps
   during builds work efficiently and avoid regressions. Tests that time
   commonly used usage scenarios are run through ``oe-build-perf-test``.
   These tests are run on isolated machines so that the time
   measurements of the tests are accurate and no other processes
   interfere with the timing results. The project currently tests
   performance on two different distributions, Fedora and Ubuntu, to
   ensure there is no single point of failure and that the different
   distributions work effectively.

-  *eSDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdkext

   The tests use the :ref:`ref-classes-testsdk` class and the
   ``do_testsdkext`` task.

-  *Feature Testing:* Various scenario-based tests are run through the
   :ref:`OpenEmbedded Self test (oe-selftest) <ref-manual/release-process:Testing and Quality Assurance>`. We test oe-selftest on each of the main distributions
   we support.

-  *Image Testing:* Image tests initiated through the following command::

      $ bitbake image -c testimage

   The tests use the :ref:`ref-classes-testimage`
   class and the :ref:`ref-tasks-testimage` task.

-  *Layer Testing:* The Autobuilder can test whether specific layers
   work with the rest of the system. The layers tested may be selected
   by members of the project. Some key community layers are also tested
   periodically.

-  *Package Testing:* A Package Test (ptest) runs tests against packages
   built by the OpenEmbedded build system on the target machine. See the
   :ref:`Testing Packages With
   ptest <dev-manual/packages:Testing Packages With ptest>` section
   in the Yocto Project Development Tasks Manual and the
   ":yocto_wiki:`Ptest </Ptest>`" Wiki page for more
   information on ptest.

-  *SDK Testing:* Image tests initiated through the following command::

      $ bitbake image -c testsdk

   The tests use the :ref:`ref-classes-testsdk` class and
   the ``do_testsdk`` task.

-  *Unit Testing:* Unit tests on various components of the system run
   through :ref:`bitbake-selftest <ref-manual/release-process:Testing and Quality Assurance>` and
   :ref:`oe-selftest <ref-manual/release-process:Testing and Quality Assurance>`.

-  *Automatic Upgrade Helper:* This target tests whether new versions of
   software are available and whether we can automatically upgrade to
   those new versions. If so, this target emails the maintainers with a
   patch to let them know this is possible.
How Tests Map to Areas of Code
==============================

Tests map into the codebase as follows:

-  *bitbake-selftest:*

   These tests are self-contained and test BitBake as well as its APIs,
   which include the fetchers. The tests are located in
   ``bitbake/lib/*/tests``.

   Some of these tests run the ``bitbake`` command, so ``bitbake/bin``
   must be added to the ``PATH`` before running ``bitbake-selftest``.
   From within the BitBake repository, run the following::

      $ export PATH=$PWD/bin:$PATH

   After that, you can run the selftest script::

      $ bitbake-selftest

   The default output is quiet and just prints a summary of what was
   run. To see more information, there is a verbose option::

      $ bitbake-selftest -v

   To skip tests that access the Internet, use the ``BB_SKIP_NETTESTS``
   variable when running ``bitbake-selftest`` as follows::

      $ BB_SKIP_NETTESTS=yes bitbake-selftest

   Use this option when you wish to skip tests that access the network,
   which are mostly necessary to test the fetcher modules. To specify
   individual test modules to run, append the test module name to the
   ``bitbake-selftest`` command. For example, to specify the tests for
   ``bb.tests.data.DataExpansions``, run::

      $ bitbake-selftest bb.tests.data.DataExpansions

   You can also specify individual tests by defining the full name and module
   plus the class path of the test, for example::

      $ bitbake-selftest bb.tests.data.DataExpansions.test_one_var

   The tests are based on
   `Python unittest <https://docs.python.org/3/library/unittest.html>`__.

-  *oe-selftest:*

   -  These tests use OE to test the workflows, which include testing
      specific features, behaviors of tasks, and API unit tests.

   -  The tests can take advantage of parallelism through the ``-j``
      option, which can specify a number of threads to spread the tests
      across. Note that all tests from a given class of tests will run
      in the same thread. To parallelize large numbers of tests, you can
      split the class into multiple units.

   -  The tests are based on
      `Python unittest <https://docs.python.org/3/library/unittest.html>`__.

   -  The code for the tests resides in
      ``meta/lib/oeqa/selftest/cases/``.

   -  To run all the tests, enter the following command::

         $ oe-selftest -a

   -  To run a specific test, use the following command form where
      ``testname`` is the name of the specific test::

         $ oe-selftest -r <testname>

      For example, the following command would run the ``tinfoil``
      ``getVar`` API test::

         $ oe-selftest -r tinfoil.TinfoilTests.test_getvar

      It is also possible to run a set
      of tests. For example, the following command will run all of the
      ``tinfoil`` tests::

         $ oe-selftest -r tinfoil

-  *testimage:*

   -  These tests build an image, boot it, and run tests against the
      image's content.

   -  The code for these tests resides in ``meta/lib/oeqa/runtime/cases/``.

   -  You need to set the :term:`IMAGE_CLASSES` variable as follows::

         IMAGE_CLASSES += "testimage"

   -  Run the tests using the following command form::

         $ bitbake image -c testimage

-  *testsdk:*

   -  These tests build an SDK, install it, and then run tests against
      that SDK.

   -  The code for these tests resides in ``meta/lib/oeqa/sdk/cases/``.

   -  Run the tests using the following command form::

         $ bitbake image -c testsdk

-  *testsdk_ext:*

   -  These tests build an extended SDK (eSDK), install that eSDK, and
      run tests against the eSDK.

   -  The code for these tests resides in ``meta/lib/oeqa/sdkext/cases/``.

   -  To run the tests, use the following command form::

         $ bitbake image -c testsdkext

-  *oe-build-perf-test:*

   -  These tests run through commonly used usage scenarios and measure
      the performance times.

   -  The code for these tests resides in ``meta/lib/oeqa/buildperf``.

   -  To run the tests, use the following command form::

         $ oe-build-perf-test <options>

      The command takes a number of options,
      such as where to place the test results. The Autobuilder Helper
      Scripts include the ``build-perf-test-wrapper`` script with
      examples of how to use ``oe-build-perf-test`` from the command
      line.

      Use the ``oe-git-archive`` command to store test results into a
      Git repository.

      Use the ``oe-build-perf-report`` command to generate text reports
      and HTML reports with graphs of the performance data. See
      :yocto_dl:`html </releases/yocto/yocto-4.3/testresults/buildperf-debian11/perf-debian11_nanbield_20231019191258_15b576c410.html>`
      and
      :yocto_dl:`txt </releases/yocto/yocto-4.3/testresults/buildperf-debian11/perf-debian11_nanbield_20231019191258_15b576c410.txt>`
      examples.

   -  The tests are contained in ``meta/lib/oeqa/buildperf/test_basic.py``.
Test Examples
=============

This section provides example tests for each of the tests listed in the
:ref:`test-manual/intro:How Tests Map to Areas of Code` section.

-  ``oe-selftest`` testcases reside in the ``meta/lib/oeqa/selftest/cases`` directory.

-  ``bitbake-selftest`` testcases reside in the ``bitbake/lib/bb/tests/`` directory.

``bitbake-selftest``
--------------------

A simple test example from ``bitbake/lib/bb/tests/data.py`` is::

   class DataExpansions(unittest.TestCase):
       def setUp(self):
           self.d = bb.data.init()
           self.d["foo"] = "value_of_foo"
           self.d["bar"] = "value_of_bar"
           self.d["value_of_foo"] = "value_of_'value_of_foo'"

       def test_one_var(self):
           val = self.d.expand("${foo}")
           self.assertEqual(str(val), "value_of_foo")

In this example, a ``DataExpansions`` class of tests is created, derived from
standard `Python unittest <https://docs.python.org/3/library/unittest.html>`__.
The class has a common ``setUp`` function which is shared by all the tests in
the class. A simple test is then added to check that when a variable is
expanded, the correct value is found.

BitBake selftests are straightforward
`Python unittest <https://docs.python.org/3/library/unittest.html>`__ cases.
Refer to the `Python unittest documentation
<https://docs.python.org/3/library/unittest.html>`__ for additional information
on writing such tests.
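
To experiment with this shape outside of a BitBake tree, the same layout
can be reproduced with only the standard library. In the sketch below,
``string.Template`` stands in for the BitBake datastore; this is an
illustration of the test structure, not BitBake code:

```python
import string
import unittest

class TemplateExpansions(unittest.TestCase):
    """Mirrors the DataExpansions layout: a shared setUp() fixture plus
    small, focused test methods."""

    def setUp(self):
        # Rebuilt before every test in the class
        self.vals = {"foo": "value_of_foo", "bar": "value_of_bar"}

    def test_one_var(self):
        val = string.Template("${foo}").substitute(self.vals)
        self.assertEqual(val, "value_of_foo")

    def test_two_vars(self):
        val = string.Template("${foo}:${bar}").substitute(self.vals)
        self.assertEqual(val, "value_of_foo:value_of_bar")

# Run the class directly, much as bitbake-selftest runs its test modules
suite = unittest.TestLoader().loadTestsFromTestCase(TemplateExpansions)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Because ``setUp()`` runs before every test method, each test starts from a
clean fixture, which is exactly the property the BitBake selftests rely on.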

``oe-selftest``
---------------

These tests are more complex due to the setup required behind the scenes
for full builds. Rather than directly using `Python unittest
<https://docs.python.org/3/library/unittest.html>`__, the code
wraps most of the standard objects. The tests can be simple, such as
testing a command from within the OE build environment using the
following example::

   class BitbakeLayers(OESelftestTestCase):
       def test_bitbakelayers_showcrossdepends(self):
           result = runCmd('bitbake-layers show-cross-depends')
           self.assertTrue('aspell' in result.output,
                           msg="No dependencies were shown. bitbake-layers show-cross-depends output: %s" % result.output)

This example, taken from ``meta/lib/oeqa/selftest/cases/bblayers.py``,
creates a testcase from the ``OESelftestTestCase`` class, derived
from ``unittest.TestCase``, which runs the ``bitbake-layers`` command
and checks the output to ensure it contains something we know should be
there.

The ``oeqa.utils.commands`` module contains helpers which can assist
with common tasks, including:

-  *Obtaining the value of a BitBake variable:* Use
   ``oeqa.utils.commands.get_bb_var()``, or use
   ``oeqa.utils.commands.get_bb_vars()`` for more than one variable.

-  *Running a BitBake invocation for a build:* Use
   ``oeqa.utils.commands.bitbake()``.

-  *Running a command:* Use ``oeqa.utils.commands.runCmd()``.

There is also an ``oeqa.utils.commands.runqemu()`` function for launching
the ``runqemu`` command for testing things within a running, virtualized
image.

You can run these tests in parallel. Parallelism works per test class,
so tests within a given test class should always run in the same build,
while tests in different classes or modules may be split into different
builds. There is no datastore available for these tests since the tests
launch the ``bitbake`` command and exist outside of its context. As a
result, common BitBake library functions (``bb.*``) are also unavailable.
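
Conceptually, running a command from a selftest amounts to launching a
subprocess and capturing its exit status and output. The stand-alone
stand-in below illustrates that idea; it is not the project's actual
``runCmd()`` helper:

```python
import subprocess
from collections import namedtuple

# Minimal illustrative stand-in for a runCmd()-style helper: oe-selftest
# cases run external commands (such as ``bitbake``) and inspect their
# output, rather than calling BitBake library code directly.
Result = namedtuple("Result", ["status", "output"])

def run_cmd(command):
    """Run a shell command, capturing exit status and combined output."""
    proc = subprocess.run(command, shell=True, text=True,
                          stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return Result(proc.returncode, proc.stdout)

result = run_cmd("echo hello-selftest")
assert result.status == 0
```

Returning the status and output together lets a test make assertions on
both, as the ``BitbakeLayers`` example above does with ``result.output``.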

``testimage``
-------------

These tests are run once an image is up and running, either on target
hardware or under QEMU. As a result, they are assumed to be running in a
target image environment, as opposed to in a host build environment. A
simple example from ``meta/lib/oeqa/runtime/cases/python.py`` contains
the following::

   class PythonTest(OERuntimeTestCase):
       @OETestDepends(['ssh.SSHTest.test_ssh'])
       @OEHasPackage(['python3-core'])
       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           status, output = self.target.run(cmd)
           msg = 'Exit status was not 0. Output: %s' % output
           self.assertEqual(status, 0, msg=msg)

In this example, the ``OERuntimeTestCase`` class wraps
``unittest.TestCase``. Within the test, ``self.target`` represents the
target system, where commands can be run using the ``run()``
method.

To ensure certain test or package dependencies are met, you can use the
``OETestDepends`` and ``OEHasPackage`` decorators. For example, the test
in this example would only make sense if ``python3-core`` is installed in
the image.
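
This gating idea maps directly onto standard ``unittest`` skip
decorators. The sketch below is a conceptual analogue, not the
``OEHasPackage`` implementation: it skips a test when a prerequisite is
missing, checking for a host tool instead of a target package:

```python
import shutil
import unittest

def has_tool(name):
    """Stand-in prerequisite check: is a host tool on PATH?"""
    return shutil.which(name) is not None

class ToolTest(unittest.TestCase):
    @unittest.skipUnless(has_tool("sh"), "sh not available")
    def test_shell_present(self):
        self.assertTrue(has_tool("sh"))

    @unittest.skipUnless(has_tool("no-such-tool-xyz"), "tool not available")
    def test_skipped_when_missing(self):
        self.fail("never reached when the prerequisite is missing")

suite = unittest.TestLoader().loadTestsFromTestCase(ToolTest)
outcome = unittest.TextTestRunner(verbosity=0).run(suite)
```

Skipping rather than failing keeps results meaningful across images with
different package sets: an absent prerequisite is reported as a skip, not
a regression.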

``testsdk_ext``
---------------

These tests are run against built extensible SDKs (eSDKs). The tests can
assume that the eSDK environment has already been set up. An example from
``meta/lib/oeqa/sdkext/cases/devtool.py`` contains the following::

   class DevtoolTest(OESDKExtTestCase):
       @classmethod
       def setUpClass(cls):
           myapp_src = os.path.join(cls.tc.esdk_files_dir, "myapp")
           cls.myapp_dst = os.path.join(cls.tc.sdk_dir, "myapp")
           shutil.copytree(myapp_src, cls.myapp_dst)
           subprocess.check_output(['git', 'init', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'add', '.'], cwd=cls.myapp_dst)
           subprocess.check_output(['git', 'commit', '-m', "'test commit'"], cwd=cls.myapp_dst)

       @classmethod
       def tearDownClass(cls):
           shutil.rmtree(cls.myapp_dst)

       def _test_devtool_build(self, directory):
           self._run('devtool add myapp %s' % directory)
           try:
               self._run('devtool build myapp')
           finally:
               self._run('devtool reset myapp')

       def test_devtool_build_make(self):
           self._test_devtool_build(self.myapp_dst)

In this example, the ``devtool``
command is tested to see whether a sample application can be built with
the ``devtool build`` command within the eSDK.

``testsdk``
-----------

These tests are run against built SDKs. The tests can assume that an SDK
has already been extracted and its environment file has been sourced. A
simple example from ``meta/lib/oeqa/sdk/cases/python.py`` contains the
following::

   class Python3Test(OESDKTestCase):
       def setUp(self):
           if not (self.tc.hasHostPackage("nativesdk-python3-core") or
                   self.tc.hasHostPackage("python3-core-native")):
               raise unittest.SkipTest("No python3 package in the SDK")

       def test_python3(self):
           cmd = "python3 -c \"import codecs; print(codecs.encode('Uryyb, jbeyq', 'rot13'))\""
           output = self._run(cmd)
           self.assertEqual(output, "Hello, world\n")

In this example, if ``nativesdk-python3-core`` has been installed into the SDK,
the code runs the ``python3`` interpreter with a basic command to check it is
working correctly. The test only runs if Python 3 is installed in the SDK.

``oe-build-perf-test``
----------------------

The performance tests usually measure how long operations take and the
resource utilization as that happens. An example from
``meta/lib/oeqa/buildperf/test_basic.py`` contains the following::

   class Test3(BuildPerfTestCase):
       def test3(self):
           """Bitbake parsing (bitbake -p)"""
           # Drop all caches and parse
           self.rm_cache()
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_1',
                                      'bitbake -p (no caches)')
           # Drop tmp/cache
           oe.path.remove(os.path.join(self.bb_vars['TMPDIR'], 'cache'), True)
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_2',
                                      'bitbake -p (no tmp/cache)')
           # Parse with fully cached data
           self.measure_cmd_resources(['bitbake', '-p'], 'parse_3',
                                      'bitbake -p (cached)')

This example shows how three specific parsing timings are
measured, with and without various caches, to show how BitBake's parsing
performance trends over time.
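
The measuring pattern itself is easy to reproduce: time an external
command under different conditions and record each measurement under a
name. The sketch below is a simplified stand-in for
``measure_cmd_resources()``, which in the real framework also records
system resource usage; here only wall-clock time is measured:

```python
import subprocess
import sys
import time

def measure_cmd(cmd, name):
    """Run a command and report its wall-clock duration in seconds."""
    start = time.perf_counter()
    subprocess.run(cmd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    elapsed = time.perf_counter() - start
    print("%s: %.3f s" % (name, elapsed))
    return elapsed

# Time the same operation twice, mirroring the cold/warm cache comparison
timings = {
    "run_1": measure_cmd([sys.executable, "-c", "pass"], "run_1"),
    "run_2": measure_cmd([sys.executable, "-c", "pass"], "run_2"),
}
```

Naming each measurement (``parse_1``, ``parse_2``, ...) is what allows the
reporting tools to plot the same measurement across many builds and spot
regressions in the trend.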

Considerations When Writing Tests
=================================

When writing good tests, there are several things to keep in mind. Since
resources on the Autobuilder are shared among multiple concurrent
workers, consider the following:

**Running "cleanall" is not permitted.**

This can delete files from :term:`DL_DIR` which would potentially break other
builds running in parallel. If this is required, :term:`DL_DIR` must be set to
an isolated directory.

**Running "cleansstate" is not permitted.**

This can delete files from :term:`SSTATE_DIR` which would potentially break
other builds running in parallel. If this is required, :term:`SSTATE_DIR` must
be set to an isolated directory. Alternatively, you can use the ``-f``
option with the ``bitbake`` command to "taint" tasks by changing the
sstate checksums to ensure sstate cache items will not be reused.

**Tests should not change the metadata.**

This is particularly true for oe-selftests since these can run in
parallel and changing metadata leads to changing checksums, which
confuses BitBake while running in parallel. If this is necessary, copy
layers to a temporary location and modify them. Some tests need to
change metadata, such as the devtool tests. To protect the metadata from
changes, set up temporary copies of that data first.
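
The "copy before you modify" pattern can be as simple as cloning the
layer into a scratch directory and pointing the test at the copy. The
sketch below is illustrative, with a stand-in layer created on the fly:

```python
import os
import shutil
import tempfile

# Tests that must change layer metadata should work on a disposable copy,
# never the shared checkout, so parallel builds see unchanged checksums.
def make_scratch_copy(layer_dir):
    """Copy a layer into a temporary directory and return the new path."""
    scratch = tempfile.mkdtemp(prefix="layer-copy-")
    dst = os.path.join(scratch, os.path.basename(layer_dir))
    shutil.copytree(layer_dir, dst)
    return dst

# Demonstrate with a stand-in "layer" containing one file
src = tempfile.mkdtemp(prefix="layer-src-")
with open(os.path.join(src, "layer.conf"), "w") as f:
    f.write("BBFILE_COLLECTIONS += \"demo\"\n")

copy = make_scratch_copy(src)
# Edits to the copy leave the original untouched
with open(os.path.join(copy, "layer.conf"), "a") as f:
    f.write("# temporary change for the test\n")
```

Cleaning up the scratch directory afterwards (for example in
``tearDownClass``, as the ``DevtoolTest`` example above does) keeps
repeated runs from accumulating copies.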