Issue 0026
- Summary
- add test dependencies
- Owner
- Tim
- Reporter
- Tim, AGL, Siemens
- Status
- in progress
- Priority
- very high
Description
Dependencies are important to avoid running tests that cannot succeed. This feature allows the test creator to specify a set of dependencies that are checked before a test is invoked. It is intended to be used for high-level sanity checks - that is, to specify resources or configurations that, if missing, invalidate the entire test and make it impossible to run.
Some tests check for features themselves, and abort the test or skip sub-tests if the features cannot be found; individual tests use their own mechanisms for these types of conditions.
These are checks intended to be used by the base script to validate whether a particular test even applies to a certain target (or target configuration).
The dependency checks are processed during the pre_test phase, and if the dependencies are not met, the test is aborted with a failure.
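For illustration, here is a sketch of such a check. It assumes a test_pre_check function run during the pre_test phase, a board-file variable named NETWORK_CAP, and an abort_job helper that stops the job; all of these names are illustrative rather than a settled API:

    # sketch: abort early if a required resource is missing on the target
    function test_pre_check {
        # NETWORK_CAP is a hypothetical variable from the board file
        if [ -z "$NETWORK_CAP" ] ; then
            abort_job "Target lacks networking; test cannot run"
        fi
    }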
Existing support
Fuego already supports the following mechanisms for detecting dependencies (an example follows the list):
- check_capability - check for the presence of a CAP_FEATURE environment variable
- this is used to check free-form variables (defined in the board or distro files)
- assert_define - check for the presence of an environment variable
- this is used to check for test_spec variables
- is_on_target - check for the presence of a file on the target system
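A base script might combine these mechanisms as follows. The capability name, test_spec variable, and file path are placeholders, and the exact argument forms (particularly for is_on_target) are assumptions:

    # check a free-form variable (CAP_NETWORK) from the board or distro file
    check_capability NETWORK

    # assert that a test_spec variable was defined for this test
    assert_define BENCHMARK_MYTEST_ITERATIONS

    # check that a needed program is present on the target
    # (the argument form shown here is an assumption)
    is_on_target /usr/bin/iperf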
Types of dependencies
Here are different types of dependencies that may be present (a sketch of the kernel-configuration case follows the list):
- dependency on a specific hardware feature (such as touchpad, video, audio)
- dependency on a system capability (such as networking, file system, user accounts, etc.)
- dependency on a particular kernel configuration
- these are often used as proxies for system capabilities or hardware
- dependency on a file or executable on the target
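As a sketch of the kernel-configuration case, a check could read the configuration exposed by the target. This assumes the target kernel provides /proc/config.gz, that a cmd helper runs a command on the target, and that abort_job stops the job; the function name is hypothetical:

    # sketch: verify that a kernel config option is set on the target
    function target_has_kconfig {
        local option="$1"
        # matches both =y and =m settings of the option
        cmd "zcat /proc/config.gz | grep -q '^${option}='" || \
            abort_job "Required kernel option $option is not set"
    }

    target_has_kconfig CONFIG_MD_RAID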
Outcomes
Should dependency failures result in test failures, or in some other outcome, such as "test skipped"? Is this a particular type of failure that should be reported differently from a regular test failure?
Caching dependency information
It may save time to cache the dependency information for a particular target board, if detecting the information is time-consuming. For example, the kernel configuration could be retrieved from the target once and reused across subsequent test runs.
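One possible approach, sketched here with hypothetical paths and variable names, is to store detected data in a per-board cache file and reuse it on later runs (NODE_NAME is assumed to identify the board, and cmd to run a command on the target):

    # sketch: cache the target's kernel config per board
    CACHE_FILE="/tmp/fuego-cache/${NODE_NAME}-kconfig"

    if [ ! -f "$CACHE_FILE" ] ; then
        mkdir -p "$(dirname "$CACHE_FILE")"
        # retrieving the config from the target is the slow step
        cmd "zcat /proc/config.gz" > "$CACHE_FILE"
    fi

    # later checks read the cached copy instead of querying the target
    grep -q "^CONFIG_MD_RAID=" "$CACHE_FILE"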
Industry survey
How do other systems allow for checking test dependencies?
- what does 0-day do for test dependencies?
- uses 'need_xxx' directives (which are declarative)
- the following are supported:
- need_kconfig
- examples:
- need_kconfig: CONFIG_
- need_x
- need_kernel_headers
- example: need_kernel_headers: true
- need_memory
- example: need_memory: 2G
- need_cpu
- number of CPUs (not currently used)
- what does Fuego do? (imperative)
- what does kernelci do?
- what does LTP do?
Proposal
- declare dependencies as environment variables (see the sketch after this list):
- need_memory=2G
- need_kconfig=CONFIG_MD_RAID ...
- space-separated
- need_env_vars=
- need_kernel_headers=1
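A minimal sketch of how the base script could act on these variables during pre_test follows. The conversion logic and helpers are assumptions: cmd is assumed to run a command on the target and return its output, abort_job to stop the job, and target_has_kconfig refers to the sketch in the "Types of dependencies" section above:

    # sketch: evaluate declared need_* variables before invoking the test
    if [ -n "$need_memory" ] ; then
        # convert a value like 2G into kilobytes (only a G suffix is handled)
        req_kb=$(( ${need_memory%G} * 1024 * 1024 ))
        avail_kb=$(cmd "awk '/MemTotal/ {print \$2}' /proc/meminfo")
        [ "$avail_kb" -ge "$req_kb" ] || \
            abort_job "Target has less than $need_memory of memory"
    fi

    if [ -n "$need_kconfig" ] ; then
        # need_kconfig holds a space-separated list of config options
        for option in $need_kconfig ; do
            target_has_kconfig "$option"
        done
    fi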
Notes
This was originally suggested in July of 2016 (probably in discussions at LinuxCon Japan).