Tims LAVA Notes in split format

Here are some notes about the LAVA system:
(need to add more of my notes from Linaro Connect 2016 (Las Vegas) here)

= target dictionary =
LAVA V2 has something called a "target dictionary", which holds values
for a board.
The LAVA developers have a model that allows multiple layers of values
to be held, with inheritance and overrides for the variables used in
actual testing.
They appear to put different types of variables into files of different
formats throughout the system, and use a database to query and manage
the variables.
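As a rough sketch of how those layers can fit together (this is not
from these notes; the template name, commands, and port numbers below
are invented): a per-board dictionary can extend a device-type template
and override individual variables.
  {# illustrative device dictionary fragment; all values are made up #}
  {% extends 'beaglebone-black.jinja2' %}
  {% set connection_command = 'telnet lab-serial-server 7001' %}
  {% set hard_reset_command = '/usr/local/bin/pdu-control reset 3' %}
  {% set power_off_command = '/usr/local/bin/pdu-control off 3' %}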

= test materials =
LAVA does not, itself, serve as a repository for test materials.
As a test framework, it "knows" how to perform actions on boards,
but it doesn't actually have the tests in its own repository.
The test to run is left as an exercise for the user.
LAVA provides a framework for writing tests that push data back to a
results database. Tests use a set of utility functions on the DUT to
set the result of a test case, attach a file to a test, or start
processes that generate additional test artifacts (monitors and other
logs); a short usage sketch follows the list of functions below.
The utility functions are:
 * lava-test-case
    * used to report a testcase result or measurement
    * used to run a program whose exit code indicates pass/fail
 * lava-test-case-attach
 * lava-test-run-attach
 * lava-background-process-start
 * lava-background-process-stop
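As a minimal sketch of how a test script on the DUT might call
lava-test-case (the test names, commands, and measurement values here
are invented, and option spellings can vary between LAVA versions):
  #!/bin/sh
  # Report a result the script determined itself
  lava-test-case kernel-booted --result pass
  # Let lava-test-case run a command and use its exit code as pass/fail
  lava-test-case cpuinfo-readable --shell cat /proc/cpuinfo
  # Report a measurement along with the result
  lava-test-case boot-time --result pass --measurement 4.2 --units seconds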
"Advanced parsing" is also available to convert log output from another program into results and measurements that can be used by LAVA. However, with V2 of LAVA, it is recommended that log parsing be done on the client with a custom script that calls lava-test-case to report results.
"Advanced parsing" is also available to convert log output from another
program into results and measurements that can be used by LAVA.
However, with V2 of LAVA, it is recommended that log parsing be done
on the client with a custom script that calls lava-test-case to report
results.
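A minimal sketch of that approach (the benchmark command and its
"test-name pass|fail" log format are assumptions made up for this
example):
  #!/bin/sh
  # Run the program under test, then convert each "name pass|fail"
  # line of its log into a LAVA test case result.
  ./run-benchmark > benchmark.log
  while read name status; do
      lava-test-case "$name" --result "$status"
  done < benchmark.log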
See https://lava.coreboot.org/static/docs/v2/lava_test_shell.html
and https://lava.coreboot.org/static/docs/v2/writing-tests.html

= security =
There is a security model which allows for tight control of test operations.
IMHO, this should be left as an exercise for Jenkins.