Fuego release 1.0.9-notes

Here is a list of problems encountered during the 1.0.9 release testing:

Problems

general

  • /fuego-core/engine/scripts/ftc-unit-test.sh
    • can't download clitest (see the sketch after this list)
    • uses /userdata instead of /fuego-ro for accessing board files
  • had to create testplan_fuego (for fuego-specific tests)
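
A minimal sketch of a download fallback for the clitest problem above; CLITEST_URL is a placeholder for wherever clitest should be fetched from, and none of this is taken from the actual ftc-unit-test.sh:

    # Hypothetical helper: make sure clitest is available before running the unit tests.
    # CLITEST_URL is a placeholder, not a variable used by ftc-unit-test.sh.
    : "${CLITEST_URL:?set CLITEST_URL to the location of the clitest script}"
    if ! command -v clitest >/dev/null 2>&1; then
        curl -fsSL -o /tmp/clitest "$CLITEST_URL" || wget -O /tmp/clitest "$CLITEST_URL"
        chmod +x /tmp/clitest
        export PATH=/tmp:$PATH
    fi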

Functional.fuego_board_check

  • Jenkins jobs have the wrong TESTNAME
    • e.g. docker.testplan_fuego.Functional.fuego_board_check has "TESTNAME=fuego_board_check" instead of "TESTNAME=fuego_test"
      • fuego-create-jobs doesn't read the test.yaml file
    • see if fuego-create-jobs uses the test name for the base script name
  • (worked around) can't find test with default plan in spec list
    • code is missing my patches for ignoring 'testplan_default'
    • you get an error from ovgen.py
  • docker board is missing reboot, according to Functional.fuego_board_check
    • scan_form_items.sh reports HAS_REBOOT=0
      • but docker has /sbin/reboot
    • the test is run as user 'jenkins', which doesn't have /sbin in its PATH
    • Functional.fuego_board_check is correct! (see the sketch after this list)
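
For reference, a PATH-independent way to probe for a reboot binary; this is an illustrative sketch, not the actual scan_form_items.sh logic (only the HAS_REBOOT name is taken from the report above):

    # Illustrative only: look for reboot in the usual system directories as well
    # as in PATH, so a restricted PATH (e.g. the jenkins user's) doesn't hide /sbin/reboot.
    HAS_REBOOT=0
    for cmd in /sbin/reboot /usr/sbin/reboot /bin/reboot; do
        [ -x "$cmd" ] && HAS_REBOOT=1
    done
    command -v reboot >/dev/null 2>&1 && HAS_REBOOT=1
    echo "HAS_REBOOT=$HAS_REBOOT"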

Benchmark.fuego_check_plots

  • (fixed) parser.py gets an exception
    • it's still using FUEGO_PARSER_PATH instead of the FUEGO_CORE environment variable
  • plot.png is empty
    • missing metrics.json file
    • multiplot mapping requires a specific metric name pattern (a leading item with the same name, to chart the data in a single chart?)
      • plotting is VERY fragile!!
        • no one can debug this stuff
        • matplotlib producing empty graphs is hard to debug (a sanity-check sketch follows this list)
  • FIXTHIS - should make the plot API much easier to use, or at least document the rules better
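
Since an empty plot.png produces no error message, a quick sanity check on the data feeding the plot can save debugging time. This is an illustrative aid, not part of Fuego; the log directory is passed in as an argument:

    # Illustrative debugging aid: confirm the metric data exists before blaming
    # matplotlib. LOGDIR is whatever directory holds the test's metrics.json and plot.png.
    LOGDIR=${1:-.}
    if [ ! -s "$LOGDIR/metrics.json" ]; then
        echo "metrics.json is missing or empty - plot.png will come out blank"
        exit 1
    fi
    if [ -f "$LOGDIR/plot.png" ] && [ "$(wc -c < "$LOGDIR/plot.png")" -lt 1024 ]; then
        echo "plot.png is suspiciously small - check the metric name mapping"
    fi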

priorities for 1.1 release

^ item ^ priority ^ notes ^
| add node and job scripts to ftc | high | |
| automatically create a view for each board | medium | use a command line option on fuego-create-node (--make_view) |
| eliminate DISTRIB | medium | automatically detect, and use board vars |
| ensure logs are linked even for failed tests | high | if a log exists, there should be a link, failure or not |
| fix ftc device dictionary | defer | used for board check and pre-populating options |
| handle set -e differently | medium | |
| handle timeouts automatically | medium | should be in test and board, not in plan - but depends on spec, transport, build time, etc. |
| have test do description-setting | medium | |
| move /fuego-ro/conf/boards to /fuego-ro/boards | medium | |
| ov_transport_connect/ov_transport_disconnect | medium | sounds like Jan-Simon will do it |
| put post_test process kill in test_cleanup | high | need an optional test_cleanup that calls kill_procs, which calls ov_rootfs_kill |
| rename all scripts to fuego_test.sh | low | |
| support ftc run-test | low | BUILD_TIMESTAMP won't work with a separate post_test, but we're not doing that |
| support phase reordering | defer | |
| support separate build phase | defer | don't want to hold the board reserved during the build, for LAVA integration and better timeout handling |
| testing by Daniel | high | not available until next week |
| testing by Jan-Simon | high | not available until next week |
| use spec name instead of plan name in job name | medium | use <board>.<spec>.<testname> |
| web file transport | defer | could be useful for some interesting target cases |

post_test work

List of tests that have post_test kill arguments (a test_cleanup sketch follows this list):
  • Benchmark.netpipe: kills iperf (should be NPtcp)
    • netpipe test doesn't use benchmark.sh!! (there is no call to post_test)
  • Functional.LTP.Open_Posix: kills run-posix-option-group-tst.sh run-tests.sh run-test
    • this test is not in the 'next' branch
  • Functional.netperf: kills netperf
    • netperf doesn't use functional.sh!! (there is no call to post_test)
  • Functional.LTP.Filesystem: kills run-test
    • this test is not in the 'next' branch
  • (OK) Benchmark.fio: kills fio
  • Benchmark.OpenSSL: kills openssl
    • OpenSSL doesn't use benchmark.sh (there is no call to post_test)
  • (OK) Benchmark.lmbench2: kills lmbench lat_mem_rd par_mem
  • (OK) Benchmark.IOzone: kills iozone
  • (OK) Benchmark.iperf: kills iperf
  • Benchmark.Java: kills java
    • the kill was put in test_cleanup, but test_cleanup didn't call kill_procs
    • there could be other java programs running - the kill string is too generic
  • (OK) Benchmark.Interbench: kills interbench
  • (OK) Benchmark.gtkperf: kills gtkperf
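
Here is a minimal sketch of the test_cleanup pattern these items should converge on, assuming kill_procs takes the names of the processes to kill as arguments (the interface is inferred from the notes above, not confirmed against the code):

    # Hypothetical fuego_test.sh fragment: an optional test_cleanup that delegates
    # the process kill to kill_procs (which in turn is expected to use ov_rootfs_kill).
    # The process name is the one from the Benchmark.netpipe entry above.
    function test_cleanup {
        kill_procs NPtcp
    }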

fix tests that need to call post_test

  • Benchmark.netpipe
  • Functional.netperf
  • Benchmark.OpenSSL
