
TODO:

  • Add section and instructions for generating a public/private key pair and using ssh-copy-id to install it on all machines you will be using in the test suite (a starting sketch appears after this list).
  • Add instructions on how to add passwordless sudo on the accounts the launcher will be run on for the target hosts (also sketched below).
  • Add instructions, based on distro, on which files to copy from utils/systemd into /usr/lib/systemd to allow the test harness to use systemctl, and how to modify them to run as your user.
  • Move these instructions into a file doc/dev/testing.md to make them accessible to other developers.
  • Add examples for invoking single tests.
  • Add instructions on modifying your PYTHONPATH so the tests have access to the DAOS python bindings in an in-tree build.
  • Add instructions on creating the /etc/daos paths needed, since the test harness does not use sudo on mkdir commands even though it uses sudo on cp commands.
  • Add instructions to create all of the necessary groups needed by the unit files.
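
As a starting point for the first two TODO items, a minimal sketch (host names are examples; adapt them to the accounts and nodes you will actually use):

Code Block
languagebash
# Generate a key pair on the launch node (empty passphrase only for convenience on a test cluster)
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
# Install the public key on every node that will take part in the tests
for host in boro-17 boro-18; do ssh-copy-id "${USER}@${host}"; done
# On each target host, grant passwordless sudo to the account that will run the launcher
# (this step itself requires an account that already has sudo rights)
echo "${USER} ALL=(ALL) NOPASSWD: ALL" | sudo tee /etc/sudoers.d/${USER}
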
  • Tests are run using a test framework called Avocado.  The version known to work with the existing tests is 69.3; if you yum install it on Boro, that should be the version you get.
  • You need to install it on all of the cluster nodes that will be used to run tests (one way to do this across nodes is sketched after the list below).
  • Specifically:
    • yum install python3-pip 
    • pip3 install --upgrade pip
    • pip3 install "avocado-framework<70.0"
    • pip3 install "avocado-framework-plugin-result-html<70.0"
    • pip3 install "avocado-framework-plugin-varianter-yaml-to-mux<70.0"
    • pip install gitpython (only needed for IOR build)
    • pip install pathlib
    • pip3 install clustershell
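
    One way to repeat these installs across nodes, once clush is available on the launch node, is sketched below (host names are examples and assume passwordless ssh is already set up):

    Code Block
    languagebash
    # Run the same installation on every test node (adjust the node list)
    clush -w boro-17,boro-18 'sudo yum install -y python3-pip'
    clush -w boro-17,boro-18 'pip3 install --upgrade pip'
    clush -w boro-17,boro-18 'pip3 install "avocado-framework<70.0" "avocado-framework-plugin-result-html<70.0" "avocado-framework-plugin-varianter-yaml-to-mux<70.0" clustershell'
    # Confirm the expected Avocado version (69.x) is picked up on each node
    clush -w boro-17,boro-18 'avocado --version'
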
  • DAOS can be installed from the DAOS RPM packages or built locally by following the instructions in quickstart.md. To install and run tests from RPMs, see Manually Running Functional Tests from RPMs.
  • To install and run tests from a local build, see Use the local workspace for running functional tests.
  • Tests are in $PREFIX/lib/daos/TESTING/ftest.
    • In an RPM env, $PREFIX=/usr.
    • In a local build env, $PREFIX=myrepo/install.

      At the top level is launch.py, which simplifies running DAOS tests with Avocado.
  • Sub-directories contain categories of tests; e.g. the pool directory contains tests related to pools.
  • A given group of tests is implemented as a combination of a python file (.py) and a yaml file (.yaml).  For example, in the pool directory there is a simple_create_delete_test.py and a simple_create_delete_test.yaml.  The python file contains the test code that drives the test, and the yaml file contains the test parameters.  A single function in the python file can execute a large number of test cases because it is run for different combinations of inputs as found in the yaml file.
  • For tests that run on multiple servers, these two files need to be accessible by remote server nodes:
    • daos_avocado_test.yaml
    • daos_server.attach_info_tmp

           In a local build env, they are located in myrepo/install/tmp.

           In an RPM env, they are located at DAOS_TEST_SHARED_DIR if this environment variable is set; otherwise they default to ~/daos_test.  This directory will be created automatically if it does not exist.
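
           If you use DAOS_TEST_SHARED_DIR, it should point at a directory reachable from every node in the test, e.g. on a shared filesystem.  A minimal sketch (the path is just an example):

           Code Block
           languagebash
           # Example only: pick any directory that all test nodes can read and write
           export DAOS_TEST_SHARED_DIR=${HOME}/daos_test_shared
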

  • Before running a test, identify the number of cluster nodes required.  Open the yaml file for the test you want to run ($PREFIX/lib/daos/TESTING/ftest/*/*.yaml).  You’ll see placeholders for machine names, e.g. server-A.  If the test requires more than one host you’ll see server-A, server-B, etc. using the yaml array syntax (each array item begins with a dash).  Replace these placeholders with real cluster nodes you have reserved, e.g. server-A becomes boro-17.  You’ll need enough servers/clients to cover however many are specified in the yaml (one way to find and replace the placeholders is sketched below).
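
    A minimal sketch of one way to find and replace the placeholders (file and host names are examples; adjust them to the test and nodes you actually reserved):

    Code Block
    languagebash
    cd $PREFIX/lib/daos/TESTING/ftest
    # Show where the host placeholders appear in the test's yaml file
    grep -n "server-" pool/simple_create_delete_test.yaml
    # Substitute the placeholders with the nodes you have reserved
    sed -i 's/server-A/boro-17/g; s/server-B/boro-18/g' pool/simple_create_delete_test.yaml
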
  • Tests are started by running the launch.py script in the $PREFIX/lib/daos/TESTING/ftest directory with a test tag.  Groups of tests are identified by tags.

    Tests are given multiple tags of increasing specificity.  For example, tests in $PREFIX/lib/daos/TESTING/ftest/pool/simple_create_delete_test.py are tagged with all, pool, and simple_create.  The all tag is associated with all tests, the pool tag is associated with most pool-related tests (currently around 1,000 test cases), and the simple_create tag is associated with only the tests in this file.  To run the simple_create tests, execute:

    Code Block
    languagebash
    ./launch.py -ts boro-xx -tc boro-xx simple_create

    See Using launch.py for more details on running tests.

    Info

    If you were to make the tragic mistake of specifying the all tag, the tests would run for a day at least.

    If you are running a test to reproduce a defect, the writer of the defect will provide the name of the yaml file to edit and the tag to use to run the test.
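
    Before committing to a broad tag, it can help to list which tests it matches without running anything.  A sketch using Avocado's own test lister (this assumes the tag-filtering support in the 69.x releases noted above; the tag and path are examples):

    Code Block
    languagebash
    cd $PREFIX/lib/daos/TESTING/ftest
    # List the avocado tests under pool/ carrying the pool tag, without running them
    avocado list --filter-by-tags=pool pool/*.py
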



