
  • Tests are run using a test framework called Avocado.  The version known to work with the existing tests is 69.3; if you install it with yum on Boro, that is the version you should get.
  • You need to install it on all the cluster nodes that are going to be used to run tests.
  • Specifically:
    • yum install python3-pip 
    • pip3 install --upgrade pip
    • pip3 install "avocado-framework<70.0"
    • pip3 install "avocado-framework-plugin-result-html<70.0"
    • pip3 install "avocado-framework-plugin-varianter-yaml-to-mux<70.0"
    • pip install gitpython (only needed for IOR build)
    • pip install pathlib
    • pip3 install clustershell
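
           To confirm that each node ended up with the expected Avocado version (assuming avocado and clush are on your PATH, and substituting the node names you actually reserved), you can run something like:

           avocado --version
           clush -w boro-17,boro-18 'avocado --version'
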
  • DAOS can be installed from the DAOS RPM packages, or built locally by following the instructions in the quickstart.md file.
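
           If you go the RPM route, the install is roughly the following (the package names here are an assumption and may vary between releases, so check what your repository actually provides):

           yum install daos-client daos-server daos-tests
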
  • Tests are in $PREFIX/lib/daos/TESTING/ftest; in an RPM environment $PREFIX=/usr, while in a locally built environment $PREFIX=myrepo/install.  At the top level is launch.py, which simplifies running DAOS tests with Avocado.  Sub-directories contain categories of tests, e.g. the pool directory contains tests relating to pools.
  • A given group of tests is implemented as a combination of a python file (.py) and a yaml file (.yaml).  For example, in the pool directory there is a SimpleCreateDeleteTest.py and a SimpleCreateDeleteTest.yaml.  The python file contains the test code that drives the test, and the yaml file contains the test parameters.  A single function in the python file can execute a large number of test cases because it is run for different combinations of inputs as found in the yaml file; see the sketch below.
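
           As a rough, hypothetical sketch of how yaml parameters multiply into test cases (this is not the real SimpleCreateDeleteTest.yaml), the yaml-to-mux varianter turns every section tagged !mux into one variant per entry and runs the test for each combination:

           modes: !mux
             readonly:
               createmode: 73
             readwrite:
               createmode: 511
           sizes: !mux
             small:
               createsize: 1048576
             large:
               createsize: 1073741824

           With two !mux sections of two entries each, a single test method would run four times.
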
  • For tests that run on multiple servers, these two files need to be accessible by remote server nodes:
    • daos_avocado_test.yaml
    • daos_server.attach_info_tmp

           In a locally built environment, they are located in myrepo/install/tmp.

           In an RPM environment, they are located in DAOS_TEST_SHARED_DIR if this environment variable is set; otherwise they are placed in ~/daos_test by default, and that directory is created automatically if it does not exist.
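
           For example, to point the tests at a directory that is visible to all of the nodes (the path is just a placeholder):

           export DAOS_TEST_SHARED_DIR=/path/visible/to/all/nodes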

  • Before running a test you must identify the cluster nodes that are to be used.  To do this, edit the yaml file for the test you want to run ($PREFIX/lib/daos/TESTING/ftest/*/*.yaml).  In the yaml file you’ll see placeholders for machine names, e.g. boro-A.  If the test requires more than one host you’ll see boro-A, boro-B, etc. using the yaml array syntax (each array item begins with a dash).  Edit these names, replacing the A, B, … with real cluster nodes that you have reserved, e.g. boro-17.  You can also replace the whole name, e.g. boro-A becomes wolf-18.  Depending on how many tests you want to run you may need to edit a number of the yaml files; see the example below.
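
           For example, a hypothetical two-server test yaml (the exact key names vary from test to test) might look like this once the placeholders have been replaced:

           hosts:
             test_servers:
               - boro-17
               - boro-18
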
  • Tests are started with the launch.py script in the $PREFIX/lib/daos/TESTING/ftest directory along with a test tag.  Groups of tests are identified by tags.  Using SimpleCreateDeleteTest as an example again, the tests in this file are tagged with ‘simplecreate’, so to run them you would enter: ./launch.py simplecreate.  Tests are given multiple tags of increasing specificity: SimpleCreateDeleteTest carries the simplecreate tag (most specific) as well as the pool tag (least specific).  The simplecreate tag is presently associated with dozens of test cases, while running the pool tag would run all of the pool test cases, currently around 1000.  If you were to make the tragic mistake of specifying the all tag, the tests would run for at least a day.
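
           For example, from the ftest directory:

           ./launch.py simplecreate      (just the SimpleCreateDeleteTest cases)
           ./launch.py pool              (every test case tagged pool; currently around 1000 cases, a long run)
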
  • If you are running a test to reproduce a defect, the writer of the defect will provide the name of the yaml file to edit and the tag to use to run the test.



TODO:

  • Add section and instructions for generating a public/private key pair and using ssh-copy-id to install it on all machines you will be using in the test suite.
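For example (a sketch; substitute your own key type, user, and host names):
ssh-keygen -t rsa -b 4096
ssh-copy-id <user>@boro-17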
  • Add instructions on how to set up passwordless sudo for the accounts the launcher will be run under on the target hosts.
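One way to do this is a sudoers entry added with visudo (the account name is a placeholder):
<username> ALL=(ALL) NOPASSWD: ALL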
  • Add instructions, based on the distro, for which files to copy from utils/systemd into /usr/lib/systemd to allow the test harness to use systemctl.
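Something along these lines (the exact unit files and destination directory depend on the distro and the DAOS tree):
sudo cp utils/systemd/*.service /usr/lib/systemd/system/
sudo systemctl daemon-reload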
  • Move these instructions into a file doc/dev/testing.md to make it accessible to other developers.
  • Add and update examples for invoking tests as the description is no longer accurate.
  • Add instructions on modifying your PYTHONPATH for the tests to have access to the daos python bindings in the in-tree build. 
  • Add instructions on creating the /etc/daos paths that are needed, since the test harness does not use sudo on mkdir commands even though it uses sudo on cp commands.
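For example (add any sub-directories your configuration needs):
sudo mkdir -p /etc/daos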
  • Add instructions to create all of the necessary groups and users needed by the unit files, for example:
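# Run as root on each test node; each command is a no-op if the group/user already exists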
getent group daos_metrics >/dev/null || groupadd -r daos_metrics
getent group daos_server >/dev/null || groupadd -r daos_server
getent passwd daos_server >/dev/null || useradd -s /sbin/nologin -r -g daos_server -G daos_metrics daos_server
getent group daos_agent >/dev/null || groupadd -r daos_agent
getent passwd daos_agent >/dev/null || useradd -s /sbin/nologin -r -g daos_agent daos_agent