HDF5 DAOS VOL Connector

This guide provides brief instructions on how to build the DAOS VOL connector from source.

Prerequisites

  • MPI Library (make sure it is in your PATH and LD_LIBRARY_PATH)

  • cmake version >= 3.15.0

  • DAOS >= v2.0

Build HDF5 1.13.1

git clone -b hdf5-1_13_1 https://github.com/HDFGroup/hdf5.git
cd hdf5
./autogen.sh
mkdir build; cd build
../configure --prefix=/path/to/install/hdf5 --enable-parallel --enable-map-api
make -j32 install

Add HDF5 to your PATH and LD_LIBRARY_PATH:

export LD_LIBRARY_PATH=/path/to/install/hdf5/lib:$LD_LIBRARY_PATH
export PATH=/path/to/install/hdf5/bin:$PATH

Build the VOL Connector

git clone -b v1.1.0 https://github.com/HDFGroup/vol-daos
cd vol-daos; mkdir build; cd build
ccmake ..

In ccmake, press c to configure, then e to dismiss the output screen, and edit a few variables, especially if dependencies such as DAOS are not installed in default locations. In particular, review the install prefix and any variables pointing at the DAOS headers and libraries.

Once all those are set, press g to generate the build files.
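For orientation, the cache variables below are a sketch: CMAKE_INSTALL_PREFIX is a standard CMake variable, while the DAOS path variables are assumptions about the vol-daos build scripts and may be named differently on your ccmake screen.

```shell
# Standard CMake variable (always present):
#   CMAKE_INSTALL_PREFIX   where the connector will be installed
# Assumed DAOS-related variables (verify the exact names in ccmake):
#   DAOS_INCLUDE_DIR       DAOS headers, if not in a default location
#   DAOS_LIBRARY           libdaos, if not in a default location
```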

Alternatively, one can configure non-interactively with regular cmake on the command line, passing the same variables as -D options.
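A minimal sketch of a command-line configure, assuming the install path is a placeholder you replace with your own (add -D options for the DAOS paths if DAOS is not in a default location):

```shell
cd vol-daos/build
cmake -DCMAKE_INSTALL_PREFIX=/path/to/install/vol-daos \
      -DCMAKE_BUILD_TYPE=Release \
      ..
```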

Finally, build and install the connector from the build directory.
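With the default Makefile generator, the build step is the usual CMake-generated build, for example:

```shell
cd vol-daos/build
make -j8 install
```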

Using the DAOS VOL Connector

By default, the HDF5 library uses its native VOL connector. To use the DAOS VOL connector instead, one needs to set the HDF5_PLUGIN_PATH and HDF5_VOL_CONNECTOR environment variables in the client environment.
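Concretely (the install path is a placeholder; HDF5_PLUGIN_PATH and HDF5_VOL_CONNECTOR are HDF5's standard variables for dynamically loaded VOL connectors, and daos is assumed to be the connector's registered name):

```shell
# Tell HDF5 where to find the connector plugin and which connector to load
export HDF5_PLUGIN_PATH=/path/to/install/vol-daos/lib
export HDF5_VOL_CONNECTOR=daos
```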

The DAOS VOL requires a DAOS UNS-capable file system from which it can query the pool information and in which it stores the stub files whose extended attributes point to the HDF5 files (DAOS containers) that are created. This file system can be, for example, a DAOS container exported through dfuse.

To bypass the UNS and this requirement, one can use environment variables to set the DAOS pool label to use. In that case, the containers created by the DAOS VOL are not visible in a normal file-system path; one would instead need to query them by listing the containers in the pool with the daos tool. This mode is recommended for basic testing, not for production use.
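As an illustration only, the variable names below are assumptions about the DAOS VOL's environment options and the pool label is a placeholder; verify both against the documentation for your installed version:

```shell
# Skip the UNS lookup and address the pool directly (names assumed; verify)
export HDF5_DAOS_BYPASS_DUNS=1
export DAOS_POOL=mypool   # label of the DAOS pool to create containers in
```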