
HDF5 DAOS VOL connector


Table of Contents

  1. Description
  2. Installation
  3. Testing and Usage
  4. More information

1. Description

The HDF5 DAOS VOL connector is a Virtual Object Layer (VOL) connector for HDF5 that interfaces directly with the Distributed Asynchronous Object Storage (DAOS) system. By bypassing both MPI I/O and POSIX, it provides efficient and scalable I/O, removes the limitations of the native HDF5 file format, and enables new features such as independent creation of objects in parallel, key-value store objects, data recovery, and asynchronous I/O.

Applications that already use HDF5 can benefit from some of these features with minimal code changes by using this VOL connector on a DAOS-enabled system. The connector is built as a plugin library external to HDF5, meaning it must be dynamically loaded in the same fashion as HDF5 filter plugins.
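As a sketch of how such dynamic loading is typically set up, the connector library directory can be added to HDF5's plugin search path and the connector selected as the default VOL via environment variables (the installation path below is a hypothetical placeholder):

```shell
# Tell HDF5 where to find the dynamically loaded connector library
# (hypothetical path; use your actual install location).
export HDF5_PLUGIN_PATH=/opt/vol-daos/lib
# Select the DAOS VOL connector as the default VOL connector.
export HDF5_VOL_CONNECTOR=daos
```

With these variables set, an unmodified HDF5 application can pick up the connector at runtime instead of being linked against it explicitly.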

2. Installation

Below is a set of instructions for a minimal installation of the DAOS VOL connector on a DAOS-enabled system.

Prerequisites

To build the DAOS VOL connector, the following libraries are required:

  • libhdf5 - The HDF5 library. The minimum version required is 1.14.0, compiled with support for both parallel I/O and map objects (i.e., built with the -DHDF5_ENABLE_MAP_API=ON CMake option).

  • libdaos - The DAOS library. Minimum version required is 1.3.106-tb.

  • libuuid - UUID support.

Compiled libraries must either exist in the system's library paths or must be pointed to during the DAOS VOL connector build process.

Build instructions

The HDF5 DAOS VOL connector is built using CMake. CMake version 2.8.12.2 or greater is required for building the connector itself, but version 3.1 or greater is required to build the connector's tests.

If you are installing from a full source release, put the tarball in a directory where you have write permissions (e.g., your home directory) and unpack it:

gzip -cd hdf5_vol_daos-X.tar.gz | tar xvf -

or

bzip2 -dc hdf5_vol_daos-X.tar.bz2 | tar xvf -

Replace 'X' with the version number of the package.

After obtaining the connector's source code, you can create a build directory within the source tree and run the ccmake or cmake command from it:

cd hdf5_vol_daos-X
mkdir build
cd build
ccmake ..

If using ccmake, type 'c' multiple times and choose suitable options; if using cmake, pass these options with -D. Some of these options may be needed if, for example, the required components mentioned previously are not located in default paths.

Setting include directory and library paths may require you to toggle to the advanced mode by typing 't'. Once you are done and do not see any errors, type 'g' to generate makefiles. Once you exit the CMake configuration screen and are ready to build the targets, do:

make

Verbose build output can be generated by appending VERBOSE=1 to the make command.

Assuming that the CMAKE_INSTALL_PREFIX has been set and that you have write permissions to the destination directory, you can install the connector by simply doing:

make install

CMake options

  • CMAKE_INSTALL_PREFIX - This option controls the install directory that the resulting output files are written to. The default value is /usr/local.
  • CMAKE_BUILD_TYPE - This option controls the type of build used for the VOL connector. Valid values are Release, Debug, RelWithDebInfo, MinSizeRel, Ubsan, Asan; the default build type is RelWithDebInfo.

Connector options

  • BUILD_TESTING - This option is used to enable/disable building of the DAOS VOL connector's tests. The default value is ON.
  • BUILD_EXAMPLES - This option is used to enable/disable building of the DAOS VOL connector's HDF5 examples. The default value is OFF.
  • HDF5_C_COMPILER_EXECUTABLE - This option controls the HDF5 compiler wrapper script used by the VOL connector build process. It should be set to the full path to the HDF5 compiler wrapper (usually bin/h5cc), including the name of the wrapper script. The following two options may also need to be set.
  • HDF5_C_LIBRARY_hdf5 - This option controls the HDF5 library used by the VOL connector build process. It should be set to the full path to the HDF5 library, including the library's name (e.g., /path/libhdf5.so). Used in conjunction with the HDF5_C_INCLUDE_DIR option.
  • HDF5_C_INCLUDE_DIR - This option controls the HDF5 include directory used by the VOL connector build process. Used in conjunction with the HDF5_C_LIBRARY_hdf5 variable.
  • DAOS_LIBRARY - This option controls the DAOS library used by the VOL connector build process. It should be set to the full path to the DAOS library, including the library's name (e.g., /path/libdaos.so). Used in conjunction with the DAOS_UNS_LIBRARY and DAOS_INCLUDE_DIR options.
  • DAOS_UNS_LIBRARY - This option controls the DAOS unified namespace library used by the VOL connector build process. It should be set to the full path to the DAOS libduns library, including the library's name (e.g., /path/libduns.so). Used in conjunction with the DAOS_LIBRARY and DAOS_INCLUDE_DIR options.
  • DAOS_INCLUDE_DIR - This option controls the DAOS include directory used by the VOL connector build process. Used in conjunction with the DAOS_LIBRARY and DAOS_UNS_LIBRARY options.
  • MPI_C_COMPILER - This option controls the MPI C compiler used by the VOL connector build process. It should be set to the full path to the MPI C compiler, including the name of the executable.
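Putting several of these options together, a non-interactive configuration step might look like the following sketch (all paths are hypothetical placeholders for your own installation):

```shell
# Run from the build directory created inside the source tree.
# All paths below are hypothetical; point them at your own installs.
cmake .. \
  -DCMAKE_INSTALL_PREFIX=/opt/vol-daos \
  -DCMAKE_BUILD_TYPE=RelWithDebInfo \
  -DHDF5_C_COMPILER_EXECUTABLE=/opt/hdf5/bin/h5cc \
  -DDAOS_LIBRARY=/opt/daos/lib64/libdaos.so \
  -DDAOS_UNS_LIBRARY=/opt/daos/lib64/libduns.so \
  -DDAOS_INCLUDE_DIR=/opt/daos/include \
  -DMPI_C_COMPILER=/usr/bin/mpicc
```

This is equivalent to setting the same options interactively in ccmake before generating the makefiles.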

3. Testing and Usage

In the connector, each chunk is stored in a different DAOS dkey, and data in a single dkey is stored in a single DAOS storage target. Therefore, splitting the data into different chunks stripes the data across different dkeys and different storage targets. This improves I/O performance by allowing DAOS to read from or write to multiple storage targets at once.

The bandwidth improvement from using different storage targets is significant enough that, if H5Pset_chunk() is not used (i.e., for contiguous datasets), the connector will automatically set a chunk size. By default, the connector tries to size these chunks to approximately 1 MiB. The environment variable HDF5_DAOS_CHUNK_TARGET_SIZE (in bytes) overrides the chunk target size. Setting this variable to 0 disables automatic chunking, and contiguous datasets will stay contiguous (and will therefore only be stored on a single storage target). Better performance may be obtained by choosing a larger chunk target size, such as 4-8 MiB.
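For instance, the chunk target size could be raised to 4 MiB before launching an application:

```shell
# Raise the automatic chunk target size to 4 MiB (the value is in bytes).
export HDF5_DAOS_CHUNK_TARGET_SIZE=$((4 * 1024 * 1024))
echo "$HDF5_DAOS_CHUNK_TARGET_SIZE"   # prints 4194304
# Setting it to 0 instead would disable automatic chunking, leaving
# contiguous datasets on a single storage target.
```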

For further information on how to use the DAOS VOL connector with an HDF5 application, as well as how to test that the VOL connector is functioning properly, please refer to the DAOS VOL User's Guide under docs/users_guide.pdf.

4. More Information

DAOS VOL

Design documentation for the DAOS VOL can be found under docs/design_doc.pdf.

Journal paper:

  • J. Soumagne, J. Henderson, M. Chaarawi, N. Fortner, S. Breitenfeld, S. Lu, D. Robinson, E. Pourmal, J. Lombardi, "Accelerating HDF5 I/O for Exascale Using DAOS," IEEE Transactions on Parallel and Distributed Systems, vol. 33, no. 4, pp. 903-914, April 2022.

DAOS

DAOS installation and usage instructions can be found on the DAOS website: https://docs.daos.io/


Issues

Using vol connector via environment variable without application modifications

Hi there,

I'm trying to figure out how the VOL connector can be used for existing HDF5 applications. According to the User's Guide using the environment variable approach looks best for this use case:

The environment variable is useful when the application either cannot or does not need to be modified to use
DAOS storage (e.g., the application makes no native-specific HDF5 API calls). It is also the easiest to use as
it requires very little setup and no changes to the application’s source code.

From my understanding, very simple applications should not need any modification. Thus, I tried to run the h5_write example from the HDF5 main repository, which I compiled using h5pcc. Following the VOL User's Guide, I set HDF5_VOL_CONNECTOR=daos and HDF5_PLUGIN_PATH to the library path and ran the binary, but it fails:

# starting it with mpiexec or not doesn't make a difference here.
$ mpiexec -n 1 ./h5_write
HDF5-DIAG: Error detected in HDF5 (1.12.1-3) MPI-process 0:
  #000: H5T.c line 1886 in H5Tcopy(): obj_id is not a datatype ID
    major: Invalid arguments to routine
    minor: Inappropriate type
DAOS VOL-DIAG: Error detected in DAOS VOL (1.0.0) MPI-process 0:
  #000: /home/centos/install/hdf5/vol-daos/src/daos_vol_dset.c line 559 in H5_daos_dataset_create(): dataset create failed in task "default (probably operation setup)": error during operation setup (H5_DAOS_SETUP_ERROR)
    major: Dataset
    minor: Can't operate on object
  #001: /home/centos/install/hdf5/vol-daos/src/daos_vol.c line 3287 in H5_daos_h5op_finalize(): operation "dataset create" failed in task "default (probably operation setup)": error during operation setup (H5_DAOS_SETUP_ERROR)
    major: Low-level I/O
    minor: Unable to initialize object
  #002: /home/centos/install/hdf5/vol-daos/src/daos_vol_dset.c line 492 in H5_daos_dataset_create(): can't create dataset
    major: Dataset
    minor: Unable to initialize object
  #003: /home/centos/install/hdf5/vol-daos/src/daos_vol_dset.c line 641 in H5_daos_dataset_create_helper(): failed to copy datatype
    major: Dataset
    minor: Unable to copy object
HDF5-DIAG: Error detected in HDF5 (1.12.1-3) MPI-process 0:
  #000: H5D.c line 141 in H5Dcreate2(): unable to create dataset
    major: Dataset
    minor: Unable to initialize object
  #001: H5VLcallback.c line 1793 in H5VL_dataset_create(): dataset create failed
    major: Virtual Object Layer
    minor: Unable to create file
  #002: H5VLcallback.c line 1758 in H5VL__dataset_create(): dataset create failed
    major: Virtual Object Layer
    minor: Unable to create file

The "file" itself is created in the DAOS pool, as well as the root group within the file/container; I confirmed that with h5dump. However, the dataset creation fails as shown above. I can fix the example by adding calls to H5Pcreate, H5Pset_fapl_daos, and H5Pset_all_coll_metadata_ops as shown in the User's Guide and linking against the VOL connector. This, however, appears not to be the intention behind the environment variable approach.

I also tried ph5_example, again without any modifications. The application crashes:

# mpiexec -n 2 ./ph5_example
Parallel test files are:
   ./ParaEg0.h5
   ./ParaEg1.h5
--------------------------------
Proc 0: --------------------------------
Proc 1: *** testing PHDF5 dataset using split communicators...
--------------------------------
Independent write test on file ./ParaEg0.h5 ./ParaEg1.h5
*** testing PHDF5 dataset using split communicators...
--------------------------------
Independent write test on file ./ParaEg0.h5 ./ParaEg1.h5
--------------------------------
Proc 1: *** testing PHDF5 dataset independent write...
--------------------------------
Independent write test on file ./ParaEg0.h5
H5Pcreate access succeed
HDF5-DIAG: Error detected in HDF5 (1.12.1-3) MPI-process 0:
  #000: H5P.c line 105 in H5Pcopy(): property object doesn't exist
    major: Property lists
    minor: Object not found
DAOS VOL-DIAG: Error detected in DAOS VOL (1.0.0) MPI-process 0:
  #000: /home/centos/install/hdf5/vol-daos/src/daos_vol_file.c line 835 in H5_daos_file_create(): failed to copy fapl
    major: File accessibility
    minor: Unable to copy object
HDF5-DIAG: Error detected in HDF5 (1.12.1-3) MPI-process 0:
  #000: H5F.c line 539 in H5Fcreate(): unable to create file
    major: File accessibility
    minor: Unable to open file
  #001: H5VLcallback.c line 3269 in H5VL_file_create(): file create failed
    major: Virtual Object Layer
    minor: Unable to create file
  #002: H5VLcallback.c line 3235 in H5VL__file_create(): file create failed
    major: Virtual Object Layer
    minor: Unable to create file
ph5example: ph5example.c:877: test_split_comm_access: Assertion `fid != -1' failed.

Based on these two examples, I am not sure which restrictions or requirements apply to an HDF5 application that should be run with the DAOS VOL connector. At the moment, using the connector via the environment variable does not appear to work with (too) simple applications. The examples shipped with the connector all make the H5P calls mentioned above and are linked against the connector; this applies to most of the DAOS VOL tests as well. Thus, the environment variables are not required for them to work.

I wonder if this is a mismatch between the implementation and the documentation. Nevertheless, it would be great if a minimal example application could be provided that makes use of the environment variable approach.

`h5dump` fails to dump `h5daos_test_map.h5`.

h5dump fails on Sunspot.

hyoklee@uan-0002:~/vol-daos/build/bin> ./h5daos_test_map
container ERR  src/container/cli.c:371 cont_destroy_complete() failed to destroy \
container: DER_NONEXIST(-1005): 'The specified entity does not exist'
Testing integer as the datatype of keys and values
    Testing creation of map object                                        PASSED

...

All DAOS Map tests passed

...

hyoklee@uan-0002:~/vol-daos/build/bin> h5dump h5daos_test_map.h5
HDF5 "h5daos_test_map.h5" {
GROUP "/" {
   DATATYPE "key_datatype" H5T_ENUM {
      H5T_STD_I32LE;
      "ONE"              0;
      "TWO"              1;
      "THREE"            2;
      "FOUR"             3;
      "FIVE"             4;
   };h5dump error: unknown object "map_comp_comp"

}
}

h5dump prints output although file doesn't exist.

This happens when you define HDF5_DAOS_BYPASS_DUNS=YES.

hyoklee@uan-0002:~> h5dump repack_96_dsets.h5 
HDF5 "repack_96_dsets.h5" {
GROUP "/" {
}
}
hyoklee@uan-0002:~> ls repack_96_dsets.h5 
ls: cannot access 'repack_96_dsets.h5': No such file or directory

unnecessary check for pool_grp/svcl being NULL in H5daos_init (breaks examples)

Hi,

in H5daos_init there are checks that the DAOS pool group and service list (SVCL) aren't NULL. However, this appears to be inconsistent with what the connector is actually able to accept:

vol-daos/src/daos_vol.c

Lines 384 to 387 in 95772a5

if(NULL == pool_grp)
D_GOTO_ERROR(H5E_ARGS, H5E_BADVALUE, FAIL, "not a valid service group");
if(NULL == pool_svcl)
D_GOTO_ERROR(H5E_ARGS, H5E_BADVALUE, FAIL, "not a valid service list");

A little bit later, the (unmodified, since const) values of pool_grp and pool_svcl arguments are passed to H5_daos_set_pool_globals:

if(H5_daos_set_pool_globals(pool_grp, pool_svcl) < 0)

However, H5_daos_set_pool_globals accepts the two parameters being NULL, and in that case the function uses default values or values passed via environment variables, which is quite convenient.

The issue in H5daos_init actually breaks almost all, if not all, of the examples, which initialize the group to NULL:

char *pool_grp = NULL;

One ends up with an error like this:

DAOS VOL-DIAG: Error detected in DAOS VOL (1.0.0) MPI-process 0:
  #000: /home/centos/install/hdf5/vol-daos/src/daos_vol.c line 385 in H5daos_init(): not a valid service group
    major: Invalid arguments to routine
    minor: Bad value
*FAILED*
        at h5dsm_file_open.c:21 in main()...

I suggest removing the checks mentioned above, as NULL values are actually accepted by the routine they are passed to.

Defining installation paths

I'm not very familiar with cmake (much more so with GNU Autotools), so if the answer to my question is buried in there somewhere, please forgive me.

Can I define the path where the libraries, pkg-config files, etc. are installed? They seem to want to go into /usr/lib by default but I'm not finding the knob to make them install somewhere else.

HDF5 DAOS VOL creates containers of type "unknown" and label "container_label_not_set"; type and label need to be fixed

Following https://daosio.atlassian.net/wiki/spaces/DC/pages/11138695214/HDF5+VOL+Connector

On bypassing the UNS requirement with

export HDF5_DAOS_BYPASS_DUNS=1
export DAOS_POOL=pool_label

the connector creates a container of type "unknown" and label "container_label_not_set".

By contrast, without bypassing the UNS requirement (i.e., with unset HDF5_DAOS_BYPASS_DUNS), it creates a container of type "HDF5" and label "container_label_not_set".


kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> daos cont list datascience
UUID                                 Label        
----                                 -----        
78104b7a-48e6-4f2b-9a6c-2382532a8f67 dlio         
ff0889f5-db03-4bac-b5f8-9c8ddacc2fa4 kau_pos_cont 
d6b2e72a-a016-41b9-9978-20774395504d resnet_data  
808ce1e3-08ef-479c-97ee-873387de07cc ml_datasets  
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> rm -rf /tmp/datascience/kau_pos_cont/*
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1>  unset HDF5_DAOS_BYPASS_DUNS
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> h5bench -d 2_small_indep.json 
...
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> daos cont list datascience
UUID                                 Label                   
----                                 -----                   
78104b7a-48e6-4f2b-9a6c-2382532a8f67 dlio                    
ff0889f5-db03-4bac-b5f8-9c8ddacc2fa4 kau_pos_cont            
b6f6ab30-8a21-473c-baa1-643f05664179 container_label_not_set 
d6b2e72a-a016-41b9-9978-20774395504d resnet_data             
808ce1e3-08ef-479c-97ee-873387de07cc ml_datasets             
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> daos cont query datascience b6f6ab30-8a21-473c-baa1-643f05664179 
  Container UUID             : b6f6ab30-8a21-473c-baa1-643f05664179
  Container Type             : HDF5                                
  Pool UUID                  : ea3abd65-dabb-4b45-84f3-3d29ab307686
  Number of snapshots        : 0                                   
  Latest Persistent Snapshot : 0x0                                 
  Container redundancy factor: 0                                   

kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> export HDF5_DAOS_BYPASS_DUNS=1
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> DAOS_POOL=datascience
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> DAOS_CONT=kau_pos_cont 
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> 
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> export DAOS_POOL=$DAOS_POOL
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> export DAOS_CONT=$DAOS_CONT
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1>  h5bench -d 2_small_indep.json 
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1>  daos cont list datascience
UUID                                 Label                   
----                                 -----                   
78104b7a-48e6-4f2b-9a6c-2382532a8f67 dlio                    
ff0889f5-db03-4bac-b5f8-9c8ddacc2fa4 kau_pos_cont            
d6b2e72a-a016-41b9-9978-20774395504d resnet_data             
808ce1e3-08ef-479c-97ee-873387de07cc ml_datasets             
24b8db1f-1d34-5cbb-a54b-c585a8a827fa container_label_not_set 
kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> daos cont query datascience 24b8db1f-1d34-5cbb-a54b-c585a8a827fa
  Container UUID             : 24b8db1f-1d34-5cbb-a54b-c585a8a827fa
  Container Type             : unknown                             
  Pool UUID                  : ea3abd65-dabb-4b45-84f3-3d29ab307686
  Number of snapshots        : 0                                   
  Latest Persistent Snapshot : 0x0                                 
  Container redundancy factor: 0                                   

kaushikvelusamy@x1921c0s0b0n0:/lus/gila/projects/CSC250STDM10_CNDA/kaushik/experiments/daos_h5bench/vol/kaushik/iteration1> 

h5daos_test_metadata_parallel crash while using Async I/O flag

Hi,

I am testing the h5daos_test_metadata_parallel test. When I run it without any flags, it works. However, when I use the -U flag to enable async I/O, it crashes.

Seen in:
DAOS commit b413447e
vol-daos v1.1.0rc2
and vol-daos commit 9ae044c

Reproduction steps

$ dmg  pool create --size 8G
Creating DAOS pool with automatic storage allocation: 8.0 GB NVMe + 6.00% SCM
Pool created with 100.00% SCM/NVMe ratio
-----------------------------------------
  UUID          : 6d9e2196-dcfe-4b78-82a9-8621a7ec87dd
  Service Ranks : 0
  Storage Ranks : 0
  Total Size    : 8.0 GB
  SCM           : 8.0 GB (8.0 GB / rank)
  NVMe          : 0 B (0 B / rank)

$ export D_LOG_FILE=/tmp/daos_client.log
$ export D_LOG_MASK=INFO
$ export DAOS_POOL=6d9e2196-dcfe-4b78-82a9-8621a7ec87dd
$ export HDF5_VOL_CONNECTOR=daos
$ export HDF5_PLUGIN_PATH=/usr/lib64/mpich/lib
$ export H5_DAOS_BYPASS_DUNS=TRUE

$ mpirun --hostfile ./hostfile -np 3 /home/eduardoj/repos/vol-daos/build/bin/h5daos_test_metadata_parallel -U

Here is the output of the command:

output_h5daos_test_metadata_parallel_U.txt

All tests fail on GitHub Actions.

The server can't validate the daos_server.yml script:

ERROR: /home/runner/work/vol-daos/vol-daos/build/test/scripts/daos_server.yml: validation failed: serverconfig: code = 726 description = "number of hugepages specified (-1) is out of range (0 - 2147483647)"
ERROR: serverconfig: code = 726 resolution = "specify a nr_hugepages value between 0 and 2147483647"
H5VLTestDriver: server never started.
H5VLTestDriver: Server never started.

0% tests passed, 10 tests failed out of 10

Total Test time (real) =   2.76 sec

The following tests FAILED:
	  1 - h5vl_test (Failed)
	  2 - h5_test_testhdf5 (Failed)
	  3 - h5vl_ext_h5daos_test_map (Failed)
	  4 - h5vl_ext_h5daos_test_oclass (Failed)
	  5 - h5vl_ext_h5daos_test_recovery (Failed)
	  6 - h5vl_test_parallel (Failed)
	  7 - h5_partest_t_bigio (Failed)
	  8 - h5_partest_t_pshutdown (Failed)
	  9 - h5_partest_t_shapesame (Failed)
	 10 - h5_partest_testphdf5 (Failed)
Error: Process completed with exit code 8.

test/daos_vol/h5daos_test_map.c compilation failure

I'm testing with HDF5 1.14.0.

/usr/bin/ld: CMakeFiles/h5daos_test_map.dir/h5daos_test_map.c.o: in function `test_nonexistent_map':
/home/runner/work/vol-daos/vol-daos/test/daos_vol/h5daos_test_map.c:4579: undefined reference to `H5Mopen'
/usr/bin/ld: /home/runner/work/vol-daos/vol-daos/test/daos_vol/h5daos_test_map.c:4598: undefined reference to `H5Mclose'
/usr/bin/ld: ../../bin/libhdf5_vol_daos.so.1.2.0: undefined reference to `H5Pget_map_iterate_hints'
collect2: error: ld returned 1 exit status
make[2]: *** [test/daos_vol/CMakeFiles/h5daos_test_map.dir/build.make:103: bin/h5daos_test_map] Error 1
make[1]: *** [CMakeFiles/Makefile2:981: test/daos_vol/CMakeFiles/h5daos_test_map.dir/all] Error 2
make: *** [Makefile:166: all] Error 2
Error: Process completed with exit code 2.

Error: unknown option "-genv"

CTest fails with the following error:

Test project /home/runner/work/vol-daos/vol-daos/build
      Start  1: h5vl_test
 1/10 Test  #1: h5vl_test ........................***Failed    0.01 sec
rm -rf /mnt/daos/* /home/runner/work/vol-daos/vol-daos/build/bin/*.h5
Test Server
This is a serial test
The allow server errors in output flag was set to 1
Client Helper
Client Init
H5VLTestDriver: server command is:
 "/usr/bin/mpiexec" "-n" "1" "-genv" "DAOS_DISABLE_REQ_FWD" "1" "/usr/local/bin/daos_server" "start" "--recreate-superblocks" "-o" "/home/runner/work/vol-daos/vol-daos/build/test/scripts/daos_server.yml"
H5VLTestDriver: client_helper command is:
 "env" "D_LOG_FILE=/home/runner/work/vol-daos/vol-daos/build/test/daos_client.log" "D_LOG_MASK=ERR" "CRT_PHY_ADDR_STR=ofi+sockets" "OFI_INTERFACE=lo" "DAOS_AGENT_DRPC_DIR=/home/runner/work/vol-daos/vol-daos/build/test" "HDF5_PLUGIN_PATH=/home/runner/work/vol-daos/vol-daos/build/bin" "HDF5_VOL_CONNECTOR=daos" "HDF5_DAOS_CHUNK_TARGET_SIZE=1073741824" "/home/runner/work/vol-daos/vol-daos/build/test/scripts/daos_agent.sh"
H5VLTestDriver: client_init command is:
 "env" "D_LOG_FILE=/home/runner/work/vol-daos/vol-daos/build/test/daos_client.log" "D_LOG_MASK=ERR" "CRT_PHY_ADDR_STR=ofi+sockets" "OFI_INTERFACE=lo" "DAOS_AGENT_DRPC_DIR=/home/runner/work/vol-daos/vol-daos/build/test" "HDF5_PLUGIN_PATH=/home/runner/work/vol-daos/vol-daos/build/bin" "HDF5_VOL_CONNECTOR=daos" "HDF5_DAOS_CHUNK_TARGET_SIZE=1073741824" "/home/runner/work/vol-daos/vol-daos/build/test/scripts/daos_pool.sh"
H5VLTestDriver: starting process server
-------------- server output --------------
/usr/bin/mpiexec: Error: unknown option "-genv"
Type '/usr/bin/mpiexec --help' for usage.
H5VLTestDriver: server never started.
H5VLTestDriver: Server never started.
