Test Configuration

Tip of master, commit cc261eb9940cfe8b34bf5a25553143a6c7ce787e

All tests were run with ofi+psm2 on ib0.

daos_test: run with 8 servers (boro-[4-11]) and 2 clients (boro-[12-13]). Servers were killed and /mnt/daos was cleaned between the runs listed below.

Tests requiring a pool to be created via dmg used a 4GB pool, with boro-12 as the client.
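
For reference, a hedged sketch of the pool-creation step, assuming the old dmg client syntax of this era; only the 4GB size comes from this page, and the exact flag spelling at commit cc261eb is an assumption, not a command captured from these runs:

# Hypothetical invocation on boro-12 (the client node used for these tests);
# dmg prints the new pool UUID and service rank(s), which the tests then consume.
dmg create --size=4G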

The mpich tests used boro-4 as the server and boro-12 as the client, with a 1GB pool.

daosbench and daos_perf were both run with DAOS_IMPLICIT_PURGE=1 set on the servers.
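
A minimal sketch of how that setting could be propagated to the server processes, reusing the orterun -x forwarding mechanism visible in the client commands below; the hostfile path and the daos_server command line are placeholders, not values recorded on this page:

# Assumed server launch wrapper; only DAOS_IMPLICIT_PURGE=1 itself is taken from this page.
export DAOS_IMPLICIT_PURGE=1
orterun --hostfile ~/scripts/host.srv.8 -x DAOS_IMPLICIT_PURGE \
        --report-uri ~/scripts/uri.txt daos_server
# (daos_server's own arguments are omitted; they are not part of this report)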

Test Results

daos_test

Separate runs with cleanup in between (a hedged launch sketch follows the list):

  • -mpcCiAeoRd - PASS
  • -r - PASS
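
A sketch of how the two passes may have been launched; the orterun pattern mirrors the daos_perf commands below, while the process count and hostfile path are assumptions rather than values recorded on this page:

# First pass, then kill servers, clean /mnt/daos, restart, and repeat with -r.
orterun -np 16 --hostfile ~/scripts/host.cli.2 \
        --ompi-server file:~/scripts/uri.txt daos_test -mpcCiAeoRd
orterun -np 16 --hostfile ~/scripts/host.cli.2 \
        --ompi-server file:~/scripts/uri.txt daos_test -r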

daos_perf w/DAOS_IMPLICIT_PURGE

1K Records

CREDITS=1

[sdwillso@boro-13 ~]$ CREDITS=1 ./daos/daos_m/src/tests/daos_perf.sh daos 200 1000 1K
+ /home/sdwillso/daos/daos_m/opt/ompi/bin/orterun -quiet --hostfile /home/sdwillso/scripts/host.cli.1 --ompi-server file:/home/sdwillso/scripts/uri.txt -x DD_SUBSYS= -x DD_MASK= -x D_LOG_FILE=/tmp/daos_perf.log /home/sdwillso/daos/daos_m/install/bin/daos_perf -T daos -P 2G -d 1 -a 200 -r 1000 -s 1K -C 1 -t -z
Test :
	DAOS (full stack)
Parameters :
	pool size     : 2048 MB
	credits       : 1 (sync I/O for -ve)
	obj_per_cont  : 1 x 8 (procs)
	dkey_per_obj  : 1
	akey_per_dkey : 200
	recx_per_akey : 1000
	value type    : single
	value size    : 1024
	zero copy     : yes
	overwrite     : yes
	VOS file      : <NULL>
Started...
update successfully completed:
	duration : 5.684471   sec
	bandwith : 274.872    MB/sec
	rate     : 281468.59  IO/sec
	latency  : 3.553      us (nonsense if credits > 1)
Duration across processes:
MAX duration : 5.684348   sec
MIN duration : 3.698779   sec
Average duration : 4.774698   sec

CREDITS=8

[sdwillso@boro-12 ~]$ CREDITS=8 ./daos/daos_m/src/tests/daos_perf.sh daos 200 1000 1K
+ /home/sdwillso/daos/daos_m/opt/ompi/bin/orterun -quiet --hostfile /home/sdwillso/scripts/host.cli.1 --ompi-server file:/home/sdwillso/scripts/uri.txt -x DD_SUBSYS= -x DD_MASK= -x D_LOG_FILE=/tmp/daos_perf.log /home/sdwillso/daos/daos_m/install/bin/daos_perf -T daos -P 2G -d 1 -a 200 -r 1000 -s 1K -C 8 -t -z
Test :
	DAOS (full stack)
Parameters :
	pool size     : 2048 MB
	credits       : 8 (sync I/O for -ve)
	obj_per_cont  : 1 x 8 (procs)
	dkey_per_obj  : 1
	akey_per_dkey : 200
	recx_per_akey : 1000
	value type    : single
	value size    : 1024
	zero copy     : yes
	overwrite     : yes
	VOS file      : <NULL>
Started...
update successfully completed:
	duration : 4.556007   sec
	bandwith : 342.954    MB/sec
	rate     : 351184.72  IO/sec
	latency  : 2.848      us (nonsense if credits > 1)
Duration across processes:
MAX duration : 4.555831   sec
MIN duration : 2.377609   sec
Average duration : 3.555531   sec

4K Records

CREDITS=1

[sdwillso@boro-12 ~]$ CREDITS=1 ./daos/daos_m/src/tests/daos_perf.sh daos 200 1000 4K
+ /home/sdwillso/daos/daos_m/opt/ompi/bin/orterun -quiet --hostfile /home/sdwillso/scripts/host.cli.1 --ompi-server file:/home/sdwillso/scripts/uri.txt -x DD_SUBSYS= -x DD_MASK= -x D_LOG_FILE=/tmp/daos_perf.log /home/sdwillso/daos/daos_m/install/bin/daos_perf -T daos -P 2G -d 1 -a 200 -r 1000 -s 4K -C 1 -t -z
Test :
	DAOS (full stack)
Parameters :
	pool size     : 2048 MB
	credits       : 1 (sync I/O for -ve)
	obj_per_cont  : 1 x 8 (procs)
	dkey_per_obj  : 1
	akey_per_dkey : 200
	recx_per_akey : 1000
	value type    : single
	value size    : 4096
	zero copy     : yes
	overwrite     : yes
	VOS file      : <NULL>
Started...
update successfully completed:
	duration : 9.923064   sec
	bandwith : 629.846    MB/sec
	rate     : 161240.52  IO/sec
	latency  : 6.202      us (nonsense if credits > 1)
Duration across processes:
MAX duration : 9.922652   sec
MIN duration : 6.553026   sec
Average duration : 8.238660   sec

IOR, 2 clients, 10GB pool, data verification enabled

[sdwillso@boro-3 ~]$ orterun -np 1 --hostfile ~/hostlists/daos_client_hostlist --mca mtl ^psm2,ofi  --ompi-server file:~/scripts/uri.txt ior -v -W -i 5 -a DAOS -w -o `uuidgen` -b 5g -t 1m -O daospool=2930c2b8-689f-4ca4-bf36-3261f28196c4,daosrecordsize=1m,daosstripesize=1m,daosstripecount=1024,daosaios=16,daosobjectclass=LARGE,daosPoolSvc=1,daosepoch=1
IOR-3.0.1: MPI Coordinated Test of Parallel I/O

Began: Fri Jun  1 21:55:51 2018
Command line used: ior -v -W -i 5 -a DAOS -w -o 4124e0ed-05dd-4a30-a0ee-9ab4b27be4cd -b 5g -t 1m -O daospool=2930c2b8-689f-4ca4-bf36-3261f28196c4,daosrecordsize=1m,daosstripesize=1m,daosstripecount=1024,daosaios=16,daosobjectclass=LARGE,daosPoolSvc=1,daosepoch=1
Machine: Linux boro-12.boro.hpdd.intel.com
Start time skew across all tasks: 0.00 sec

Test 0 started: Fri Jun  1 21:55:51 2018
Path: /home/sdwillso
FS: 3.8 TiB   Used FS: 10.1%   Inodes: 250.0 Mi   Used Inodes: 2.0%
Participating tasks: 1
[0] WARNING: USING daosStripeMax CAUSES READS TO RETURN INVALID DATA
Summary:
	api                = DAOS
	test filename      = 4124e0ed-05dd-4a30-a0ee-9ab4b27be4cd
	access             = single-shared-file, independent
	pattern            = segmented (1 segment)
	ordering in a file = sequential offsets
	ordering inter file= no tasks offsets
	clients            = 1 (1 per node)
	repetitions        = 5
	xfersize           = 1 MiB
	blocksize          = 5 GiB
	aggregate filesize = 5 GiB

access    bw(MiB/s)  block(KiB) xfer(KiB)  open(s)    wr/rd(s)   close(s)   total(s)   iter
------    ---------  ---------- ---------  --------   --------   --------   --------   ----
Commencing write performance test: Fri Jun  1 21:55:51 2018
write     5189       5242880    1024.00    0.001027   0.983459   0.002235   0.986745   0   
Verifying contents of the file(s) just written.
Fri Jun  1 21:55:52 2018

remove    -          -          -          -          -          -          0.002549   0   
Commencing write performance test: Fri Jun  1 21:55:56 2018
write     5616       5242880    1024.00    0.000647   0.908628   0.002421   0.911709   1   
Verifying contents of the file(s) just written.
Fri Jun  1 21:55:57 2018

remove    -          -          -          -          -          -          0.002353   1   
Commencing write performance test: Fri Jun  1 21:56:00 2018
write     5624       5242880    1024.00    0.000629   0.907809   0.002001   0.910453   2   
Verifying contents of the file(s) just written.
Fri Jun  1 21:56:01 2018

remove    -          -          -          -          -          -          0.002340   2   
Commencing write performance test: Fri Jun  1 21:56:04 2018
write     5622       5242880    1024.00    0.000610   0.908072   0.001986   0.910678   3   
Verifying contents of the file(s) just written.
Fri Jun  1 21:56:05 2018

remove    -          -          -          -          -          -          0.002270   3   
Commencing write performance test: Fri Jun  1 21:56:08 2018
write     5583       5242880    1024.00    0.000642   0.913964   0.002392   0.917011   4   
Verifying contents of the file(s) just written.
Fri Jun  1 21:56:09 2018

remove    -          -          -          -          -          -          0.002314   4   

Max Write: 5623.58 MiB/sec (5896.75 MB/sec)

Summary of all tests:
Operation   Max(MiB)   Min(MiB)  Mean(MiB)     StdDev    Mean(s) Test# #Tasks tPN reps fPP reord reordoff reordrand seed segcnt blksiz xsize aggsize API RefNum
write        5623.58    5188.78    5526.74     169.61    0.92732 0 1 1 5 0 0 1 0 0 1 5368709120 1048576 5368709120 DAOS 0

Finished: Fri Jun  1 21:56:15 2018

daos_bench w/DAOS_IMPLICIT_PURGE

Test cases run:

  • kv-idx-update
  • kv-dkey-update
  • kv-akey-update
  • kv-dkey-fetch
  • kv-akey-fetch

mpich tests

Results: No failures seen
