Mirror of https://github.com/HDFGroup/hdf5.git (synced 2024-12-03 02:32:04 +08:00)
c47f724187
Improvement Description: Previously, stdout and stderr were both redirected to a single output file. This works fine on traditional sequential Unix machines, but on some parallel systems (such as MPI jobs on the IBM SP), stderr is merged with stdout, just not in the expected order; the interleaving is not deterministic in parallel jobs. So the test output is all there, but its ordering may not be as expected.
Solution: Redirect stderr to a separate file and append it to the stdout file after the test command has finished, then compare the result with the expected output. This eliminates the assumption that stdout and stderr are merged in chronological order. The .ddl files are updated by moving all stderr text to the end of each file.
Platforms tested: eirene.
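The scheme reads roughly as follows, a minimal sketch assuming a POSIX shell; the file and variable names below are illustrative, not the actual ones used in the HDF5 test scripts:

    expect="expected.ddl"     # expected output, with all stderr text at the end (illustrative name)
    actual="actual.out"       # captured stdout (illustrative name)
    actual_err="actual.err"   # captured stderr, kept separate while the test runs (illustrative name)

    # Run the dumper with stdout and stderr split, so their interleaving
    # cannot depend on the non-deterministic parallel runtime.
    h5dump -t "/#5992:0" -g /group2 tcompound.h5 > "$actual" 2> "$actual_err"

    # Only after the command has finished, append stderr to the stdout capture.
    cat "$actual_err" >> "$actual"

    # Compare against the expected .ddl file; any difference fails the test.
    if cmp -s "$expect" "$actual"; then
        echo "PASSED"
    else
        echo "*FAILED*"
    fi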
#############################
Expected output for 'h5dump -t /#5992:0 -g /group2 tcompound.h5'
#############################
HDF5 "tcompound.h5" {
DATATYPE "/#5992:0"

GROUP "/group2" {
   DATASET "dset5" {
      DATATYPE "/#6632:0"

      DATASPACE SIMPLE { ( 5 ) / ( 5 ) }
      DATA {
         {
            0,
            0
         },
         {
            1,
            0.1
         },
         {
            2,
            0.2
         },
         {
            3,
            0.3
         },
         {
            4,
            0.4
         }
      }
   }
}
}
h5dump error: unable to open datatype "/#5992:0"