Bug fixes and miscellaneous code cleanup.
Description:
Updated to reflect the current DDL document. Also changed the VL data
dumping code so that it can dump VL data of any other datatype.
Platforms tested:
FreeBSD 4.1.1 (hawkwind)
Bug fix
Description:
The code for determining what a string should print if it was
declared as H5T_STR_NULLTERM (C strings), H5T_STR_SPACEPAD
(Fortran strings), or H5T_STR_NULLPAD (print null characters to
the end of the line) was broken. A user had a problem with it and
suggested a change, but the change didn't seem to work properly.
Also, if the string was H5T_STR_SPACEPAD, it could have stopped
when encountering a NUL even if it hadn't reached the end of
the string.
Solution:
Reworked the code to make it clearer what's happening and to add
support for H5T_STR_SPACEPAD, which may have been missing before.
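As a rough illustration of the padding rules involved (a minimal
sketch, not the actual h5dump code; the helper name is made up):

    #include <stddef.h>
    #include "hdf5.h"

    /* Hypothetical helper: decide how many characters of a fixed-size
     * string element to print, based on its padding convention. */
    static size_t
    printable_len(const char *buf, size_t size, H5T_str_t pad)
    {
        size_t n = size;

        if (pad == H5T_STR_NULLTERM) {
            /* C strings: stop at the first NUL. */
            for (n = 0; n < size && buf[n] != '\0'; n++)
                ;
        }
        else if (pad == H5T_STR_SPACEPAD) {
            /* Fortran strings: trim trailing blanks only; an embedded
             * NUL must NOT end the string early. */
            while (n > 0 && buf[n - 1] == ' ')
                n--;
        }
        /* H5T_STR_NULLPAD: print the full declared width; padding NULs
         * can be rendered as escapes by the caller. */
        return n;
    }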
Platforms tested:
Linux
Bug fix
Description:
If TESTH5TOH4 or TESTH4TOH5 wasn't set, then it would mess up the
testing process by adding a ``#'' to the for statement (since
there was a line-continuation in the definition of the macro).
Solution:
Placed the macro expansions all on one line so that no junk
characters are picked up by accident.
Platforms tested:
OSF1 (Gondolin)
Bug fix
Description:
After the spelling error in the h5ls usage statement was corrected,
these test files failed.
Solution:
Corrected the expected output in those files.
Platforms tested:
Linux
Code cleaning
Description:
Use the variable to get rid of the "variable set but not used" warnings.
Platforms tested:
Linux and modi4 -64 (compile only).
Bug fix
Description:
The output of floating-point dumps wasn't necessarily standard;
the h5ls utility handles it in a better way.
Solution:
Changed the output parameters from %g to %1.*g and added the
appropriate FLT_DIG/DBL_DIG parameter for the `*' in the above.
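A minimal stand-alone illustration of the formatting change (assumed
printf usage, not the tool's exact code):

    #include <float.h>
    #include <stdio.h>

    int
    main(void)
    {
        float  f = 3.14159265f;
        double d = 3.141592653589793;

        /* Old style: plain %g, precision unrelated to the type. */
        printf("%g %g\n", (double)f, d);

        /* New style, as h5ls does it: supply FLT_DIG / DBL_DIG for
         * the `*' in %1.*g so the precision follows the type. */
        printf("%1.*g %1.*g\n", FLT_DIG, (double)f, DBL_DIG, d);

        return 0;
    }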
Platforms tested:
Linux
Bug fix
Description:
I was writing data out to the HDF file in big-endian format
without doing any conversions on it. This was causing tests to
fail on most machines.
Solution:
Removed the big-endian handling. Unfortunately, this exposed a
bug in the HDF dumper, which Albert and Robb suggested ways of
fixing.
Platforms tested:
Linux
Patch
Description:
h5ls coredumped on TFLOPS because the ioctl(TIOCGWINSZ) call it
makes coredumped inside the ioctl() routine.
Solution:
It is likely that TFLOPS does not support the window-size ioctl
on the compute nodes. Still, it should not coredump. Bypass the
call on TFLOPS for now.
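A sketch of the kind of guard this implies (TFLOPS_COMPUTE_NODE is a
placeholder symbol for whatever identifies the TFLOPS compute nodes;
the real h5ls code may test something else):

    #include <unistd.h>
    #include <sys/ioctl.h>

    /* Determine the terminal width, falling back to 80 columns when
     * the window-size ioctl is unavailable or bypassed. */
    static int
    terminal_width(void)
    {
        int width = 80;                     /* sensible default */

    #if defined(TIOCGWINSZ) && !defined(TFLOPS_COMPUTE_NODE)
        struct winsize w;

        if (ioctl(STDOUT_FILENO, TIOCGWINSZ, &w) >= 0 && w.ws_col > 0)
            width = w.ws_col;
    #endif

        return width;
    }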
Platforms tested:
TFLOPS & modi4 (-64)
Bug fix
Description:
[Most] SGIs failed on dumping VL data during the daily tests; they
seem to initialize variables differently, exposing a bug in dumping
datasets with scalar dataspaces.
Also cleaned up code to get rid of compiler warnings.
Solution:
Initialize the variable correctly.
Platforms tested:
SGI IRIX 6.5 (paz)
Bug fix.
Description:
VL datatype dumping was not working correctly on most machines because
the "native" version of the variable-length type wasn't being generated
for the printing process.
Re-enabled the VL dumping test.
Solution:
Generate a "native" version of the VL datatype to read into for printing;
also some code cleanup in the VL dumping algorithm.
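A hedged sketch of the pattern (simplified to a VL-of-int dataset; the
helper name and the lack of error checking are mine, not the dumper's):

    #include <stdlib.h>
    #include "hdf5.h"

    /* Read VL data through a "native" memory datatype before printing. */
    static herr_t
    read_vl_for_printing(hid_t dset)
    {
        hid_t   space   = H5Dget_space(dset);
        hsize_t npoints = (hsize_t)H5Sget_simple_extent_npoints(space);
        hid_t   mem_vl  = H5Tvlen_create(H5T_NATIVE_INT); /* native base */
        hvl_t  *buf     = malloc((size_t)npoints * sizeof(hvl_t));
        herr_t  status;

        status = H5Dread(dset, mem_vl, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

        /* ... print buf[i].len values from buf[i].p for each element ... */

        H5Dvlen_reclaim(mem_vl, space, H5P_DEFAULT, buf);
        free(buf);
        H5Tclose(mem_vl);
        H5Sclose(space);
        return status;
    }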
Platforms tested:
FreeBSD 4.1.1 (hawkwind) & Solaris 2.6 (baldric)
Feature
Description:
Variable-length data dumping for simple cases (i.e., either SCALAR or
1-dim array variable-length datatypes) should work. Added these cases
to the tests.
Platforms tested:
Linux
Updated for new array datatypes.
Description:
I missed these tools earlier when I wasn't compiling with HDF4...
Solution:
Updated them (correctly, I hope) to use the new array datatype instead of
compound datatype array fields.
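A hedged sketch of the difference (shown with the later
H5Tarray_create2 signature; the member names and sizes are
illustrative only):

    #include "hdf5.h"

    /* Build a compound type whose "coords" member is a real array
     * datatype, rather than declaring array dimensions on the compound
     * member itself (the old H5Tinsert_array style). */
    static hid_t
    make_compound_with_array(void)
    {
        hsize_t dims[1] = {4};
        hid_t   arr_t   = H5Tarray_create2(H5T_NATIVE_FLOAT, 1, dims);
        hid_t   cmp_t   = H5Tcreate(H5T_COMPOUND,
                                    sizeof(int) + 4 * sizeof(float));

        H5Tinsert(cmp_t, "id",     0,           H5T_NATIVE_INT);
        H5Tinsert(cmp_t, "coords", sizeof(int), arr_t);

        H5Tclose(arr_t);
        return cmp_t;
    }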
Platforms tested:
FreeBSD 4.1.1 (hawkwind)
Recoding of VL dumping.
Description:
I'm now using hyperslabs to select the variable-length data. I
haven't checked in any tests for this just yet; I'll create those
later.
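Roughly the idea, as a sketch under the assumption of a 1-D dataset and
a caller-supplied memory VL type (identifiers are illustrative):

    #include "hdf5.h"

    /* Read and print VL elements one at a time by selecting a
     * single-element hyperslab in the file dataspace. */
    static void
    dump_vl_by_hyperslab(hid_t dset, hid_t mem_type, hsize_t nelmts)
    {
        hid_t   fspace = H5Dget_space(dset);
        hsize_t one    = 1;
        hid_t   mspace = H5Screate_simple(1, &one, NULL);
        hsize_t start, count = 1;
        hvl_t   elmt;

        for (start = 0; start < nelmts; start++) {
            H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &start, NULL,
                                &count, NULL);
            H5Dread(dset, mem_type, mspace, fspace, H5P_DEFAULT, &elmt);

            /* ... print elmt.len values from elmt.p ... */

            H5Dvlen_reclaim(mem_type, mspace, H5P_DEFAULT, &elmt);
        }

        H5Sclose(mspace);
        H5Sclose(fspace);
    }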
Platforms tested:
Linux
Purpose:
check in beta release h4toh5 converter
Description:
1. Added copyright and other comments to all .h and .c files.
2. Fixed bugs in translating SDS unlimited dimensions, HDF5 dimensional
scale attributes, and vdata from HDF4 to HDF5.
Solution:
For item 2:
1) For an SDS with an unlimited dimension to be converted into an
extensible HDF5 dataset, a default chunk size has to be set on the HDF5
side even though the corresponding HDF4 file is not chunked (see the
sketch after this list).
2) In this version, if an SDS object doesn't have dimensional scale data,
we will not show the default HDF4 dimension name ("fakedim0", etc.) in
the new HDF5 dimensional scale name attribute.
3) Fixed a bug transferring vdata in which a field contains a character
array; it is now correctly transferred into the corresponding HDF5
compound datatype.
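The sketch referred to in item 1), assuming a 2-D SDS with one unlimited
dimension; the chunk and dataset sizes here are illustrative, not the
converter's actual defaults:

    #include "hdf5.h"

    /* An HDF5 dataset with an unlimited dimension must use chunked
     * storage, so a default chunk size is chosen even when the source
     * HDF4 SDS was not chunked. */
    static hid_t
    create_extensible_dataset(hid_t file)
    {
        hsize_t dims[2]    = {10, 100};
        hsize_t maxdims[2] = {H5S_UNLIMITED, 100};
        hsize_t chunk[2]   = {64, 100};          /* assumed default */
        hid_t   space = H5Screate_simple(2, dims, maxdims);
        hid_t   dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        hid_t   dset;

        H5Pset_chunk(dcpl, 2, chunk);
        dset = H5Dcreate2(file, "/converted_sds", H5T_NATIVE_INT, space,
                          H5P_DEFAULT, dcpl, H5P_DEFAULT);

        H5Pclose(dcpl);
        H5Sclose(space);
        return dset;
    }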
Platforms tested:
eirene,arabica,baldric,hawkwind,paz,gondolin on new set of hdf4 test files.
Purpose:
Adding more expected files for testing h4toh5 converter
These test files include HDF5 files that are the expected results of
converting HDF4 files for various vdata and vgroup cases.
Description:
Solution:
Platforms tested:
eirene,arabica,hawkwind,paz
Purpose:
Add testing files for h4toh5 converter
Two more files for testing SDS objects with native float datatypes
Description:
Solution:
Platforms tested:
arabica,eirene,hawkwind,paz
Purpose:
Adding expected files for h4toh5 converter
Two test files for annotation
Description:
Solution:
Platforms tested:
arabica,eirene,paz,gondolin,hawkwind
Purpose:
Adding testing files for h4toh5 converter
HDF4 test files for various vgroup tests (including hard links, loops, name clashing, etc.)
Description:
Solution:
Platforms tested:
arabica,eirene,paz,gondolin,hawkwind
Purpose:
Adding testfiles for h4toh5 converter
More test files for SDS objects with various datatypes
Description:
see above
Solution:
Platforms tested:
eirene,arabica,gondolin,paz,hawkwind
Purpose:
Adding testing files for h4toh5 converter
These files are part of the set that tests SDS objects with different datatypes.
Description:
see above
Solution:
Platforms tested:
eirene,arabica,gondolin,paz,hawkwind
Purpose:
Add testing files for h4toh5 converter
Test files for the dimensional scale dataset and unlimited dimension cases
Description:
see above
Solution:
Platforms tested:
eirene,arabica,hawkwind,paz,gondolin
Purpose:
Add several files to test the h4toh5 converter on HDF images with different datatypes
Description:
see above
Solution:
Platforms tested:
eirene,hawkwind,arabica
Purpose:
Add two test files for testing how the h4toh5 converter converts image objects associated with attributes and images.
Description:
see above
Solution:
Platforms tested:
arabica,eirene,hawkwind
Purpose:
Add an HDF4 file to test the converter's conversion of object annotations into HDF5 attributes
Description:
see above
Solution:
Platforms tested:
eirene,hawkwind,arabica
Purpose:
Add more HDF4 test files for the converter
Description:
Run CONVERT on the new test files in this script.
Solution:
Added CONVERT h4file h5file entries to this script.
Platforms tested:
eirene,hawkwind,arabica
"bug fix"
Description:
The h5ls tests sometimes failed because the tool sometimes prints
its name in the help message as "lt-h5ls" while the
expected result has it as "h5ls". This happened because the tool
pulled its name from argv[0].
Solution:
Hardcoded the progname as "h5ls" and blocked out the code that
pulls the program name from argv[0]. The better solution is
to find a way to create the real binary with the original tool
name.
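A minimal sketch of the workaround (the variable and function names are
illustrative, not the tool's actual identifiers):

    #include <string.h>

    static const char *progname = "h5ls";     /* hardcoded tool name */

    static void
    set_progname(const char *argv0)
    {
        (void)argv0;    /* argv[0]-based lookup blocked out for now */
    #if 0
        const char *s = strrchr(argv0, '/');
        progname = s ? s + 1 : argv0;          /* yields "lt-h5ls" */
    #endif
    }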
Platforms tested:
Linux
new feature
Description:
h5tools.c:
Created h5tools_init() and h5tools_close() for initializing and
closing the h5tools library. With this, the rawdatastream
and other internal structures can be initialized properly.
h5tools.h:
Added prototypes for h5tools_init() and h5tools_close().
h5dump.c:
h5ls.c:
Added calls to h5tools_init() and h5tools_close().
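A hedged sketch of what the pair might look like (simplified; the real
h5tools.c keeps more state than this):

    #include <stdio.h>

    FILE      *rawdatastream  = NULL;   /* stream for raw data output */
    static int h5tools_init_g = 0;      /* library-initialized flag   */

    void
    h5tools_init(void)
    {
        if (!h5tools_init_g) {
            if (!rawdatastream)
                rawdatastream = stdout; /* default to stdout */
            h5tools_init_g = 1;
        }
    }

    void
    h5tools_close(void)
    {
        if (h5tools_init_g) {
            if (rawdatastream && rawdatastream != stdout)
                fclose(rawdatastream);
            rawdatastream = NULL;
            h5tools_init_g = 0;
        }
    }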
Platforms tested:
IRIX64 -64 parallel and Linux
* 2000-10-31 Robb Matzke <matzke@llnl.gov> (main)
Added calls to MPI_Init() and MPI_Finalize() for parallel
versions in order to prevent errors about unrecognized
command-line options.
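A sketch of the pattern, assuming the usual HDF5 parallel-build symbol
(the 2000-era configure symbol may have a different name):

    #include <stdio.h>
    #ifdef H5_HAVE_PARALLEL
    #include <mpi.h>
    #endif

    int
    main(int argc, char *argv[])
    {
    #ifdef H5_HAVE_PARALLEL
        /* Let MPI consume its command-line options before the tool
         * parses its own, avoiding "unrecognized option" errors. */
        MPI_Init(&argc, &argv);
    #endif

        /* ... existing tool body ... */
        printf("tool body goes here\n");

    #ifdef H5_HAVE_PARALLEL
        MPI_Finalize();
    #endif
        return 0;
    }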