Bug fix
Description:
The output of floating-point dumps wasn't necessarily standard;
the h5ls utility does it in a better way.
Solution:
Changed the output format from %g to %1.*g and supplied the
appropriate FLT_DIG/DBL_DIG value for the `*' precision (see the
sketch after this entry).
Platforms tested:
Linux
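
A minimal sketch of the format change described above; the values and
variable names are illustrative, not the actual dumper code.

    #include <float.h>
    #include <stdio.h>

    int main(void)
    {
        float  f = 1.0F / 3.0F;
        double d = 1.0 / 3.0;

        /* Old form: printf("%g", ...) shows only six significant digits. */
        /* New form: pass FLT_DIG/DBL_DIG as the precision for the `*'.   */
        printf("%1.*g\n", FLT_DIG, (double)f);
        printf("%1.*g\n", DBL_DIG, d);
        return 0;
    }
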
Bug fix
Description:
I was writing data out to the HDF file in big-endian format
without doing any conversion, which caused tests to fail on most
machines.
Solution:
Removed the forced big-endian output. This, however, led to a
bug in the HDF dumper, for which Albert and Robb suggested ways
of fixing.
Platforms tested:
Linux
Patch
Description:
h5ls core dumped on TFLOPS because its call to ioctl(TIOCGWINSZ)
crashed inside the ioctl() routine.
Solution:
It is likely that TFLOPS does not support the window-size query
on the compute nodes. Still, it should not core dump. Bypass the
call on TFLOPS for now (see the sketch after this entry).
Platforms tested:
TFLOPS & modi4 (-64)
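
A minimal sketch of the kind of bypass described; the TFLOPS guard
macro here is hypothetical, and the real code likely uses a
configure-time symbol instead.

    #include <unistd.h>
    #ifndef TFLOPS                /* hypothetical guard for the TFLOPS build */
    #include <sys/ioctl.h>
    #endif

    /* Return the terminal width, defaulting to 80 columns where the
     * TIOCGWINSZ ioctl is unavailable or unsafe (e.g. TFLOPS compute nodes). */
    static int terminal_width(void)
    {
        int width = 80;
    #if defined(TIOCGWINSZ) && !defined(TFLOPS)
        struct winsize w;
        if (ioctl(STDOUT_FILENO, TIOCGWINSZ, &w) == 0 && w.ws_col > 0)
            width = w.ws_col;
    #endif
        return width;
    }
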
Bug fix
Description:
Most SGIs failed when dumping VL data during the daily tests;
they seem to initialize variables differently, exposing a bug in
dumping datasets with scalar dataspaces.
Also cleaned up code to get rid of compiler warnings.
Solution:
Initialize the variable correctly.
Platforms tested:
SGI IRIX 6.5 (paz)
Bug fix.
Description:
VL datatype dumping was not working correctly on most machines because
the "native" version of the variable-length type wasn't being generated
for the printing process.
Re-enabled the VL dumping test.
Solution:
Generate a "native" version of the VL datatype to read in for printing
(see the sketch after this entry); also some code cleanup in the VL
dumping algorithm.
Platforms tested:
FreeBSD 4.1.1 (hawkwind) & Solaris 2.6 (baldric)
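
A minimal sketch of the idea, assuming a simple dataset of
variable-length ints; the function name and base type are illustrative.

    #include <stdlib.h>
    #include "hdf5.h"

    /* Read a VL dataset through an in-memory ("native") VL type so the
     * elements can be printed portably. */
    static herr_t read_vl_for_print(hid_t dset)
    {
        hid_t    space   = H5Dget_space(dset);
        hid_t    memtype = H5Tvlen_create(H5T_NATIVE_INT); /* native base type */
        hssize_t npoints = H5Sget_simple_extent_npoints(space);
        hvl_t   *buf     = malloc((size_t)npoints * sizeof(hvl_t));
        herr_t   status  = H5Dread(dset, memtype, H5S_ALL, H5S_ALL,
                                   H5P_DEFAULT, buf);

        /* ... print buf[i].len values starting at buf[i].p ... */

        H5Dvlen_reclaim(memtype, space, H5P_DEFAULT, buf); /* free VL storage */
        free(buf);
        H5Tclose(memtype);
        H5Sclose(space);
        return status;
    }
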
Feature
Description:
Variable-length data dumping for simple cases (i.e., either SCALAR or
1-D array variable-length datatypes) should work. Added these cases
to the tests...
Platforms tested:
Linux
Updated for new array datatypes.
Description:
I missed these tools earlier when I wasn't compiling with HDF4...
Solution:
Updated them (correctly, I hope) to use the new array datatype instead of
compound-datatype array fields (see the sketch after this entry).
Platforms tested:
FreeBSD 4.1.1 (hawkwind)
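
A minimal sketch of the new form, using the current name of the
array-type call (H5Tarray_create2); the record layout and field names
are illustrative.

    #include "hdf5.h"

    typedef struct {
        float matrix[2][3];
        int   count;
    } rec_t;   /* illustrative record layout */

    /* Build a compound type whose "matrix" member is a true array datatype
     * rather than the old compound "array field" form. */
    static hid_t make_record_type(void)
    {
        hsize_t dims[2] = {2, 3};
        hid_t   arr = H5Tarray_create2(H5T_NATIVE_FLOAT, 2, dims);
        hid_t   cmp = H5Tcreate(H5T_COMPOUND, sizeof(rec_t));

        H5Tinsert(cmp, "matrix", HOFFSET(rec_t, matrix), arr);
        H5Tinsert(cmp, "count",  HOFFSET(rec_t, count),  H5T_NATIVE_INT);
        H5Tclose(arr);
        return cmp;
    }
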
Recoding of VL dumping.
Description:
I'm using hyperslabs to select the variable-length data (see the
sketch after this entry). I don't have any tests checked in just
yet; I'll create those later.
Platforms tested:
Linux
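
A minimal sketch of the hyperslab approach, assuming a 1-D dataset of
variable-length ints; the function name and base type are illustrative.

    #include "hdf5.h"

    /* Dump a VL dataset one element at a time by selecting a one-element
     * hyperslab in the file dataspace for each read. */
    static void dump_vl_by_hyperslab(hid_t dset)
    {
        hid_t    fspace  = H5Dget_space(dset);
        hid_t    memtype = H5Tvlen_create(H5T_NATIVE_INT);
        hsize_t  one     = 1;
        hid_t    mspace  = H5Screate_simple(1, &one, NULL);
        hssize_t npoints = H5Sget_simple_extent_npoints(fspace);
        hssize_t i;

        for (i = 0; i < npoints; i++) {
            hsize_t start = (hsize_t)i;
            hvl_t   elem;

            H5Sselect_hyperslab(fspace, H5S_SELECT_SET, &start, NULL, &one, NULL);
            H5Dread(dset, memtype, mspace, fspace, H5P_DEFAULT, &elem);
            /* ... print elem.len values starting at elem.p ... */
            H5Dvlen_reclaim(memtype, mspace, H5P_DEFAULT, &elem);
        }
        H5Sclose(mspace);
        H5Sclose(fspace);
        H5Tclose(memtype);
    }
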
Purpose:
Check in the beta release of the h4toh5 converter
Description:
1. Added copyright and other comments to all .h and .c files.
2. Fixed bugs in translating SDS unlimited dimensions, HDF5 dimension scale attributes,
and vdata from HDF4 to HDF5.
Solution:
2.
1) For an SDS with an unlimited dimension to be converted into an extendible HDF5 dataset,
a default chunk size has to be set on the HDF5 side even though the corresponding HDF4 file
is not chunked (see the sketch after this entry).
2) In this version, if an SDS object doesn't have dimension scale data, we will not show
the default HDF4 dimension name ("fakedim0", etc.) in the new HDF5 dimension scale
name attribute.
3) Fixed a bug in transferring vdata whose fields include a character array, so that it is
now correctly transferred into the corresponding HDF5 compound datatype.
Platforms tested:
eirene,arabica,baldric,hawkwind,paz,gondolin on new set of hdf4 test files.
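
A minimal sketch of point 1), using the current-API create call
(H5Dcreate2); the dataset name, datatype, and sizes are illustrative.

    #include "hdf5.h"

    /* An extendible dataset must use chunked layout, so a default chunk
     * size is chosen even when the source HDF4 SDS was not chunked. */
    static hid_t create_extendible_dataset(hid_t file)
    {
        hsize_t dims[2]    = {0, 100};              /* current extent     */
        hsize_t maxdims[2] = {H5S_UNLIMITED, 100};  /* dim 0 is unlimited */
        hsize_t chunk[2]   = {64, 100};             /* default chunk size */
        hid_t   space = H5Screate_simple(2, dims, maxdims);
        hid_t   dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        hid_t   dset;

        H5Pset_chunk(dcpl, 2, chunk);
        dset = H5Dcreate2(file, "/sds", H5T_NATIVE_INT, space,
                          H5P_DEFAULT, dcpl, H5P_DEFAULT);
        H5Pclose(dcpl);
        H5Sclose(space);
        return dset;
    }
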
Purpose:
Adding more expected files for testing the h4toh5 converter.
These test files are HDF5 files that are expected to result from converting HDF4 files
for various vdata and vgroup cases.
Description:
Solution:
Platforms tested:
eirene,arabica,hawkwind,paz
Purpose:
Add test files for the h4toh5 converter:
two more files for testing SDS objects with the native float datatype.
Description:
Solution:
Platforms tested:
arabica,eirene,hawkwind,paz
Purpose:
Adding expected files for the h4toh5 converter:
two test files for annotations.
Description:
Solution:
Platforms tested:
arabica,eirene,paz,gondolin,hawkwind
Purpose:
Adding test files for the h4toh5 converter:
HDF4 test files for various vgroup cases (including hard links, loops, name clashing, etc.).
Description:
Solution:
Platforms tested:
arabica,eirene,paz,gondolin,hawkwind
Purpose:
Adding test files for the h4toh5 converter:
more test files for SDS objects in various datatypes.
Description:
see above
Solution:
Platforms tested:
eirene,arabica,gondolin,paz,hawkwind
Purpose:
Adding test files for the h4toh5 converter:
these files are part of the set that tests SDS objects with different datatypes.
Description:
see above
Solution:
Platforms tested:
eirene,arabica,gondolin,paz,hawkwind
Purpose:
Add test files for the h4toh5 converter:
files for testing the dimension scale dataset and the unlimited dimension case.
Description:
see above
Solution:
Platforms tested:
eirene,arabica,hawkwind,paz,gondolin
Purpose:
Add several files to test the h4toh5 converter on HDF images with different datatypes.
Description:
see above
Solution:
Platforms tested:
eirene,hawkwind,arabica
Purpose:
Add two test files for testing how the h4toh5 converter converts image objects with associated attributes and images.
Description:
see above
Solution:
Platforms tested:
arabica,eirene,hawkwind
Purpose:
Add an HDF4 file to test the converter's conversion of object annotations into HDF5 attributes.
Description:
see above
Solution:
Platforms tested:
eirene,hawkwind,arabica
Purpose:
Add more HDF4 test files for the converter.
Description:
Run CONVERT on the new test files in this script.
Solution:
Added `CONVERT h4file h5file' lines to this script.
Platforms tested:
eirene,hawkwind,arabica
"bug fix"
Description:
The h5ls tests sometimes failed because the tool sometimes printed
its name in the help message as "lt-h5ls" while the expected
output has it as "h5ls". This happened because the tool pulled its
name from argv[0].
Solution:
Hardcoded the progname as "h5ls" and blocked out the code that
pulls the program name from argv[0] (see the sketch after this
entry). A better solution would be to find a way to create the
real binary with the original tool name.
Platforms tested:
linux
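
A minimal sketch of the workaround; the surrounding code is
illustrative, not the actual h5ls source.

    #include <stdio.h>

    static const char *progname = "h5ls";   /* hardcoded program name */

    static void usage(void)
    {
        /* Previously progname was derived from argv[0], which the libtool
         * wrapper rewrites to "lt-h5ls" and breaks output comparisons. */
        fprintf(stderr, "usage: %s [OPTIONS] [OBJECTS...]\n", progname);
    }
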
new feature
Description:
h5tools.c:
Created h5tools_init() and h5tools_close() for initializing and
closing the h5tools library. With these, the rawdatastream and
other internal structures can be initialized properly.
h5tools.h:
added prototypes for h5tools_init and h5tools_close.
h5dump.c:
h5ls.c:
Added the calls to h5tools_init() and h5tools_close() (see the sketch after this entry).
Platforms tested:
IRIX64 -64 parallel and Linux
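
A minimal sketch of the intended call pattern in the tools; the body
in between is illustrative.

    #include "h5tools.h"

    int main(int argc, char *argv[])
    {
        h5tools_init();                  /* set up rawdatastream, etc.        */
        /* ... parse options, open files, dump objects ... */
        h5tools_close();                 /* flush and release library state   */
        return 0;
    }
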
* 2000-10-31 Robb Matzke <matzke@llnl.gov> (main)
Added calls to MPI_Init() and MPI_Finalize() for parallel
versions in order to prevent errors about unrecognized
command-line options (see the sketch below).
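
A minimal sketch of the guard; H5_HAVE_PARALLEL is the HDF5 configure
macro for parallel builds, and the rest is illustrative.

    #ifdef H5_HAVE_PARALLEL
    #include <mpi.h>
    #endif

    int main(int argc, char *argv[])
    {
    #ifdef H5_HAVE_PARALLEL
        /* Let MPI consume its own command-line options before the tool
         * parses the rest, avoiding "unrecognized option" errors. */
        MPI_Init(&argc, &argv);
    #endif
        /* ... normal tool processing ... */
    #ifdef H5_HAVE_PARALLEL
        MPI_Finalize();
    #endif
        return 0;
    }
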
* 2000-10-31 Robb Matzke <matzke@llnl.gov> (dump_dataset_values)
The `float' and `double' values are displayed with full
precision instead of just the default `%g'.
Purpose:
Testing
Description:
The h5ls test script only looked at exit status.
Solution:
Created expected output files and compare actual output
with expected output.
Platforms tested:
i686-pc-linux
* 2000-10-31 Robb Matzke <matzke@llnl.gov> (verbose)
Compares output to expected
files. This should work just fine because we're not using the `-v'
option which prints datatypes in a machine-dependent way.
* 2000-10-31 Robb Matzke <matzke@llnl.gov> (h5dump_sprint)
The whitespace added for indentation after the line-feed kludge
is now emitted only if a line-feed was actually inserted. This
fixes funny-looking h5ls output that had ` %s' sequences
appearing in nested compound datatypes.
Also added a prominent warning in the code to indicate that when a
line-feed is inserted into the string that column number
calculations will be incorrect and object indices will be missing.
Bug Fix
Description:
People need to type in the full path to the
attribute/dataset/etc., but the "usage" statement wasn't telling
them to do so.
Solution:
Added an example and changed <names> to <path> to be more
explicit...