Purpose:
Bug fix
Description:
On the NERSC SP3, configure failed while trying to figure out
how to print a long long.
Solution:
Added the following line
hdf5_cv_printf_ll=${hdf5_cv_printf_ll='ll'}
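    For reference, the cached value is the printf length modifier used when
    printing a long long; a minimal sketch of what the 'll' value means
    (plain C usage, not HDF5-specific code):

        #include <stdio.h>

        int main(void)
        {
            long long big = 1234567890123LL;
            printf("%lld\n", big);   /* "ll" is the length modifier cached above */
            return 0;
        }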
Platforms tested:
Not tested yet.
Purpose:
Fix Bill's "Major Hack" for NERSC seaborg machine
Description:
Bill's fix was based on the machine name, and the NERSC SP3 "gseaborg"
has been renamed "seaborg".
Solution:
Fixed the name.
Platforms tested:
Not tested yet.
Update docs
Description:
The H5Dread and H5Dwrite descriptions contained some inaccurate information
about how H5S_ALL works as a parameter for the memory and file dataspaces.
Solution:
Updated information to reflect current library behavior for H5S_ALL.
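    For reference, a minimal sketch of the behavior being documented (dset,
    buf, and the native int type are assumptions; buf is assumed to match the
    dataset's extent):

        /* H5S_ALL as the file dataspace selects the dataset's entire extent;
         * H5S_ALL as the memory dataspace tells the library to use the file
         * dataspace's extent and selection for the memory buffer as well. */
        H5Dread(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);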
Code cleanup
Description:
Purify detected an uninitialized memory read in test data.
Solution:
Corrected the parameters used to initialize the data array so that the
entire array is initialized.
Platforms tested:
Solaris 2.7 (arabica)
Code cleanup
Description:
Purify detected some resource leaks in the tests.
Solution:
Released memory and property lists properly.
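    A minimal sketch of the kind of cleanup added (the property list class,
    NELMTS, and the buffer are placeholders, not the actual test code):

        hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);   /* property list used by a test      */
        int  *buf  = malloc(NELMTS * sizeof(int));  /* NELMTS: placeholder element count */

        /* ... body of the test ... */

        H5Pclose(dxpl);                             /* release the property list         */
        free(buf);                                  /* release the memory                */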
Platforms tested:
Solaris 2.7 (arabica)
Bug Fix
Description:
Purify detected an uninitialized memory read in H5Pset_chunk and a memory
leak in H5P_remove.
Solution:
Patched both up.
Platforms tested:
Solaris 2.7 (arabica)
Purpose:
refix tconfig.c
Description:
Following Robb's reminder: H5private.h defines long_long as __int64 on Windows and as long long on other platforms.
Solution:
Just changed vrfy_ctype(long long, ...) into vrfy(long_long, ...) in tconfig.c
and deleted the previous macro.
Platforms tested:
windows 2000, linux
Purpose:
Add new information to release.txt and delete outdated Windows and h4toh5 information.
Description:
1. Add a note mentioning that the release DLL will work after installing Service Pack 5 for VS 6.0.
2. Delete the entries for bugs already fixed on Windows (libc.lib warnings and h4toh5 image handling).
Solution:
Platforms tested:
Purpose:
Bug fix
Description:
Windows doesn't recognize long long; it uses __int64 instead. So add a macro
guard like
#ifdef HAVE____int64 for Windows-like platforms.
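A rough sketch of the guard described above (the macro spelling and the exact
code in H5config.h/H5private.h are assumptions and may differ):

    #ifdef HAVE____int64
    typedef __int64   long_long;    /* Windows-like platforms */
    #else
    typedef long long long_long;    /* everything else        */
    #endif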
Solution:
see above
Platforms tested:
eirene
Purpose:
Fixed bugs
Description:
1. tconfig.c finds that the sizes of long double and off_t are not correct on Windows 2000.
The size of long double (8) in the manually hacked H5config.h on Windows reflects Windows NT 4.0;
on Windows 2000 it is set to 12. H5config.h is now fixed to pass on Windows 2000, but it will fail on NT 4.0. This problem still needs to be addressed.
2. Modified the testhdf5 and testhdf5sll projects to accommodate the new test.
3. Found a release DLL bug (it caused the tattr test to fail); the bug went away after installing VS 6.0 Service Pack 5. We highly suspect it is a compiler bug.
Solution:
See above.
Platforms tested:
windows 2000
Bug fix
Description:
Removed the 'const' modifier from the prototype for H5D_new; the dcpl_id
parameter needs to be non-const.
Platforms tested:
Eyeballed (reported on gondolin)
feature
Description:
Recognize a command line argument of the form '--*' as
a configure command option by default. Since all normal
configure options are of the form '--*', this simplifies
the "op-configure <option>" syntax. The latter syntax is
still kept in case one wants to pass in a configure
argument that does not fit this form.
Platforms tested:
Eirene.
Test bug fix.
Description:
When reading or writing a chunked dataset whose data needed datatype
conversion, and the amount of data was larger than one conversion buffer,
data in the conversion buffer was getting corrupted.
Solution:
Corrected an error in advancing the buffer pointer: it was being advanced
by the number of elements instead of the number of bytes.
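    A hypothetical illustration of the pointer-arithmetic bug (the variable
    names are illustrative, not the library's):

        uint8_t *p = tconv_buf;        /* datatype conversion buffer (illustrative) */

        p += nelmts;                   /* before: advanced by the element count     */
        p += nelmts * elmt_size;       /* after:  advanced by the number of bytes   */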
Platforms tested:
FreeBSD 4.4 (hawkwind)
Document bug fix.
Description:
When reading or writing a chunked dataset whose data needed datatype
conversion, and the amount of data was larger than one conversion buffer,
data in the conversion buffer was getting corrupted.
Solution:
Corrected an error in advancing the buffer pointer: it was being advanced
by the number of elements instead of the number of bytes.
Platforms tested:
FreeBSD 4.4 (hawkwind)
Bug fix/optimization.
Description:
Single, contiguous (in memory) hyperslabs can be transferred in one
I/O operation, but they weren't being detected correctly by the code in
H5S_all_read()/H5S_all_write() and were getting routed into slower I/O
routines (or possibly failing in some circumstances).
Solution:
Wrote code to correctly detect single contiguous hyperslabs in memory and
adjust arrays and buffer pointers describing the memory information so that
the entire hyperslab can be transferred in one operation.
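    A minimal sketch of the detection rule (not the actual H5S_all_read()/
    H5S_all_write() code): in a row-major memory space, a single hyperslab
    block is one contiguous run of memory when the fastest-varying dimensions
    are selected in full, at most one dimension is partially selected, and
    every slower dimension selects a single index.

        #include "hdf5.h"    /* for hsize_t */

        static int is_contiguous(int ndims, const hsize_t dims[], const hsize_t block[])
        {
            int i = ndims - 1;

            while (i >= 0 && block[i] == dims[i])
                i--;                    /* skip fully selected fast dimensions      */
            i--;                        /* one partially selected dimension is fine */
            while (i >= 0) {
                if (block[i] != 1)
                    return 0;           /* a slower dimension selects >1 index      */
                i--;
            }
            return 1;                   /* single contiguous run in memory          */
        }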
Platforms Tested:
FreeBSD 4.4 (hawkwind)
'Bug fix'
Description:
When testing the validity of zlib, the compress() function was used.
HDF5 actually requires a newer version of zlib, which contains
compress2(), and compress2() is tested in a later part of configure.
This caused redundant tests and confusion.
Solution:
Changed zlib test to look for compress2() instead.
Older versions of the HDF4 libraries (e.g. 4.1r2) correctly fail this
test. This eliminates the possibility of using an older version of HDF4
without giving up zlib compression in HDF5. But since we need a newer
version of hdp (with loop detection) anyway, the older HDF4 versions are
no longer a concern.
Remark: the later compress2 test has not been removed. Once this change
is proven to work correctly on all platforms, the extra compress2 test
can be removed and the source code updated accordingly.
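    For reference, a minimal sketch of the kind of program such a configure
    check needs to compile and link (not the autoconf-generated test itself):

        #include <zlib.h>

        int main(void)
        {
            Bytef  out[64];
            uLongf outlen = sizeof(out);

            /* only the presence of the compress2() symbol matters to configure */
            return compress2(out, &outlen, (const Bytef *)"x", 1,
                             Z_DEFAULT_COMPRESSION) == Z_OK ? 0 : 1;
        }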
Platforms tested:
modi4: tested with hdf4.1r2 and failed as expected. Tested with
newer hdf4 libraries and succeeded as expected.
Purpose:
Final Fantasy...er...fix
Description:
I fixed the problem with the summary printing newlines when we didn't
want it to under "ksh".
Solution:
There's an escape, '\c', which should be used at the end of the line
when you can't use the '-n' flag.
Platforms tested:
linux and modi4.
Bug fix.
Problem:
When an entire dataset was selected (through whatever means, H5S_ALL, making
an explicit "all" selection, etc.), the code was not allowing the optimized
routine to read the entire dataset in at once when the current dimensions
did not match the maximum dimensions and instead was defaulting to a [much]
slower method to read in the dataset.
Solution:
Took out check which was requiring current dimensions to be equal to the
maximum dimensions.
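A minimal sketch of the case this affects, using the 1.5-era C API (the file
and dataset names, sizes, and types are illustrative; a dataset needs
chunking to have unlimited maximum dimensions):

    #include "hdf5.h"

    static void read_whole_extendible_dataset(void)
    {
        hsize_t    dims[2]    = {100, 100};
        hsize_t    maxdims[2] = {H5S_UNLIMITED, 100};  /* current != maximum */
        hsize_t    chunk[2]   = {10, 100};
        static int buf[100][100];
        hid_t      file, dcpl, space, dset;

        file  = H5Fcreate("extend.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 2, chunk);
        space = H5Screate_simple(2, dims, maxdims);
        dset  = H5Dcreate(file, "data", H5T_NATIVE_INT, space, dcpl);

        /* with the check removed, an "all" read like this can take the
         * optimized single-operation path even though dims != maxdims */
        H5Dread(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, buf);

        H5Dclose(dset);
        H5Sclose(space);
        H5Pclose(dcpl);
        H5Fclose(file);
    }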
Platforms tested:
FreeBSD 4.4 (hawkwind)
Bug Fix
Description:
When writing (or reading) an entire chunked dataset, there was a
boundary case where the code that generates the description of the
piece of the dataset to read into the data conversion buffer would
attempt to read past the boundary of the dataset. This was occurring
because the code to detect the edge of the dataset was not propagating
the change up through the remaining dimensions when an edge in a
fast-changing dimension was detected.
Solution:
Propagate edge detection up through slower changing dimensions properly.
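    A minimal sketch of the general pattern the fix restores (not the
    library's actual code): when the fastest-changing dimension hits the
    dataset edge, the wrap has to carry into the slower-changing dimensions.

        #include "hdf5.h"    /* for hsize_t */

        static void advance_offset(int ndims, hsize_t offset[], const hsize_t dims[])
        {
            int i;

            for (i = ndims - 1; i >= 0; i--) {  /* start at the fastest dimension  */
                if (++offset[i] < dims[i])
                    break;                      /* no edge reached in this one     */
                offset[i] = 0;                  /* edge reached: wrap, and let the */
            }                                   /* loop carry into dimension i-1   */
        }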
Platforms tested:
Linux 2.2.18smp (eirene)
Fix on Kludge
Description:
Forgot another chunk of parallel I/O code that needed to change for the
generic property list kludge... :-/
Platforms tested:
Parallel Linux 2.2.18smp (eirene)
Purpose:
Bug Fix
Description:
Some platforms don't handle echo -n correctly.
Solution:
Copied some code from configure that determines which flag to give
echo.
Platforms tested:
Linux
Kludge
Description:
Since we're only about halfway through converting the internal use of
property lists from the "old way" to the generic property lists, we turned
off snapshots to avoid exposing lots of API changes to users, until the
APIs settled down.
Getting the snapshots rolling again seems to have become a priority, so
some changes are going to have to be made now that were going to be
postponed until we were completely finished with the conversion. This
requires that the old API functions be able to deal with both the old
and new property lists smoothly.
Solution:
Kludged the property list code together so that it can transparently
handle both the old and new property lists.
Platforms tested:
FreeBSD 4.4 (hawkwind)
New feature.
Description:
Added a test to verify the correctness of the information configure
provides in H5config.h. Some information, such as the SIZEOF of some
types, can be hardcoded by config/<machine>; this test verifies that
the information is indeed correct.
Currently, only the sizes of C language types are verified.
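A minimal sketch of the kind of check the test makes, assuming
autoconf-style SIZEOF_* macros in H5config.h (an assumption; the real
tconfig.c uses its own verification macros):

    #include <stdio.h>
    #include "H5private.h"      /* pulls in H5config.h and defines long_long */

    #define VERIFY_SIZE(type, configured)                               \
        if (sizeof(type) != (size_t)(configured))                       \
            printf("size of %s is %u but H5config.h says %u\n",         \
                   #type, (unsigned)sizeof(type), (unsigned)(configured))

    static void check_sizes(void)
    {
        VERIFY_SIZE(int,       SIZEOF_INT);
        VERIFY_SIZE(long_long, SIZEOF_LONG_LONG);
    }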
Platforms tested:
Eirene, regular, arabica.
Code cleanup for better compatibility with C++ compilers
Description:
C++ compilers are choking on our C code for various reasons:
1. We used our UNUSED macro incorrectly when referring to pointer types.
2. We used various C++ keywords as variable names, etc.
3. We incremented enums with the ++ operator.
Solution:
Changed variable names, etc. to avoid C++ keywords (new, class, typename,
typeid, template).
Fixed usage of the UNUSED macro from this:
    char UNUSED *c
to this:
    char * UNUSED c
Switched the enums from x++ to x = x + 1.
Platforms tested:
FreeBSD 4.4 (hawkwind)
Purpose:
Refix
Description:
Changed
if test -d $1; then
:
else
to
if test ! -d $1; then
since "test ! -d" should work on all platforms and is a much cleaner
solution than the original.
Platforms tested:
Linux
Purpose:
Feature add
Description:
Changed the "make install" thingy to "make install-all" in the
quick-setup guide. Also, changed the version number of HDF5 in the
examples from 1.4.0 and 1.2.0 to 1.5.x
Purpose:
Feature Add
Description:
Added "install-example" and "install-all" to the Makefile system.
The behaviour of the "make install*" options:
    make install            - Installs binaries, libraries, include
                              files, and example programs.
    make install-examples   - Installs only the example programs.
                              The directories are:
                              ${prefix}/doc/hdf5/examples/{c,c++,fortran}
    make install-all        - Installs the binaries, libraries, include
                              files, example programs, and documentation.
                              The whole kit-n'-caboodle.
    make uninstall-examples - Gets rid of those example files (but not
                              the ${prefix}/doc/hdf5/examples/...
                              directories).
There's a new bin/ program called "mkdirs" which helps create deeply
nested directories. It's a simple shell script.
Platforms tested:
Linux
Purpose:
fix a bug
Description:
In precondition 3, WinZip will unzip hdf5xxx.zip into the directory
hdf5xxx, and users should rename hdf5xxx to hdf5 in order to build the
HDF5 library correctly.
Solution:
Corrected the sentence in precondition 3.
Platforms tested:
windows 2000
Purpose:
Bug Fix
Description:
The error codes checked for were hardcoded into the program.
Solution:
Used the "enum" names instead.
Platforms tested:
Linux