bug fix.
Description:
The previous -D__GNUC__ patch was causing failures with the newer
compiler, and the failure it originally addressed can no longer be
reproduced, so the patch has been removed.
Platforms tested:
Tested only on TG-NCSA, since the change affects only the ia64 platform.
Misc. update:
Bug fix
Description:
Configure uses the value of $ARCH as a gcc option, but the Linux clusters
at NCSA define $ARCH as an environment variable with values that are
not valid compiler options. That caused configure to fail because it
was not able to compile at all.
Solution:
Changed ARCH to the lowercase $arch (convention dictates that environment
variables are uppercase). Also preset $arch to null and do not honor
any passed-in values.
Platforms tested:
Attempted to run h5committest, but sol failed because /tmp was full.
Copper and verbena passed. Also passed on TG-NCSA.
Misc. update:
bug fix, new feature
Description:
Fixed a bug in the parse function for the case where a name has already
been inserted but a new name appears as well, for example:
-f dset1:GZIP=1 -l dset1,dset2:CHUNK=20x20
Here dset1 is already inserted, but dset2 must also be inserted (it was not); see the sketch below.
Added a CHECK_SZIP symbol to enable/disable checking of library-related SZIP parameters.
Added printing of the filter name in verbose mode (confirms visually that the filter was applied).
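The corrected insertion logic amounts to something like the following sketch (the table type and function names are illustrative, not the actual h5repack internals):

    #include <string.h>

    /* hypothetical name table entry */
    typedef struct { char name[256]; } obj_t;

    /* return 1 if NAME is already in the table of NOBJS entries */
    static int in_table(const obj_t *table, int nobjs, const char *name)
    {
        int i;
        for (i = 0; i < nobjs; i++)
            if (strcmp(table[i].name, name) == 0)
                return 1;
        return 0;
    }

    /* every object named by the new option must end up in the table,
     * whether or not an earlier option already inserted it */
    static void add_names(obj_t *table, int *nobjs, char *const *names, int n)
    {
        int i;
        for (i = 0; i < n; i++)
            if (!in_table(table, *nobjs, names[i]))
                strcpy(table[(*nobjs)++].name, names[i]);
    }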
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
Purpose:
Bug fix
Description:
Replaced "unsigned long long" with hsize_t in H5MF
Added "return 0" at end of reserved.c test
Platforms tested:
arabica, sleipnir
Purpose: New feature
Description: New APIs H5Tencode and H5Tdecode. Given an object ID, H5Tencode encodes the object information into a binary form. H5Tdecode decodes object information in binary form, reconstructs the object, and returns a new object ID.
Solution: Use the object header functions H5O_dtype_decode and H5O_dtype_encode to
facilitate them. The encoded binary form is exactly like the object header information.
This is the first-step check-in; H5Sencode and H5Sdecode will be checked in later.
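A minimal usage sketch of the two new calls (the query-then-fill buffer handling shown here is one way to drive them):

    #include <stdlib.h>
    #include "hdf5.h"

    int main(void)
    {
        hid_t  dtype = H5Tcopy(H5T_NATIVE_INT);   /* datatype to serialize */
        hid_t  decoded;
        size_t size = 0;
        void  *buf;

        H5Tencode(dtype, NULL, &size);            /* first call: query the encoded size */
        buf = malloc(size);
        H5Tencode(dtype, buf, &size);             /* second call: fill the buffer */

        decoded = H5Tdecode(buf);                 /* reconstruct the datatype; a new ID is returned */

        H5Tclose(decoded);
        H5Tclose(dtype);
        free(buf);
        return 0;
    }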
Platforms tested: h5committested and fuss.
Misc. update: will update release.txt after the second-step check-in.
Purpose:
Bug Fix
Description:
If an HDF5 file grows larger than its address space, the library dies and is unable to
write any data. This is more likely to happen now that users are able to change
the number of bytes used to store addresses in the file.
Solution:
HDF5 now throws an error instead of dying. In addition, it "reserves" address
space for the local heap and for object headers (which do not allocate space
immediately). This ensures that after the error occurs, there is enough address
space left to flush the entire file to disk, so no data is lost.
A more complete explanation is at /doc/html/TechNotes/ReservedFileSpace.html
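The reservation check amounts to something like this sketch (function and variable names are hypothetical; the real logic lives in the file-memory management code):

    #include "hdf5.h"

    /* Fail, rather than die, when the requested block plus the address
     * space reserved for heaps and object headers would pass the end of
     * the file's address space.  Written to avoid unsigned overflow. */
    static int can_allocate(haddr_t eoa, hsize_t size,
                            hsize_t reserved, haddr_t max_addr)
    {
        if (max_addr - eoa < size + reserved)
            return 0;   /* caller reports an error; the file can still be flushed */
        return 1;
    }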
Platforms tested:
sleipnir, copper (parallel), verbena, arabica, Windows (Visual Studio 7)
Bug fix
Description:
Correct problems with "resurrecting" a dataset in a file. (This occurs
when a dataset which is open gets unlinked from the group hierarchy (making it
"dead" and marked for deletion in the file) and then is re-linked to the group
hierarchy.) Note that the current solution applies only to datasets; further
work will fix this for groups and named datatypes also.
Also, fix the "debug" routines to be a little more helpful in certain
situations.
Additionally, fix a locking bug in the symbol table node splitting routine
which could be [one of] the cause[s] of the file corruption in flexible
parallel operation.
Platforms tested:
FreeBSD 4.10 (sleipnir) w/parallel
h5committested
h5repack changes
Description:
There were some requests to change some minor h5repack features.
h5repack only printed a warning about an unavailable filter in verbose mode (-v);
without -v it kept silent, and users sometimes missed this warning.
The request was that it should always print this warning, so the new format is, e.g.:
./h5repack -i test_szip.h5 -o out.h5
Warning: dataset </dset_szip> cannot be read, SZIP filter is not available
Because of this, and to avoid a lot of these messages in the shell test script, I modified
the script h5repack.sh so that it detects the presence of all filters in the environment
(previously it only detected SZIP).
The test files were also split into more files, to make the script code easier to follow.
Solution:
Platforms tested:
linux
AIX (no szip)
solaris (no szip, no gzip )
Misc. update:
Bug fix
Description:
Fix error in chunked dataset I/O where data written out wasn't read
correctly from a chunked, extendible dataset after the dataset was extended.
Also, fix parallel I/O tests to gather error results from all processes,
in order to detect errors that only occur on one process.
Solution:
Bypass chunk cache for reads as well as writes, if parallel I/O driver is
used and file is opened for writing.
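A minimal serial sketch of the access pattern that was failing (file, dataset, and size values are illustrative):

    #include "hdf5.h"

    int main(void)
    {
        hsize_t dims[1] = {10}, maxdims[1] = {H5S_UNLIMITED};
        hsize_t chunk[1] = {5}, newdims[1] = {20};
        int     wbuf[20], rbuf[20], i;
        hid_t   file, space, dcpl, dset;

        for (i = 0; i < 20; i++)
            wbuf[i] = i;

        file  = H5Fcreate("extend.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        space = H5Screate_simple(1, dims, maxdims);
        dcpl  = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, chunk);
        dset  = H5Dcreate(file, "dset", H5T_NATIVE_INT, space, dcpl);

        H5Dextend(dset, newdims);                 /* grow the chunked, extendible dataset */
        H5Dwrite(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, wbuf);
        H5Dread(dset, H5T_NATIVE_INT, H5S_ALL, H5S_ALL, H5P_DEFAULT, rbuf);
        /* before the fix, the data read back after the extend could be
         * stale when the parallel I/O driver was in use */

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return 0;
    }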
Platforms tested:
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Code optimization
Description:
Re-work the insertion of a new child into an existing node, to exploit
some speedups for adding the rightmost child, since this is a very common case
when appending records to an unlimited size dataset.
Also, hoist the checks for the tree's 'K' value into a field in the shared
information about the tree, instead of re-calculating them all the time.
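Schematically, the rightmost-child shortcut looks like this hypothetical fragment (the real B-tree code is considerably more involved):

    /* hypothetical node layout -- not the real H5B structures */
    typedef struct {
        unsigned nchildren;
        unsigned keys[64];
    } node_t;

    /* Appending past the current rightmost key is the common case when
     * records are added to an unlimited-size dataset, so test that edge
     * before doing any searching. */
    static unsigned find_insert_slot(const node_t *node, unsigned key)
    {
        unsigned i;

        if (node->nchildren == 0 || key >= node->keys[node->nchildren - 1])
            return node->nchildren;              /* fast path: append at the right edge */

        for (i = 0; i < node->nchildren; i++)    /* general path (linear here for brevity) */
            if (key < node->keys[i])
                break;
        return i;
    }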
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Code optimization
Description:
Avoid calling vector comparison routine when operating on 1-D chunks.
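Roughly, the shortcut is the following hypothetical fragment (the general routine below stands in for the library's vector comparison):

    #include "hdf5.h"

    /* stand-in for the general vector comparison routine */
    static int vector_cmp(unsigned ndims, const hsize_t *a, const hsize_t *b)
    {
        unsigned i;
        for (i = 0; i < ndims; i++) {
            if (a[i] < b[i]) return -1;
            if (a[i] > b[i]) return 1;
        }
        return 0;
    }

    /* for 1-D chunks a scalar compare is enough */
    static int cmp_offsets(unsigned ndims, const hsize_t *a, const hsize_t *b)
    {
        if (ndims == 1)
            return (a[0] < b[0]) ? -1 : ((a[0] > b[0]) ? 1 : 0);
        return vector_cmp(ndims, a, b);
    }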
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Update.
Description:
Update new files into Windows workspace.
Solution:
Add H5RC.c and H5RCprivate.h to hdf5 and hdf5dll projects in Windows workspace.
Platforms tested:
MS Visual C++ 6.0 on Windows 2000.
(will test on Windows XP with Visual C++ 6.0 and .NET after this check-in)
Misc. update:
Code optimization
Description:
Refactor B-tree code to extract all common information for a B-tree into a
shared structure that is pointed to by all the nodes in the tree (instead of being
included in each node), as sketched below.
Also re-order B-tree node comparison checks for chunked datasets to
check for >= the upper node first, since the comparison is a bit "heavy" and
this check is more likely to succeed when you are adding records to the
dataset.
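Schematically (the types and field names below are hypothetical, not the real headers):

    #include <stddef.h>

    /* information that is identical for every node of one B-tree */
    typedef struct shared_info_t {
        size_t   sizeof_rkey;     /* raw key size on disk */
        size_t   sizeof_rnode;    /* raw node size on disk */
        unsigned two_k;           /* 2*K, stored once instead of recomputed per node */
    } shared_info_t;

    /* each node now just points at the shared record */
    typedef struct btree_node_t {
        shared_info_t *shared;    /* common information, stored once per tree */
        unsigned       nchildren;
        /* ... per-node keys and child addresses ... */
    } btree_node_t;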
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
(also, recent h5dump commits have broken testing...)
h5dump new tests
Description:
Added new tests for printing array indices (nested objects, several ranks).
Solution:
Platforms tested:
linux
AIX
solaris
Misc. update:
Description:
Changed the call to H5File::getFileSize according to the C library and
removed the CHECK for this call because failure will be handled by
an exception.
Platforms tested:
FreeBSD 4.10 (sleipnir)
Linux 2.4 (eirene)
Description:
Added function headers with doxygen.
Changed H5File::getFileSize according to C library.
Platforms tested:
Linux 2.4 (eirene)
FreeBSD 4.10 (sleipnir)
Misc. update:
Code cleanup & small optimization
Description:
Eliminate redundant recomputation of native key pointer offsets.
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
too minor to require h5committest
Bug fix
Description:
The "shared" raw B-tree node can get freed before all the B-tree nodes
had been flushed out to disk and released by the cache.
Solution:
Implement a simple reference counting wrapper for objects in the library
and use it to hold the shared raw B-tree nodes so they aren't freed before all
references to them in memory are released.
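The wrapper is essentially a counted pointer; a minimal sketch (not the actual H5RC interface, and with error checking omitted):

    #include <stdlib.h>

    typedef struct rc_t {
        void    *object;   /* the shared object, e.g. a raw B-tree node buffer */
        unsigned n;        /* number of live references */
    } rc_t;

    static rc_t *rc_create(void *object)
    {
        rc_t *rc = malloc(sizeof(rc_t));
        rc->object = object;
        rc->n = 1;
        return rc;
    }

    static void rc_incr(rc_t *rc) { rc->n++; }

    static void rc_decr(rc_t *rc)
    {
        if (--rc->n == 0) {        /* last reference gone: now it is safe to free */
            free(rc->object);
            free(rc);
        }
    }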
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir)
IRIX64 6.5 (modei4)
bug fix
Description:
When printing array indices, the calculation of the current column was not done correctly.
Solution:
Platforms tested:
linux
AIX
solaris
Misc. update:
Description:
H5IdComponent.cpp: initialized a pointer to NULL.
H5Object.cpp: removed functions that had been added by mistake.
Updated function headers for the rest.
Platforms tested:
SunOS 5.7 (arabica)
Linux 2.4 (eirene)
Purpose: Maintenance
Description: Added h5fget_name_f and h5fget_filesize_f subroutines and tests.
Solution: N/A
Platforms tested: arabica (32-bit), sol (64-bit)
The parallel build on copper failed for the C library with the
following error:
ld: 0711-317 ERROR: Undefined symbol: .H5FD_stdio_term
Since this change doesn't affect the C library, I am checking it in
and will retest the fresh CVS copy after this check-in.
Misc. update:
Update.
Description:
Added cache.c to the Windows tests. Updated H5Tinit.c.
Solution:
1. Added two new projects, cache and cachedll, to the Windows workspace. These two projects include
the new source file cache.c.
2. Updated H5Tinit.c.
Platforms tested:
MS Visual C++ 6.0 and .NET in Windows XP.
Misc. update:
h5dump new tests
Description:
Added more tests for the escape/no-escape feature for string data (with vlen, with
compound, and with char data).
Solution:
Platforms tested:
linux
solaris
AIX
Misc. update:
Code optimization
Description:
Since the raw B-tree nodes are the same size and only used when reading in
or writing out a B-tree node, move the raw B-tree node buffer from being per node
to a single buffer that is shared among all B-tree nodes of a particular tree,
freeing up a lot of space and eliminating many memory copies, etc.
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Description:
Removed macro H5_FILES from Makefile.in so that output files from the
example programs will not be installed.
Platforms tested:
Linux 2.4 (eirene)
SunOS 5.7 (arabica)
Purpose: Potential bug fix
Description: In H5Fget_filesize, the file size was returned as haddr_t. Changed it to hsize_t
and return it through a parameter to make the Fortran interface easier.
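Usage after the change (the file name is illustrative):

    #include <stdio.h>
    #include "hdf5.h"

    int main(void)
    {
        hid_t   file = H5Fopen("example.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
        hsize_t size;

        /* the size now comes back through an hsize_t output parameter */
        if (H5Fget_filesize(file, &size) >= 0)
            printf("file size: %llu bytes\n", (unsigned long long)size);

        H5Fclose(file);
        return 0;
    }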
Platforms tested: fuss(simple change).
Bug Fix.
Description:
The nh5zget_filter_info_c function was not declared as H5_FCDLL, which is
required for the Fortran DLL on Windows. Without H5_FCDLL, _H5ZGET_FILTER_INFO_C
is treated as an unresolved external symbol by the Fortran compiler on
Windows.
Solution:
Added H5_FCDLL for nh5zget_filter_info_c function.
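The declaration now carries the export macro, roughly as below (int_f and H5_FCDLL come from the HDF5 Fortran interface headers; the argument list shown is an assumption):

    /* before: no DLL export/import decoration, so the symbol is unresolved on Windows */
    int_f nh5zget_filter_info_c(int_f *filter, int_f *flag);

    /* after */
    H5_FCDLL int_f nh5zget_filter_info_c(int_f *filter, int_f *flag);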
Platforms tested:
DEC Fortran 6.0 in Windows XP.
Misc. update:
Description:
These data files are generated by the example programs and shouldn't
need to be in the CVS. Removed them.
Platforms tested:
SunOS 5.7 (arabica)
Linux 2.4 (eirene)
h5dump new tests
Description:
Added new tests for the -p option, superblock, file contents, fill values, and array indices.
Solution:
Platforms tested:
linux
AIX
solaris
Misc. update:
Update projects in Windows workspace.
Description:
John added three files under hdf5/src. Updated the Windows workspace with these files.
Solution:
1. Added H5C.c to the source folders of hdf5 and hdf5dll projects.
2. Added H5Cprivate.h and H5Cpublic.h to the header folders of the hdf5 and hdf5dll projects.
Platforms tested:
Microsoft Visual C++ 6.0 and DEC Fortran 6.0 in Windows XP and Windows 2000.
Misc. update:
Code optimization
Description:
Don't copy layout information, just point to existing information.
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Code optimization & bug fix
Description:
Speed up "fast comparison" lookups in trees by a factor of 2-3x
Correctly handle "fast comparisons" for unsigned values (esp. hsize_t).
Solution:
Mostly removing if statements and redundant assigns, etc.
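The unsigned case matters because the usual "subtract and look at the sign" trick wraps around for hsize_t; a hedged sketch of the safe form:

    #include "hdf5.h"

    /* returning (int)(a - b) is wrong for unsigned types: when a < b the
     * subtraction wraps to a huge positive value, so compare explicitly */
    static int cmp_hsize(hsize_t a, hsize_t b)
    {
        if (a < b) return -1;
        if (a > b) return  1;
        return 0;
    }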
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Code optimization
Description:
Set up datatype ID for dataset's datatype on disk. This allows us to avoid
repeatedly copying the datatype when an ID is needed.
Also, clean up a few warnings in various other places.
Platforms tested:
Solaris 2.7 (arabica)
FreeBSD 4.10 (sleipnir) w/parallel
Too minor to require h5committest
Code cleanup
Description:
Fix problems when compiling with C++ compiler.
Also clean up some warnings with gcc 3.4.x
Platforms tested:
FreeBSD 4.10 (sleipnir)
Too minor to require h5committest
Purpose: Maintenance
Description: H5_SZIP_CHIP_OPTION_MASK was deleted from the list
of the available parameters for the H5Pset_szip function.
Solution: Updated Fortran source, tests and documentation
to reflect this change.
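At the C level the remaining usage looks like this sketch (chunk size and pixels-per-block are illustrative):

    #include "hdf5.h"

    int main(void)
    {
        hsize_t chunk[2] = {32, 32};
        hid_t   dcpl = H5Pcreate(H5P_DATASET_CREATE);

        H5Pset_chunk(dcpl, 2, chunk);
        /* H5_SZIP_CHIP_OPTION_MASK is no longer an accepted option;
         * the EC and NN coding masks remain valid */
        H5Pset_szip(dcpl, H5_SZIP_NN_OPTION_MASK, 32);

        H5Pclose(dcpl);
        return 0;
    }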
Platforms tested: arabica (too small for h5committest)
Misc. update:
Purpose:
To more carefully describe the
-- behavior of H5Pset_external
-- appropriate usage of H5Pset_shuffle
Description:
H5Pset_external
Added notes that the first H5Pset_external call sets the dataset as EXTERNAL
and identifies the first file in the series of external files that
will hold the dataset; subsequent calls identify additional files;
all external files must be declared before the dataset is created;
and the library will create files that don't yet exist on the system
at the time that H5Dwrite is called to write data to that file. (See the
sketch after these notes.)
H5Pset_shuffle
Added notes regarding usage of H5Pset_shuffle in concert with a
compression filter.
And, as always, a few copy edits.
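A C sketch of both notes (file names, sizes, and the compression level are illustrative):

    #include "hdf5.h"

    int main(void)
    {
        hsize_t dims[1]  = {1000};          /* 1000 ints (4000 bytes with 4-byte int) */
        hsize_t chunk[1] = {100};
        hid_t   file, space, dcpl, dset;

        file  = H5Fcreate("main.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
        space = H5Screate_simple(1, dims, NULL);

        /* H5Pset_external: the first call marks the dataset as EXTERNAL and names
         * the first external file; later calls add more files; every external file
         * must be declared before the dataset is created */
        dcpl = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_external(dcpl, "raw0.dat", 0, 2000);
        H5Pset_external(dcpl, "raw1.dat", 0, 2000);
        dset = H5Dcreate(file, "ext_dset", H5T_NATIVE_INT, space, dcpl);
        H5Dclose(dset);
        H5Pclose(dcpl);

        /* H5Pset_shuffle: only useful together with a compression filter,
         * and it must be added to the pipeline before that filter */
        dcpl = H5Pcreate(H5P_DATASET_CREATE);
        H5Pset_chunk(dcpl, 1, chunk);
        H5Pset_shuffle(dcpl);
        H5Pset_deflate(dcpl, 6);
        dset = H5Dcreate(file, "comp_dset", H5T_NATIVE_INT, space, dcpl);

        H5Dclose(dset);
        H5Pclose(dcpl);
        H5Sclose(space);
        H5Fclose(file);
        return 0;
    }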
Purpose:
SZIP and more general filter updates
Description:
Added SZIP to the introductory paragraphs and as appropriate in
the Fortran subroutine descriptions.
Revised the introductory discussion, which had previously focused
on one compression filter, to allow for multiple filters of
different types.
Added a list of filters currently distributed with HDF5 to the intro.
Also some copy edits and minor formatting.
Platforms tested:
Mozilla, Safari