[svn-r9321] Snapshot version 1.7 release 38

HDF Admin 2004-09-26 04:47:53 -05:00
parent 74c322019f
commit 643811be02
6 changed files with 80 additions and 18 deletions


@@ -1,4 +1,4 @@
HDF5 version 1.7.38 currently under development
HDF5 version 1.7.39 currently under development
Please refer to the release_docs/INSTALL file for installation instructions.
------------------------------------------------------------------------------

configure

@@ -1,7 +1,7 @@
#! /bin/sh
# From configure.in Id: configure.in.
# Guess values for system-dependent variables and create Makefiles.
# Generated by GNU Autoconf 2.53 for HDF5 1.7.38.
# Generated by GNU Autoconf 2.53 for HDF5 1.7.39.
#
# Report bugs to <hdfhelp@ncsa.uiuc.edu>.
#
@@ -416,8 +416,8 @@ SHELL=${CONFIG_SHELL-/bin/sh}
# Identity of this package.
PACKAGE_NAME='HDF5'
PACKAGE_TARNAME='hdf5'
PACKAGE_VERSION='1.7.38'
PACKAGE_STRING='HDF5 1.7.38'
PACKAGE_VERSION='1.7.39'
PACKAGE_STRING='HDF5 1.7.39'
PACKAGE_BUGREPORT='hdfhelp@ncsa.uiuc.edu'
ac_unique_file="src/H5.c"
@@ -935,7 +935,7 @@ if test "$ac_init_help" = "long"; then
# Omit some internal or obsolete options to make the list less imposing.
# This message is too long to be a string in the A/UX 3.1 sh.
cat <<_ACEOF
\`configure' configures HDF5 1.7.38 to adapt to many kinds of systems.
\`configure' configures HDF5 1.7.39 to adapt to many kinds of systems.
Usage: $0 [OPTION]... [VAR=VALUE]...
@@ -996,7 +996,7 @@ fi
if test -n "$ac_init_help"; then
case $ac_init_help in
short | recursive ) echo "Configuration of HDF5 1.7.38:";;
short | recursive ) echo "Configuration of HDF5 1.7.39:";;
esac
cat <<\_ACEOF
@@ -1146,7 +1146,7 @@ fi
test -n "$ac_init_help" && exit 0
if $ac_init_version; then
cat <<\_ACEOF
HDF5 configure 1.7.38
HDF5 configure 1.7.39
generated by GNU Autoconf 2.53
Copyright 1992, 1993, 1994, 1995, 1996, 1998, 1999, 2000, 2001, 2002
@@ -1161,7 +1161,7 @@ cat >&5 <<_ACEOF
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.
It was created by HDF5 $as_me 1.7.38, which was
It was created by HDF5 $as_me 1.7.39, which was
generated by GNU Autoconf 2.53. Invocation command line was
$ $0 $@
@@ -34305,7 +34305,7 @@ _ASBOX
} >&5
cat >&5 <<_CSEOF
This file was extended by HDF5 $as_me 1.7.38, which was
This file was extended by HDF5 $as_me 1.7.39, which was
generated by GNU Autoconf 2.53. Invocation command line was
CONFIG_FILES = $CONFIG_FILES
@@ -34367,7 +34367,7 @@ _ACEOF
cat >>$CONFIG_STATUS <<_ACEOF
ac_cs_version="\\
HDF5 config.status 1.7.38
HDF5 config.status 1.7.39
configured by $0, generated by GNU Autoconf 2.53,
with options \\"`echo "$ac_configure_args" | sed 's/[\\""\`\$]/\\\\&/g'`\\"


@@ -25,7 +25,7 @@ dnl
dnl NOTE: Don't forget to change the version number here when we do a
dnl release!!!
dnl
AC_INIT([HDF5], [1.7.38], [hdfhelp@ncsa.uiuc.edu])
AC_INIT([HDF5], [1.7.39], [hdfhelp@ncsa.uiuc.edu])
AC_CONFIG_SRCDIR([src/H5.c])
AC_CONFIG_HEADER([src/H5config.h])


@@ -1,4 +1,4 @@
HDF5 version 1.7.37 released on Sun Sep 12 04:03:26 CDT 2004
HDF5 version 1.7.38 released on Sun Sep 26 04:46:18 CDT 2004
================================================================================
@@ -180,10 +180,15 @@ Bug Fixes since HDF5-1.6.0 release
Library
-------
- Fixed a parallel bug in which some processes attempted collective
I/O while others did independent I/O. The bug appeared when some
processes used point selections and others didn't. JRM - 2004/09/15
- Corrected an error in which dataset region references were written
incorrectly on Cray machines. PVN & QAK - 2004/09/13
- H5Tget_native_type now determines the native type for integers
based on their precision. This avoids wrongly converting an int to a
short on machines where a short is 8 bytes but has only 32 bits of
precision (e.g., the Cray SV1). PVN - 2004/09/07
- Changed H5Dread() to not overwrite data in an application's buffer
with garbage when accessing a chunked dataset with an undefined
fill value and an unwritten chunk is encountered. QAK - 2004/08/25
@@ -627,3 +632,60 @@ ftp://hdf.ncsa.uiuc.edu/pub/outgoing/hdf5/hdf5-1.6.2/F90_source_for_Crays
* Information about building with PGI and Intel compilers is available in
INSTALL file sections 5.7 and 5.8
* On at least one system (SDSC DataStar), the scheduler (in this case
LoadLeveler) sends job status updates to standard error when you run
any executable that was compiled with the parallel compilers.
This causes problems when running "make check" on parallel builds, as
many of the tool tests function by saving the output from test runs
and comparing it to an exemplar.
The best solution is to reconfigure the target system so it no longer
inserts the extra text. However, this may not be practical.
In such cases, one solution is to "setenv HDF5_Make_Ignore yes" prior to
the configure and build. This will cause "make check" to continue after
detecting errors in the tool tests. However, in the case of SDSC DataStar,
it also leaves you with some 150 "failed" tests to examine by hand.
A second solution is to write a script to run serial tests and filter
out the text added by the scheduler. A sample script used on SDSC
DataStar is given below, but you will probably have to customize it
for your installation.
Observe that the basic idea is to insert the script as the first item
on the command line that executes the test. The script then
executes the test and filters out the offending text before passing
it on.
#!/bin/csh
# Temporary files used to capture the test's output.
set STDOUT_FILE=~/bin/serial_filter.stdout
set STDERR_FILE=~/bin/serial_filter.stderr
rm -f $STDOUT_FILE $STDERR_FILE
# Run the test ($*), capturing stdout and stderr in separate files.
($* > $STDOUT_FILE) >& $STDERR_FILE
# Save the test's exit status before any other command overwrites it.
set RETURN_VALUE=$status
# Pass the test's stdout through unchanged.
cat $STDOUT_FILE
# Drop the first two lines of stderr (the scheduler's status messages).
tail +3 $STDERR_FILE
exit $RETURN_VALUE
You get the HDF5 makefiles and test scripts to execute your filter script
by setting the environment variable "RUNSERIAL" to the full path of the
script prior to running configure for parallel builds. Remember to
"unsetenv RUNSERIAL" before running configure for a serial build.
Note that the RUNSERIAL environment variable exists so that we can
prefix serial runs as necessary on the target system. On DataStar,
no prefix is necessary. However, on an MPICH system, the prefix might
have to be set to something like "/usr/local/mpi/bin/mpirun -np 1" to
get the serial tests to run at all.
In such cases, you will have to include the regular prefix in your
filter script.
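In that case the line of the filter script that runs the test might look
something like the following (a minimal sketch; the mpirun path is just the
example quoted above and will differ from site to site):
(/usr/local/mpi/bin/mpirun -np 1 $* > $STDOUT_FILE) >& $STDERR_FILE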


@@ -1,4 +1,4 @@
HDF5 version 1.7.38 currently under development
HDF5 version 1.7.39 currently under development
================================================================================


@@ -77,10 +77,10 @@ extern "C" {
/* Version numbers */
#define H5_VERS_MAJOR 1 /* For major interface/format changes */
#define H5_VERS_MINOR 7 /* For minor interface/format changes */
#define H5_VERS_RELEASE 38 /* For tweaks, bug-fixes, or development */
#define H5_VERS_RELEASE 39 /* For tweaks, bug-fixes, or development */
#define H5_VERS_SUBRELEASE "" /* For pre-releases like snap0 */
/* Empty string for real releases. */
#define H5_VERS_INFO "HDF5 library version: 1.7.38" /* Full version string */
#define H5_VERS_INFO "HDF5 library version: 1.7.39" /* Full version string */
#define H5check() H5check_version(H5_VERS_MAJOR,H5_VERS_MINOR, \
H5_VERS_RELEASE)