1. Allow nccopy to apply filters, especially on the output file.
This provides a third way to do this, besides using ncgen or
setting the filter programmatically.
2. Make sure that even if the filter code is not available, it is
possible to see the filter id and parameters for variables using
e.g. ncdump -hs (see the sketch following this list).
3. Fix bug in nccopy so that the input file does
not necessarily have to be netcdf-4.
4. At the last minute, decided to change to using a
single "_Filter" attribute for ncgen
5. Added a test to tst_filter.sh to generate C code using ncgen.
This relies on the HDF5 capability to
dynamically load compression filters.
Note that a compression filter is just
a subcase of filters.
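For item 2, the same information that ncdump -hs prints can also be
retrieved programmatically through the new query call; a minimal sketch
(the file and variable names here are hypothetical):

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>
    #include <netcdf_filter.h>

    /* Report the filter id and parameters of a variable; this works even
       when the filter implementation itself cannot be loaded. */
    int main(void)
    {
        int ncid, varid, stat;
        unsigned int id;
        size_t nparams;
        unsigned int* params;

        if((stat = nc_open("example.nc", NC_NOWRITE, &ncid))) return stat;
        if((stat = nc_inq_varid(ncid, "var", &varid))) return stat;
        /* First call gets the filter id and the parameter count... */
        if((stat = nc_inq_var_filter(ncid, varid, &id, &nparams, NULL))) return stat;
        printf("filter id=%u nparams=%u\n", id, (unsigned)nparams);
        if(nparams > 0) {
            if((params = malloc(nparams * sizeof(unsigned int))) == NULL) return NC_ENOMEM;
            /* ...second call fills in the parameters themselves. */
            if((stat = nc_inq_var_filter(ncid, varid, &id, &nparams, params))) return stat;
            for(size_t i = 0; i < nparams; i++)
                printf("  param[%u]=%u\n", (unsigned)i, params[i]);
            free(params);
        }
        return nc_close(ncid);
    }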
The primary user-visible changes are as follows:
1. Add a standard header "netcdf_filter.h" that defines
the necessary API extensions; a usage sketch follows this list.
2. Modify ncgen to support two new special attributes
"_Filter_ID" and "_Filter_Parameters" so that compression
can be turned on when creating a file using ncgen.
3. Add a detailed description of filtering support
to the user's guide; see the file filters.md
4. Add a test case directory for this: nc_test4/filter_test.
It is fragile, so a ./configure flag (--enable-filter-test)
is defined (default disabled) to let this test be skipped
to avoid spurious 'make check' failures.
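A companion sketch of turning a filter on through the netcdf_filter.h API
when creating a file programmatically. Filter id 307 is the registered HDF5
bzip2 filter used by the test case; the path, names, and helper below are
made up, and the bzip2 plugin must be dynamically loadable (HDF5_PLUGIN_PATH)
at write time:

    #include <netcdf.h>
    #include <netcdf_filter.h>

    #define BZIP2_ID 307u   /* registered HDF5 filter id for bzip2 */

    /* Hypothetical helper: create a file with one bzip2-compressed variable. */
    int define_filtered_var(const char* path)
    {
        int ncid, dimid, varid, stat;
        size_t chunks[1] = {256};
        unsigned int level = 9;   /* single bzip2 parameter: compression level */

        if((stat = nc_create(path, NC_NETCDF4|NC_CLOBBER, &ncid))) return stat;
        if((stat = nc_def_dim(ncid, "x", 1024, &dimid))) return stat;
        if((stat = nc_def_var(ncid, "data", NC_FLOAT, 1, &dimid, &varid))) return stat;
        /* Filters apply only to chunked storage */
        if((stat = nc_def_var_chunking(ncid, varid, NC_CHUNKED, chunks))) return stat;
        if((stat = nc_def_var_filter(ncid, varid, BZIP2_ID, 1, &level))) return stat;
        return nc_close(ncid);
    }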
Note that the HDF5 documentation is not up-to-date, so
much of what is encoded here comes from examining the
actual code in the file H5PL.c in the HDF5 source code.
1. When running under Windows (as opposed to Cygwin)
we need to make sure not to use /cygdrive/ file paths.
This was occurring in libdap4/d4read.c, but may occur
elsewhere.
2. Shell scripts in the git repo were not being checked out
with the executable mode set because core.filemode was set to false.
This was a major hassle to fix.
Specific changes:
1. Add dap4 code: libdap4 and dap4_test.
Note that until the d4ts server problem is solved, dap4 is turned off.
2. Modify various files to support dap4 flags:
configure.ac, Makefile.am, CMakeLists.txt, etc.
3. Add nc_test/test_common.sh. This centralizes
the handling of the locations of various
things in the build tree: e.g., where
ncgen.exe is located. See nc_test/test_common.sh
for details.
4. Modify .sh files to use test_common.sh
5. Obsolete the separate oc2 library by moving it into
netcdf-c. This means replacing code with netcdf-c
equivalents.
6. Add --with-testserver to configure.ac to allow
override of the servers to be used for --enable-dap-remote-tests.
7. There were multiple versions of nctypealignment code. Try to
centralize in libdispatch/doffset.c and include/ncoffsets.h
8. Add a unit test for the ncuri code because of its complexity.
9. Move the findserver code out of libdispatch and into
a separate, self contained program in ncdap_test and dap4_test.
10. Move the dispatch header files (nc{3,4}dispatch.h) to
.../include because they are now shared by modules.
11. Revamp the handling of TOPSRCDIR and TOPBUILDDIR for shell scripts.
12. Make use of MREMAP if available
13. Misc. minor changes e.g.
- #include <config.h> -> #include "config.h"
- Add some no-install headers to /include
- extern -> EXTERNL and vice versa as needed
- misc header cleanup
- clean up checking for misc. unix vs microsoft functions
14. Change copyright decls in some files to point to LICENSE file.
15. Add notes to RELEASENOTES.md
re: github netcdf-c issue #271
This occurs for several reasons, including:
1. using H5Aopen_name instead of H5Aexists to test if an attribute exists.
2. using H5Eset_auto instead of H5Eset_auto2.
There are probably others that will have to be extinguished as encountered.
P.S. Hope I did not overdo this and kill too much.
The hash field for phony dimensions was not being set
(in nc4hdf.c). Also added test case (nc_test4/?).
Note that I searched for other similar failures and
did not find any, but I may have missed them.
This consists of a persistent attribute named
_NCProperties plus two computed attributes
_IsNetcdf4 and _SuperblockVersion.
See the 'Provenance Attributes' section
of docs/attribute_conventions.md for details.
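A small sketch of how the persistent attribute can be examined through the
ordinary attribute calls by name, much as ncdump -hs does; the helper name
is made up, and the attribute may legitimately be absent in files written
by older libraries:

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    /* Print the provenance attribute of an already open file, if present. */
    static int print_ncproperties(int ncid)
    {
        int stat;
        size_t len;
        char* value;

        stat = nc_inq_attlen(ncid, NC_GLOBAL, "_NCProperties", &len);
        if(stat != NC_NOERR) return stat;   /* e.g. absent in older files */
        if((value = malloc(len + 1)) == NULL) return NC_ENOMEM;
        if((stat = nc_get_att_text(ncid, NC_GLOBAL, "_NCProperties", value)) == NC_NOERR) {
            value[len] = '\0';
            printf("_NCProperties = %s\n", value);
        }
        free(value);
        return stat;
    }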
The count and start arrays are dimensioned to NC_MAX_DIMS even though the maximum size should be NDIMS, which is set to 3; possibly the maximum size could be 2, since only indices 0 and 1 are used to access both start and count. I left it at NDIMS since that matches the number of items in the initialization and is consistent with other uses in the file.
The constant NC_MAX_DIMS is used to dimension the `dimids` arrays which are used to retrieve the dimension ids corresponding to the variable's dimensions. By definition, the maximum number of dimension ids is `NC_MAX_VAR_DIMS`, as documented in the `nc_inq_var` documentation:
http://www.unidata.ucar.edu/software/netcdf/netcdf-4/newdocs/netcdf-c/nc_005finq_005fvar.html
dimids Returned vector of *ndimsp dimension IDs corresponding to the variable dimensions. The caller must allocate enough space for a vector of at least *ndimsp integers to be returned. The maximum possible number of dimensions for a variable is given by the predefined constant NC_MAX_VAR_DIMS.
Although the default or standard values of NC_MAX_DIMS and NC_MAX_VAR_DIMS are the same, the correct value to use here is NC_MAX_VAR_DIMS, not NC_MAX_DIMS. Even though this currently works, it could fail if either NC_MAX_VAR_DIMS or NC_MAX_DIMS is changed, and it also provides an incorrect usage example that may mislead developers trying to determine the correct usage of the function.
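A sketch of the intended usage, with the dimension-id array sized by
NC_MAX_VAR_DIMS as the documentation quoted above requires (the helper
name is made up):

    #include <netcdf.h>

    /* Retrieve a variable's dimension ids and lengths. */
    static int show_var_dims(int ncid, int varid)
    {
        char name[NC_MAX_NAME + 1];
        nc_type xtype;
        int ndims, natts, stat;
        int dimids[NC_MAX_VAR_DIMS];   /* NC_MAX_VAR_DIMS, not NC_MAX_DIMS */

        if((stat = nc_inq_var(ncid, varid, name, &xtype, &ndims, dimids, &natts)))
            return stat;
        for(int i = 0; i < ndims; i++) {
            size_t len;
            if((stat = nc_inq_dimlen(ncid, dimids[i], &len))) return stat;
            /* ... use name, dimids[i], and len here ... */
        }
        return NC_NOERR;
    }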
1. Added a check to nc_def_var_extra (libsrc4/nc4var.c) to
ensure that no specified chunk size is greater than
the corresponding dimension size (see the sketch after this list).
2. Added test to nc_test4/tst_chunks.c
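A sketch of what the new check rejects: asking for a chunk length of 200
on a fixed-size dimension of length 100 should now fail (I believe with
NC_EBADCHUNK) instead of being accepted silently. File and names here are
hypothetical:

    #include <netcdf.h>

    static int chunk_check_demo(void)
    {
        int ncid, dimid, varid, stat;
        size_t chunks[1] = {200};   /* larger than the dimension below */

        if((stat = nc_create("chunkdemo.nc", NC_NETCDF4|NC_CLOBBER, &ncid))) return stat;
        if((stat = nc_def_dim(ncid, "x", 100, &dimid))) return stat;
        if((stat = nc_def_var(ncid, "v", NC_INT, 1, &dimid, &varid))) return stat;

        /* Expected to return an error rather than silently accepting the chunking */
        stat = nc_def_var_chunking(ncid, varid, NC_CHUNKED, chunks);

        nc_close(ncid);
        return stat;
    }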
There was a problem with the
distcheck testing since the test input files
are in ${src_dir} and the test is in ${build_dir}.
So modify run_chunk_hdf4 to copy as necessary.
It turns out that HDF4 supports chunking
(and compression). However, the existing
HDF4 code in netCDF did not support it.
So add HDF4 support for chunking.
Also add a test case.
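With that support in place, the chunk settings of an HDF4 SD variable
should be visible through the usual netCDF calls; a sketch, assuming a
library built with HDF4 enabled and a hypothetical file and variable name:

    #include <stdio.h>
    #include <netcdf.h>

    static int show_hdf4_chunking(void)
    {
        int ncid, varid, storage, stat;
        size_t chunksizes[NC_MAX_VAR_DIMS];

        if((stat = nc_open("example.hdf", NC_NOWRITE, &ncid))) return stat;
        if((stat = nc_inq_varid(ncid, "data", &varid))) return stat;
        if((stat = nc_inq_var_chunking(ncid, varid, &storage, chunksizes))) return stat;
        printf("storage=%s\n", storage == NC_CHUNKED ? "chunked" : "contiguous");
        return nc_close(ncid);
    }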
error occurs after an "exit:" label.
Corrected a dozen Coverity errors (mainly allocation issues, along with a few
other things):
711711, 711802, 711803, 711905, 970825, 996123, 996124, 1025787,
1047274, 1130013, 1130014, 1139538
Refactored internal fill-value code to correctly handle string types, and
especially to allow NULL pointers and null strings (i.e. "") to be
distinguished. The code now avoids partially aliasing the two together
(which only happened on the 'write' side of things and wasn't reflected on
the 'read' side, adding to the previous confusion).
Probably still weak on handling fill-values of variable-length and compound
datatypes.
Refactored the recursive metadata reads a bit more, to process HDF5 named
datatypes and datasets immediately, avoiding chewing up memory for those
types of objects, etc.
Finished uncommenting and updating the nc_test4/tst_fills2.c code (as I'm
proceeding alphabetically through the nc_test4 code files).
Add a new function called nc_inq_format_extended that
returns more detailed format information (vis-a-vis
nc_inq_format) about an open dataset.
Note that the netcdf API will present the file as if it had
the format specified by nc_inq_format. The true file
format, however, may not even be a netcdf file; it might be
DAP, HDF4, or PNETCDF, for example. This function returns
that true file type. It also returns the effective mode for
the file.
signature: nc_inq_format_extended(int ncid, int* formatp, int* modep)
where
* ncid is the NetCDF ID from a previous call to nc_open() or
nc_create().
* formatp is a pointer to a location for returned true format.
* modep is a pointer to a location for returned mode flags.
Refer to the actual list in the file netcdf.h to see the
currently defined set.
Also added test cases (tst_formatx*).
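A minimal usage sketch, contrasting the two calls on an already open file:

    #include <stdio.h>
    #include <netcdf.h>

    /* Report both the API-level format and the true underlying
       format/mode of an open file (ncid from nc_open or nc_create). */
    static void report_formats(int ncid)
    {
        int fmt, extfmt, mode;

        if(nc_inq_format(ncid, &fmt) == NC_NOERR)
            printf("API format code: %d\n", fmt);
        if(nc_inq_format_extended(ncid, &extfmt, &mode) == NC_NOERR)
            printf("true format code: %d, mode flags: 0x%x\n", extfmt, mode);
    }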
the rest of the dimension queries. Correct error in library where types
that were added to the file after a sub-group was created
weren't available for variables in that sub-group to use. Start cleaning up
test suite and un-commenting tests that were commented out (got up to
nc_test4/tst_fills2.c, alphabetically) before running into an error in HDF5.
many cleanups to fix compiler warnings, streamline iteration over objects
in HDF5 file when opening the file, and generally straightening out the code
to be cleaner and simpler.
Tested on Mac OS/X with gcc 4.8 and OpenMPI (which uses clang).
an unlimited-dimension variable, and different processes don't agree on
whether to extend the underlying HDF5 dataset, or don't agree on the amount
to extend the dataset.
Also added the capability for netCDF-4 to write and read NIL
values for string type attributes and variables, so these can be read
if used in HDF5 files.
Included are additions to CMakeLists files to reflect new tests.
unlimited dimension hanging. Extending the size of an unlimited
dimension in HDF5 must be a collective operation, so now an error is
returned if trying to extend in independent access mode.
Quincey's bug fixes for parallel build portability, particularly
OpenMPI on MacOS-X.
coordinate variables and their associated dimensions occurs in any
subgroup, rather than just the root group. If this occurs, the
variable attribute "_Netcdf4Dimid" is created for every dimension
scale.
Also add a test for this bug fix in tst_dims3.nc, based on Pedro
Vicente's demo.
for JIRA issue NCF-241.
This is only temporary
until I can make pnetcdf
operate as a separate dispatch table.
Also, fix nc_test4/tst_pnetcdf
to open with nc_open_par;
this is necessary because a pnetcdf
created file cannot be opened
as a netcdf classic file.
Fix Jira issue NCF-29 (https://bugtracking.unidata.ucar.edu/browse/NCF-29):
making the netCDF-4 library ignore HDF5 datasets and attributes which have
datatypes (such as references) that it doesn't understand.
Tested on:
Mac OSX/64 10.8.2 (amazon) w/--enable-netcdf-4 --enable-extra-tests --enable-extra-example-tests
--disable-shared --enable-logging
to indicate that they only work for atomic types,
not user defined types.
2. Modified NCDEFAULT_{get/put}_vars to no longer use
nc_get/put_varm; they now use nc_get/put_vara
directly. This means that nc_get/put_vars now works
properly for user defined types as well as atomic types
(see the sketch after this list).
3. Added test cases for get_vars/put_vars with a
user defined type. Tests placed into
nc_test/tst_compounds.c
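A sketch of item 2's effect: a strided read of a variable whose type is a
user defined compound now works through the generic nc_get_vars entry
point. The struct and helper are hypothetical; the in-memory layout must
match a compound type already defined in the file:

    #include <stddef.h>
    #include <netcdf.h>

    /* Hypothetical compound matching a user-defined type in the file. */
    struct obs { int id; float val; };

    static int read_every_other(int ncid, int varid, struct obs* out, size_t nout)
    {
        size_t start[1] = {0};
        size_t count[1] = {nout};
        ptrdiff_t stride[1] = {2};   /* every other element */

        /* Previously this path fell back to nc_get_varm and failed for
           user defined types; it now maps onto nc_get_vara internally. */
        return nc_get_vars(ncid, varid, start, count, stride, out);
    }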
enhancements, based on contributed code from Martin van Driel, to
support -v, -g, -V, and -G options for selecting groups and variables
in output. Fix all clang warnings from nccopy and ncdump sources, as
well as a few other cleanup changes to testing code.
CMake related changes in CMakeLists.txt files,
cmake_config.h.in. Other changes relate to
Windows-specific issues, and changes made
when regenerating generated source files.
contain as little file-type specific info as possible. It
modifies especially libsrc so that all of the netcdf-3 data
that used to be in struct NC is now kept in a separate chunk
of data pointed to by the struct NC. This makes all of the
current protocols consistent: netcdf-3, netcdf-4, and dap.
reported by static analysis, including memory leak in ncdump, missing
size_t cast for chunk cache. Fixed various doc problems, including
byte vs. char issues, missing NC_UBYTE in type list, needed link to
"Building with Windows" page.
implementation. Deleted obsolete win32, soon to be replaced by Ward's
Windows 32- and 64-bit fixes for building with MSYS/MinGW. Made
cosmetic cleanup to output of "make check" to make it easier for users
to interpret. Fixed bug NCF-175: ncdump -t incorrectly interpreting
units attribute (such as "days") without a base time (such as "since
2007-01-01") as a time unit.
Changed name to 4.2.1-beta.
range_error checks in netCDF-4 type conversion code. Made netCDF
attribute tests with type conversion more comprehensive and stringent,
fixing bugs identified with better tests. Changed a test in
nc_test/tst_atts.c to use netCDF-3 file instead of netCDF-4 file,
because that directory is supposed to be for tests that work with
--disable-netcdf-4. Added test demonstrating NCF-171 bug on 32-bit
platforms, only run when configured with --enable-extra-tests.
The in-memory files can be made persistent if nc_create is called with
NC_DISKLESS|NC_WRITE flags set. Initial test case also included.
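A minimal sketch of such a persistent in-memory file: everything is built
in memory and written to disk only at nc_close because NC_WRITE is combined
with NC_DISKLESS at creation time (path and names are made up):

    #include <netcdf.h>

    static int create_persistent_diskless(void)
    {
        int ncid, dimid, varid, stat;

        if((stat = nc_create("diskless.nc", NC_DISKLESS|NC_WRITE|NC_CLOBBER, &ncid)))
            return stat;
        if((stat = nc_def_dim(ncid, "x", 10, &dimid))) return stat;
        if((stat = nc_def_var(ncid, "v", NC_INT, 1, &dimid, &varid))) return stat;

        return nc_close(ncid);   /* contents are flushed to disk here */
    }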
- Modified ncio mechanism to support
multiple ncio packages; this is so we
can have posixio and memio operating
at the same time.
- Cleaned up a bunch of lint issues (unused variables, etc.).
- Fix NCF-157 to modify DAP code to support
partial variable retrieval.
- Fix of NCF-154 to solve problem of ncgen
improperly processing data lists for variables
of size greater than 2**18 bytes.
- Fix ncgen processing of char variables that have
multiple unlimited dimensions.
- Partly fix Jira issue: NCF-145 (vlen issues).
- The benchmark programs (nc_test4/tst_ar4_*) require arguments
and should only be invoked inside a shell
script; fixed so that they terminate cleanly
if invoked with no arguments.
- Fix the Doxygen processing so it will work
with make distcheck.
- Begin switchover to using an alternative to ncio.
- Begin support for in-memory (diskless) files.