HDF5 can depend on zlib (which is in fact required for netCDF). This change moves the detection of whether HDF5 was built with zlib support ahead of any other tests that link the HDF5 library to determine the presence or absence of symbols. Those tests require that the link line include "-lz" if the HDF5 library was built with zlib support.
This is handled more or less automatically when shared libraries are used, but in the static library case the dependency must be specified explicitly. For its internal checks, CMake uses the `CMAKE_REQUIRED_LIBRARIES` list to specify the libraries to link in a `CHECK_C_SOURCE_COMPILES` or `CHECK_LIBRARY_EXISTS` call. In the current CMakeLists.txt ordering, zlib detection happens _after_ the `CHECK_LIBRARY_EXISTS` calls, which can cause those checks to fail and report incorrectly that the function being tested for does not exist. With the reordering in this PR, I am able to correctly configure netCDF on a Cray HPC system that uses static libraries by default.
With some versions of the HDF5 `find_package` module, the call sets `HDF5_C_LIBRARIES` and `HDF5_HL_LIBRARIES` but leaves `HDF5_C_LIBRARY` and `HDF5_HL_LIBRARY` unset. Control then falls out of the if block with those variables unset and reaches the default setting at line 792. That default does not include the library path, so when the later `CHECK_LIBRARY_EXISTS` calls run they do not have the full path to the library and fail to link. Because the link fails, the configuration concludes that none of the tested symbols are defined.
I don't think this change will have any adverse effect elsewhere, since it only sets these variables when they are unset.
## S3-Related Fixes
* Add comprehensive support for specifying AWS profiles to provide access credentials.
* Parse the files "~/.aws/config" and "~/.aws/credentials" to provide credentials for the HDF5 ROS3 driver and to locate the default region.
* Add a function to obtain the currently active S3 credentials. The search rules are defined in docs/nczarr.md.
* Provide documentation for the new features.
* Modify the struct NCauth (in include/ncauth.h) to replace specific S3 credentials with a profile name.
* Add a unit test to test the operation of profile and credentials management.
* Add support for URLs of the form "s3://<bucket>/<key>"; this requires obtaining a default region.
* Allow the specification of a profile and/or region in a URL of the form "#mode=nczarr,...&aws.region=...&aws.profile=..." (see the sketch below).
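To show how the two URL forms above combine, here is a minimal C sketch that opens an NCZarr dataset from S3. The bucket, key, region, profile, and mode flags are illustrative placeholders, not values taken from this PR; see docs/nczarr.md for the authoritative fragment syntax.

```c
#include <stdio.h>
#include <netcdf.h>

int main(void)
{
    int ncid, stat;
    /* Hypothetical bucket/key; the fragment supplies the mode, region, and
     * profile as described in the bullets above. */
    const char *url =
        "s3://example-bucket/path/to/dataset.zarr"
        "#mode=nczarr,s3&aws.region=us-east-1&aws.profile=default";

    if ((stat = nc_open(url, NC_NOWRITE, &ncid)) != NC_NOERR) {
        fprintf(stderr, "nc_open failed: %s\n", nc_strerror(stat));
        return 1;
    }
    /* ... read metadata and variables as usual ... */
    nc_close(ncid);
    return 0;
}
```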
## Misc. Fixes
* Move the ezxml code to libdispatch so that it can be used by both DAP4 and nczarr.
* Modify nclist to provide a deep clone operation (see the sketch after this list).
* Modify ncuri to provide a deep clone operation.
* Modify the .rc file format to allow a path to be specified and tested when looking up an entry in the .rc file.
* Ensure that the NC_rcload function is called.
* Modify nchttp to support setting request headers.
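To clarify what "deep clone" means for these container types: a shallow copy shares the contained pointers with the original, while a deep clone duplicates them so the clone can be modified and freed independently. The sketch below uses a hypothetical string-list type, not the actual nclist/ncuri API.

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical string-list type, for illustration only. */
typedef struct StrList { size_t n; char **items; } StrList;

/* Deep clone: duplicate the vector AND every string it points to.
 * (Handling of strdup failures is elided for brevity.) */
static StrList *strlist_deepclone(const StrList *src)
{
    StrList *dst = calloc(1, sizeof(StrList));
    if (dst == NULL) return NULL;
    dst->items = calloc(src->n ? src->n : 1, sizeof(char *));
    if (dst->items == NULL) { free(dst); return NULL; }
    dst->n = src->n;
    for (size_t i = 0; i < src->n; i++)
        dst->items[i] = strdup(src->items[i]);  /* copies, not shared pointers */
    return dst;
}
```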
Fixes an issue where strlen() reads outside the stack-allocated buffer used by NC4_HDF5_inq_att when reading a name whose length is exactly NC_MAX_NAME.
Fixes https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=39189, found on GDAL.
```
==1895951== Conditional jump or move depends on uninitialised value(s)
==1895951== at 0x483EF58: strlen (in /usr/lib/x86_64-linux-gnu/valgrind/vgpreload_memcheck-amd64-linux.so)
==1895951== by 0x48EF73E: ncindexlookup (ncindex.c:60)
==1895951== by 0x48E81DF: nc4_find_grp_att (nc4internal.c:587)
==1895951== by 0x48E5B39: nc4_get_att_ptrs (nc4attr.c:72)
==1895951== by 0x48F98A0: NC4_HDF5_inq_att (hdf5attr.c:818)
==1895951== by 0x48847F7: nc_inq_att (dattinq.c:91)
==1895951== by 0x10D693: pr_att (ncdump.c:767)
==1895951== by 0x110ADB: do_ncdump_rec (ncdump.c:1887)
==1895951== by 0x1112F1: do_ncdump (ncdump.c:2038)
==1895951== by 0x11248B: main (ncdump.c:2478)
==1895951==
==1895951== Use of uninitialised value of size 8
==1895951== at 0x48A24E4: crc64_little (dcrc64.c:173)
==1895951== by 0x48A27F4: NC_crc64 (dcrc64.c:229)
==1895951== by 0x4892D49: NC_hashmapkey (nchashmap.c:159)
==1895951== by 0x489314B: NC_hashmapget (nchashmap.c:263)
==1895951== by 0x48EF75F: ncindexlookup (ncindex.c:60)
==1895951== by 0x48E81DF: nc4_find_grp_att (nc4internal.c:587)
==1895951== by 0x48E5B39: nc4_get_att_ptrs (nc4attr.c:72)
==1895951== by 0x48F98A0: NC4_HDF5_inq_att (hdf5attr.c:818)
==1895951== by 0x48847F7: nc_inq_att (dattinq.c:91)
==1895951== by 0x10D693: pr_att (ncdump.c:767)
==1895951== by 0x110ADB: do_ncdump_rec (ncdump.c:1887)
==1895951== by 0x1112F1: do_ncdump (ncdump.c:2038)
==1895951==
```
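The underlying pattern is a classic off-by-one with fixed-size name buffers: NC_MAX_NAME counts characters, not the terminating NUL, so a buffer of exactly NC_MAX_NAME bytes cannot hold a maximal-length name plus its terminator, and a later strlen() walks off the end. A minimal self-contained sketch of the pattern and the fix (illustrative code, not the actual netcdf-c sources):

```c
#include <stdio.h>
#include <string.h>
#include <netcdf.h>                 /* NC_MAX_NAME */

int main(void)
{
    char source[NC_MAX_NAME + 1];
    memset(source, 'a', NC_MAX_NAME);      /* a name of maximal length */
    source[NC_MAX_NAME] = '\0';

    /* Buggy pattern: a buffer of NC_MAX_NAME bytes has no room for the NUL,
     * so calling strlen() on it would read past the end of the stack buffer. */
    char bad[NC_MAX_NAME];
    memcpy(bad, source, NC_MAX_NAME);
    (void)bad;                              /* never read; shown for contrast */

    /* Fixed pattern: size the buffer NC_MAX_NAME + 1 so the NUL always fits. */
    char good[NC_MAX_NAME + 1];
    memcpy(good, source, NC_MAX_NAME + 1);
    printf("name length = %zu\n", strlen(good));
    return 0;
}
```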
Filter support has three goals:
1. Use the existing HDF5 filter implementations.
2. Allow filter metadata to be stored in the NumCodecs metadata format used by Zarr.
3. Allow filters to be used even when HDF5 is disabled.
Detailed usage directions are provided in docs/filters.md.
For now, the existing filter API is left in place, so filters are defined using `nc_def_var_filter` in the HDF5 style, where the filter id and parameters are unsigned integers.
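As a minimal sketch of that style, the following attaches the HDF5 deflate (zlib) filter, whose filter id is 1, with a single unsigned parameter giving the compression level. The helper name is mine, and ncid/varid are assumed to come from earlier nc_create/nc_def_var calls.

```c
#include <netcdf.h>
#include <netcdf_filter.h>   /* declares nc_def_var_filter in recent releases */

/* Attach deflate (HDF5 filter id 1) to a variable while still in define mode.
 * The parameters are passed as a vector of unsigned integers, HDF5-style. */
static int enable_deflate(int ncid, int varid, unsigned level)
{
    unsigned params[1];
    params[0] = level;                       /* e.g. 1 (fast) .. 9 (best) */
    return nc_def_var_filter(ncid, varid, 1 /* H5Z_FILTER_DEFLATE */,
                             1, params);
}
```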
This is a big change since filters affect many parts of the code.
In the following, the terms "compressor", "filter", and "codec" are used more or less synonymously.
### Filter-Related Changes:
* In order to support dynamic loading of shared filter libraries, a new library was added in the libncpoco directory; it helps to isolate dynamic loading across multiple platforms.
* Provide a JSON parsing library for use by plugins; this is created by merging libdispatch/ncjson.c with include/ncjson.h.
* Add a new _Codecs attribute to allow clients to see what codecs are being used, and let `ncdump -s` print it out (see the sketch after this list).
* Provide special headers to help support compilation of HDF5 filters when HDF5 is not enabled: netcdf_filter_hdf5_build.h and netcdf_filter_build.h.
* Add a number of new tests for the new nczarr filters.
* Let ncgen parse the _Codecs attribute, although it is ignored.
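As a rough illustration, a client can read the _Codecs attribute with the ordinary attribute API; the sketch below uses standard netCDF calls, omits most error handling, and is not code from this PR.

```c
#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>

/* Print a variable's _Codecs attribute (NumCodecs-style JSON), if present. */
static void print_codecs(int ncid, int varid)
{
    size_t len;
    if (nc_inq_attlen(ncid, varid, "_Codecs", &len) == NC_NOERR) {
        char *json = malloc(len + 1);
        if (json != NULL
            && nc_get_att_text(ncid, varid, "_Codecs", json) == NC_NOERR) {
            json[len] = '\0';             /* attribute text is not NUL-terminated */
            printf("_Codecs = %s\n", json);
        }
        free(json);
    }
}
```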
### Plugin Directory Changes:
* Add support for the Blosc compressor; this is essential because it is the most common compressor used in Zarr datasets. This also necessitated adding a CMake FindBlosc.cmake file.
* Add NCZarr support for the big-four filters provided by HDF5: shuffle, fletcher32, deflate (zlib), and szip.
* Add a Codec defaulter (see docs/filters.md) for the big four filters.
* Make plugins work on Windows by adding the proper __declspec declarations (see the sketch below).
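The Windows issue is that functions in a DLL are not visible to the dynamic loader unless they are explicitly exported. Below is a minimal sketch of the export pattern; the macro and function names are illustrative, not the actual plugin sources.

```c
/* Mark plugin entry points for export from a Windows DLL; on other
 * platforms the attribute expands to nothing. */
#ifdef _WIN32
#define DLLEXPORT __declspec(dllexport)
#else
#define DLLEXPORT
#endif

/* Hypothetical plugin entry point made visible in the DLL export table so
 * the filter loader can find it by name. */
DLLEXPORT const void *example_get_plugin_info(void)
{
    static const int example_descriptor = 1;   /* placeholder descriptor */
    return &example_descriptor;
}
```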
### Misc. Non-Filter Changes
* Replace most uses of USE_NETCDF4 (deprecated) with USE_HDF5.
* Improve support for caching.
* More fixes for the path conversion code.
* Fix miscellaneous memory leaks.
* Add a new utility, ncdump/ncpathcvt, that does more or less the same thing as cygpath.
* Add a number of new tests for the non-filter fixes.
* Update the parsers.
* Convert most instances of `#ifdef _MSC_VER` to `#ifdef _WIN32` (see the sketch below).
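The reason for the last item is that _MSC_VER identifies only the Microsoft compiler, while _WIN32 is defined by every Windows toolchain (MSVC, MinGW, clang targeting Windows), so platform-conditional code should normally key on _WIN32. An illustrative fragment, not taken from the netcdf-c sources:

```c
#include <stdio.h>

#ifdef _WIN32                 /* defined by MSVC, MinGW, and clang on Windows */
#include <io.h>
#define FILE_EXISTS(p) (_access((p), 0) == 0)
#else                         /* POSIX systems */
#include <unistd.h>
#define FILE_EXISTS(p) (access((p), F_OK) == 0)
#endif

int main(void)
{
    printf("netcdf.h present in cwd: %d\n", FILE_EXISTS("netcdf.h"));
    return 0;
}
```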
re: Issue https://github.com/Unidata/netcdf-c/issues/2096
The functions nc_set_var_chunk_cache_ints and nc_def_var_chunking_ints
are Fortran-facing entry points for the chunk cache and chunking settings. They are not defined
if netcdf-c is built with --disable-hdf5.
The fix is to create dummy versions that do nothing and return NC_NOERR
when invoked. These dummy versions are compiled when USE_HDF5 is false.
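A minimal sketch of the stub pattern follows; the parameter lists are assumed from the Fortran wrappers and may not match the actual declarations exactly.

```c
#include <netcdf.h>

#ifndef USE_HDF5
/* Dummy Fortran-facing entry points, compiled only when HDF5 support is off.
 * They accept their arguments, do nothing, and report success. Parameter
 * lists here are illustrative assumptions. */
int nc_set_var_chunk_cache_ints(int ncid, int varid, int size, int nelems,
                                int preemption)
{
    (void)ncid; (void)varid; (void)size; (void)nelems; (void)preemption;
    return NC_NOERR;
}

int nc_def_var_chunking_ints(int ncid, int varid, int contiguous,
                             int *chunksizesp)
{
    (void)ncid; (void)varid; (void)contiguous; (void)chunksizesp;
    return NC_NOERR;
}
#endif /* !USE_HDF5 */
```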