Update utf8proc.[ch] to use the version now
maintained by the Julia Language project
(https://github.com/JuliaLang/utf8proc/blob/master/LICENSE.md).
The license for the previous version was
unacceptable to the Debian and Ubuntu release
systems. The utf8proc software we were using was
turned over to the Julia Language developers, and
the license terms were changed to allow modification,
so the new version both updates the code and
resolves the license issue.
So the fix here is as follows:
1. Wrap the library with a fixed interface: libdispatch/dutf8.c
and include/ncutf8.h (see the sketch after this list).
2. Replace the existing utf8proc code with the new version
from https://github.com/JuliaLang/utf8proc.
3. Add a couple more test cases: nc_test/tst_utf8_validate.c
and nc_test/tst_utf8_phrases.c. If/when I can find a usable
normalization test, I will incorporate that later.
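For illustration, here is a minimal sketch of the kind of
fixed wrapper interface intended; the exact declarations in
include/ncutf8.h may differ.

    /* Sketch of a fixed UTF-8 wrapper interface over utf8proc
       (illustrative; the real include/ncutf8.h may differ). */

    /* Check that 'name' is valid UTF-8;
       return NC_NOERR or an error code. */
    extern int nc_utf8_validate(const unsigned char* name);

    /* NFC-normalize 'utf8'; on success *normalp holds a newly
       allocated, nul-terminated, normalized string. */
    extern int nc_utf8_normalize(const unsigned char* utf8,
                                 unsigned char** normalp);

The point of the wrapper is that the rest of the library codes
against this small interface, so future utf8proc upgrades stay
localized to libdispatch/dutf8.c.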
The thread (Re: [netcdfgroup] nccopy fails with corrupted
double link list) shows that ncdump/nccopy was returning EPERM
instead of NC_EDAPCONSTRAINT when given a malformed constraint.
Also cleaned up a potential bug that could occur if the user
invokes nc_set_default_format before calling nc_open on a DAP URL.
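The problematic sequence looks roughly like this (a minimal
sketch; the URL is a placeholder):

    #include <netcdf.h>

    int main(void) {
        int ncid, oldfmt;
        /* Request netCDF-4 as the default create format. */
        nc_set_default_format(NC_FORMAT_NETCDF4, &oldfmt);
        /* Opening a DAP URL must not be affected by that default. */
        if(nc_open("http://example.com/opendap/dataset",
                   NC_NOWRITE, &ncid) == NC_NOERR)
            nc_close(ncid);
        return 0;
    }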
The hash field for variables was
not being computed. Fix is in nc4file.c.
Not sure how this ever worked for any variable.
What is also weird is that the dim hash is
apparently being computed.
Re: GitHub netcdf-c issue #271
The spurious HDF5 error output occurs for several reasons, including:
1. using H5Aopen_name instead of H5Aexists to test if attribute exists.
2. using H5Eset_auto instead of H5Eset_auto2.
There are probably others that will have to be extinguished
as encountered; a sketch of the two replacements is below.
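Both replacements, as an illustrative sketch (the object id and
attribute name are placeholders):

    #include <hdf5.h>

    void example(hid_t locid) {
        /* 1. Test attribute existence without triggering HDF5
              error output, instead of probing with H5Aopen_name. */
        if(H5Aexists(locid, "some_attribute") > 0) {
            /* ... open it only when it is known to exist ... */
        }
        /* 2. Use the versioned API to suppress automatic error
              printing on the default error stack. */
        H5Eset_auto2(H5E_DEFAULT, NULL, NULL);
    }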
P.S. I hope I did not overdo this and kill too much.
The hash field for phony dimensions was not being set
(in nc4hdf.c). Also added a test case (nc_test4/?).
Note that I searched for other similar failures and
did not find any, but I may have missed them.
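A sketch of the kind of fix involved, with the hash helper and
struct assumed rather than taken from the actual netcdf-c
internals:

    #include <string.h>

    typedef struct Dim { char name[64]; unsigned int hash; } Dim;

    /* djb2-style string hash, standing in for the library's
       internal name-hash function. */
    static unsigned int hash_name(const char* s) {
        unsigned int h = 5381;
        while(*s) h = h * 33 + (unsigned char)*s++;
        return h;
    }

    /* The missing step: set the hash when the phony dimension
       object is created. */
    static void set_dim_hash(Dim* dim) {
        dim->hash = hash_name(dim->name);
    }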
Add provenance information to netCDF-4 files.
This consists of a persistent attribute named
_NCProperties plus two computed attributes,
_IsNetcdf4 and _SuperblockVersion.
See the 'Provenance Attributes' section
of docs/attribute_conventions.md for details.
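For example, the persistent attribute can be read back as an
ordinary global text attribute (a minimal sketch; the file name
is a placeholder):

    #include <stdio.h>
    #include <stdlib.h>
    #include <netcdf.h>

    int main(void) {
        int ncid;
        size_t len;
        if(nc_open("example.nc", NC_NOWRITE, &ncid) != NC_NOERR) return 1;
        if(nc_inq_attlen(ncid, NC_GLOBAL, "_NCProperties", &len) == NC_NOERR) {
            char* text = malloc(len + 1);
            if(text != NULL) {
                nc_get_att_text(ncid, NC_GLOBAL, "_NCProperties", text);
                text[len] = '\0'; /* attribute text is not nul-terminated */
                printf("_NCProperties = %s\n", text);
                free(text);
            }
        }
        nc_close(ncid);
        return 0;
    }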
netcdf-c GitHub issue #178 / esupport BNL-694121
The ncgen man page says:
> Note also that the words `variable', `dimension', `data', `group',
> and `types' are legal CDL names, but be careful that there is a
> space between them and any following colon character when used as
> a variable name. This is mostly an issue with attribute declarations.
Ncdump does not obey this rule.
The fix is to modify ncdump/ncdump.c to check if a variable name is
a keyword.
Also added a test case.
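A minimal sketch of such a check (the keyword list follows the
man page wording; the actual ncdump/ncdump.c code may differ):

    #include <string.h>

    /* Return nonzero if 'name' is one of the CDL words that need
       special handling when used as a variable name. */
    static int iskeyword(const char* name) {
        static const char* keywords[] =
            {"variable", "dimension", "data", "group", "types", NULL};
        int i;
        for(i = 0; keywords[i] != NULL; i++)
            if(strcmp(name, keywords[i]) == 0) return 1;
        return 0;
    }

When iskeyword(varname) is true, ncdump can then emit a space
between the name and any following colon so that attribute
declarations remain parseable.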
that were not taking the CDF-5 format into account.
2. Had to update the ncgen.1 man page to define the new
k-flag rules to deal with CDF-5.
3. Had to fix some tests that use 'cmp' for comparison;
this practice really should be deprecated.
4. There was a bug in configure.ac with respect to using
the enable-netcdf-4 flag vs. the disable-netcdf-4 flag.
1. AC_CHECK_SIZEOF is not working because anti-virus
software will not allow very rapid creation/deletion of a
file with the same name.
2. Modified some test baselines to attempt to fix
Ward's issue.
1. Added a new environment variable, DAPRCFILE, for
specifying the rc file to use. Note that the value of this
environment variable should be the absolute path of the rc
file, not the path to its containing directory.
2. Fixed up testauth.sh and added some new tests.
3. Synched the oc library.
This supports better authorization handling for DAP
requests, especially redirection-based authorization. I also
added a new test case: ncdap_test/testauth.sh.
Specifically, suppose I have a netrc file /tmp/netrc
containing this line:
machine uat.urs.earthdata.nasa.gov login xxxxxx password yyyyyy
Also suppose I have a .ocrc file containing these lines:
HTTP.COOKIEJAR=/tmp/cookies
HTTP.NETRC=/tmp/netrc
Assume that .ocrc is in the local directory or in HOME.
Then this command should work (assuming a valid login and password):
ncdump -h "https://54.86.135.31/opendap/data/nc/fnoc1.nc"
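Alternatively, the rc file location can be pinned from code via
the new DAPRCFILE environment variable (a sketch; the paths and
URL are placeholders):

    #include <stdlib.h>
    #include <netcdf.h>

    int main(void) {
        int ncid;
        /* Absolute path to the rc file itself, not its directory. */
        setenv("DAPRCFILE", "/tmp/ocrc", 1);
        if(nc_open("https://example.com/opendap/data/nc/fnoc1.nc",
                   NC_NOWRITE, &ncid) == NC_NOERR)
            nc_close(ncid);
        return 0;
    }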
The pnetcdf support was not properly being used to provide
MPI parallel I/O for netcdf-3 classic files; the wrong
dispatch table was being used. The fix was to modify
dfile.c#NC_check_file_type to specify the pnetcdf dispatch
table when use_parallel is true.
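A sketch of the intended selection logic, with the type and
table names assumed rather than taken from the actual dfile.c:

    typedef struct NC_Dispatch NC_Dispatch;
    extern NC_Dispatch* NCP_dispatcher;  /* pnetcdf: MPI parallel I/O */
    extern NC_Dispatch* NC3_dispatcher;  /* serial netcdf-3 */

    /* For a classic-format file, honor the use_parallel flag when
       choosing the dispatch table (previously it was ignored). */
    static NC_Dispatch* choose_dispatcher(int use_parallel) {
        return use_parallel ? NCP_dispatcher : NC3_dispatcher;
    }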
If the netCDF-C library is built with the HDF5 library but
without the HDF4 library, and one attempts to open an HDF4
file, an abort occurs rather than a proper error code
(NC_ENOTNC) being returned.
The fix is to modify dfile.c#NC_check_file_type to properly
#ifdef the relevant tests.
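A sketch of the guard, where everything except NC_ENOTNC and
NC_NOERR is assumed rather than taken from the actual code:

    #include <netcdf.h>

    static int check_hdf4(int looks_like_hdf4) {
        if(!looks_like_hdf4) return NC_NOERR;
    #ifdef USE_HDF4
        /* ... proceed down the HDF4 open path ... */
        return NC_NOERR;
    #else
        /* HDF4 support not compiled in: report a proper error
           instead of aborting. */
        return NC_ENOTNC;
    #endif
    }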
There was a problem with the distcheck testing since the
test input files are in ${src_dir} while the test runs in
${build_dir}. So run_chunk_hdf4 was modified to copy the
inputs as necessary.
The remote DAP tests should be under ENABLE_DAP_REMOTE_TESTS.
Fixed to make sure that this is so.
Also attempted to fix ncdap_test/CMakeLists.txt,
but probably got it wrong.
HT to Nico Schlomer.
2. Attempted to reduce the number of conversion errors
when -Wconversion is set. Fixed oc2, but the
rest of netcdf remains to be done.
HT to Nico Schlomer.
3. While doing #2, I discovered an error in ncgen.y
that had remained hidden. This required some additional
test case fixes.