Using Cygwin64 causes a couple of bugs during make check.
Fixes are as follows:
1. In ncdump.c and nccopy.c, replace all occurrences of
'return EXIT_FAILURE'
and
'return EXIT_SUCCESS'
with
exit(EXIT_FAILURE)
and
exit(EXIT_SUCCESS)
respectively (a sketch follows this list).
I have no idea why this works.
2. In libdap2/ncdap3a.c#freeNCDAPCOMMON,
remove the call to nc_abort; it was
causing a loop. Not sure why this
is not caught under other operating systems.
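A minimal sketch of the substitution in item 1 (the surrounding program is purely illustrative):
    #include <stdlib.h>

    int main(void)
    {
        /* formerly: return EXIT_FAILURE;  (tripped up make check under Cygwin64) */
        exit(EXIT_FAILURE);   /* terminate via exit() instead of returning */
    }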
Some servers do not properly
implement the current DAP2 spec.
It turns out that this server is one of those:
http://nomads.ncep.noaa.gov:9090/
When a reference such as this is made:
http://nomads.ncep.noaa.gov:9090/dods/gens/gens20140123/gep_all_12z?prmslmsl[0][0][0][0:359]
it is returning this:
Dataset {
    float prmslmsl[ens=1][time=1][lat=1][lon=360];
} gens%2fgens20140123%2fgep_all_12z;
when it should be returning this:
Dataset {
    Structure {
        float prmslmsl[ens=1][time=1][lat=1][lon=360];
    } prmslmsl;
} gens%2fgens20140123%2fgep_all_12z;
The reason is that when picking fields out of a grid,
one must maintain the fully qualified name, so the grid
is converted to an enclosing structure.
It turns out that the problem was that
when I created the new structure node, I was
improperly linking it into the existing graph.
This caused a null pointer failure.
The fix is to make sure the relevant field (node->root)
is set.
The code that tests if a path is a url is
faulting when the url does not end in a slash
(e.g. http://thredds-tests.ucar.edu).
[NCF-273]/HZY-708311
Solution was to do a null pointer test.
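A rough sketch of the kind of guard involved (the helper and its name are hypothetical, not the actual library code):
    #include <string.h>

    /* Return the path portion of a url, tolerating urls with no trailing
     * slash such as http://thredds-tests.ucar.edu; never returns NULL. */
    static const char* url_path(const char* url)
    {
        const char* host  = strstr(url, "://");
        const char* slash = host ? strchr(host + 3, '/') : NULL;
        return slash ? slash : "";
    }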
Added a test (tst_misc).
The error occurs after an "exit:" label.
Corrected a dozen Coverity errors (mainly allocation issues, along with a few
other things):
711711, 711802, 711803, 711905, 970825, 996123, 996124, 1025787,
1047274, 1130013, 1130014, 1139538
Refactored internal fill-value code to correctly handle string types, and
especially to allow NULL pointers and null strings (i.e. "") to be
distinguished. The code now avoids partially aliasing the two together
(which only happened on the 'write' side of things and wasn't reflected on
the 'read' side, adding to the previous confusion).
Probably still weak on handling fill-values of variable-length and compound
datatypes.
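A hedged sketch of the NULL-vs-empty-string distinction for an NC_STRING variable (the variable itself is hypothetical):
    #include <stdio.h>
    #include <netcdf.h>

    /* Write two elements of an NC_STRING variable; an empty string ""
     * and a NULL pointer are now kept distinct on the write side. */
    static int write_string_pair(int ncid, int varid)
    {
        const char *vals[2] = { "", NULL };   /* null string vs. no string at all */
        int status = nc_put_var_string(ncid, varid, vals);
        if (status != NC_NOERR)
            fprintf(stderr, "%s\n", nc_strerror(status));
        return status;
    }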
Refactored the recursive metadata reads a bit more, to process HDF5 named
datatypes and datasets immediately, avoiding chewing up memory for those
types of objects, etc.
Finished uncommenting and updating the nc_test4/tst_fills2.c code (as I'm
proceeding alphabetically through the nc_test4 code files).
Add a new function called nc_inq_format_extended that
returns more detailed format information (vis-a-vis
nc_inq_format) about an open dataset.
Note that the netcdf API will present the file as if it had
the format specified by nc_inq_format. The true file
format, however, may not even be a netcdf file; it might be
DAP, HDF4, or PNETCDF, for example. This function returns
that true file type. It also returns the effective mode for
the file.
signature: nc_inq_format_extended(int ncid, int* formatp, int* modep)
where
* ncid is the NetCDF ID from a previous call to nc_open() or
nc_create().
* formatp is a pointer to a location for returned true format.
* modep is a pointer to a location for returned mode flags.
Refer to the actual list in the file netcdf.h to see the
currently defined set.
Also added test cases (tst_formatx*).
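A minimal usage sketch (error handling mostly omitted):
    #include <stdio.h>
    #include <netcdf.h>

    static void show_formats(const char *path)
    {
        int ncid, fmt, extfmt, mode;
        if (nc_open(path, NC_NOWRITE, &ncid) != NC_NOERR)
            return;
        nc_inq_format(ncid, &fmt);                     /* format as presented by the API */
        nc_inq_format_extended(ncid, &extfmt, &mode);  /* true underlying format and mode */
        printf("api format=%d true format=%d mode=0x%x\n", fmt, extfmt, (unsigned)mode);
        nc_close(ncid);
    }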
Fix bug where a leading backslash-escaped digit
in a name was not being properly handled.
The reason was that I accidentally attempted to allow \x... and \0...
escapes in identifiers. This made identifiers
with leading escaped digits no longer work.
Also added a test case.
Fix HTTP Basic Authorization.
The problem is really in oc2.0: in order for it to work,
CURLOPT_COOKIEJAR must have a non-null value. The code
was already there, but not being used for some reason
(a sketch follows the list below).
1. fixed cookiejar code in oc2.0
2. synched oc2.0 with netcdf-c/oc2
3. added a test case
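A hedged sketch of the relevant curl setup (the cookiejar path and credential string are hypothetical):
    #include <curl/curl.h>

    static void enable_basic_auth(CURL *curl)
    {
        /* Basic authorization only works when CURLOPT_COOKIEJAR has a
         * non-null value. */
        curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "/tmp/.occookies");
        curl_easy_setopt(curl, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
        curl_easy_setopt(curl, CURLOPT_USERPWD, "username:password");
    }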
1. Fixed an algorithm that was effectively O(n^3); it is now O(n^2).
2. If the list of prefetched variables is too long
(something on the order of 400 variables), then
the server may reject it. Modified the code so that
when the set of prefetched variables is
in fact all variables, it does not create a long
request. This does not actually solve the problem
if the prefetch list is long but not all-inclusive.
Added support for group renaming. The primary API
is 'nc_rename_grp(int grpid, const char* name)'.
No test cases provided yet.
This also required adding a rename_grp entry
to the dispatch tables.
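A minimal usage sketch (the group names are hypothetical):
    #include <netcdf.h>

    static int rename_subgroup(int ncid)
    {
        int grpid;
        int status = nc_inq_grp_ncid(ncid, "old_name", &grpid);
        if (status == NC_NOERR)
            status = nc_rename_grp(grpid, "new_name");
        return status;
    }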
The DAP code compiles constraints into an internal
tree form. When it goes to fetch data, it converts
that tree form into a string suitable for use in a url.
The tree->string code was using '<' when it should be
using '<=' and vice versa.
The fix is to make sure the right operator is used.
The revisions are simpler and, I hope, clearer than all
previous versions. Part of the code was taken from the
ucar.nc2.Range.compose function and modified to handle edge
cases not handled by Range.compose.
The problem was that the NC_create
code was not checking for the NC_CLASSIC_MODEL
mode flag in deciding what dispatch table to use.
This meant that it was defaulting to
the default format, and if that had been changed
to e.g. NC_FORMAT_NETCDF4, then it would try
to create a netcdf-4 format file even if the
NC_CLASSIC_MODEL mode flag was set.
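A sketch of the behavior the fix restores (the file name is hypothetical; this illustrates the intent rather than a test from the repository):
    #include <netcdf.h>

    static int create_classic_model_file(void)
    {
        int ncid, status;

        /* even with the default format changed to netCDF-4 ... */
        nc_set_default_format(NC_FORMAT_NETCDF4, NULL);

        /* ... NC_CLASSIC_MODEL (without NC_NETCDF4) should still select
         * the classic dispatch table */
        status = nc_create("classic.nc", NC_CLOBBER | NC_CLASSIC_MODEL, &ncid);
        if (status == NC_NOERR)
            nc_close(ncid);
        return status;
    }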
Fix for JIRA issue NCF-241.
This is only temporary
until I can make pnetcdf
operate as a separate dispatch table.
Also, fix nc_test4/tst_pnetcdf
to open with nc_open_par;
this is necessary because a pnetcdf-created
file cannot be opened
as a netcdf classic file.
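A hedged sketch of the tst_pnetcdf change, reopening the file in parallel (the mode and communicator choices are illustrative):
    #include <mpi.h>
    #include <netcdf.h>
    #include <netcdf_par.h>

    static int reopen_parallel(const char *path, int *ncidp)
    {
        /* a pnetcdf-created file must be reopened with nc_open_par,
         * not nc_open */
        return nc_open_par(path, NC_NOWRITE, MPI_COMM_WORLD, MPI_INFO_NULL, ncidp);
    }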
Modified the prefetch mechanism to do prefetch on either a lazy
or eager basis. Lazy means that
the prefetch does not occur
until and unless the client actually
makes a get_var request.
Also repaired a problem where
doing prefetch on a url that
has a constraint would prefetch
a whole variable if its constrained
size was small enough, even if the
underlying variable was too large
to warrant prefetch.
1. Marked nc_get/put_varm to indicate that they only work for atomic types,
not user defined types.
2. Modified NCDEFAULT_{get/put}_vars to no longer use
nc_get/put_varm; they now use nc_get/put_vara
directly. This means that nc_get/put_vars now work
properly for user defined types as well as atomic types.
3. Added test cases for get_vars/put_vars with a
user defined type. Tests placed into
nc_test/tst_compounds.c
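A hedged sketch of the kind of call that now works for user defined types (the compound layout and variable are hypothetical):
    #include <stddef.h>
    #include <netcdf.h>

    struct obs { int id; double val; };   /* matches a compound type in the file */

    static int read_strided(int ncid, int varid, struct obs out[5])
    {
        size_t    start[1]  = {0};
        size_t    count[1]  = {5};
        ptrdiff_t stride[1] = {2};   /* every other element */

        /* no longer routed through nc_get_varm, so this works for
         * user defined types as well as atomic types */
        return nc_get_vars(ncid, varid, start, count, stride, out);
    }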
Primarily focused on memory errors falling into a couple of different types:
1) Static overrun errors.
2) Dereferences of uninitialized memory.
make distcheck works after applying these fixes, and Coverity no longer sees an issue, so hopefully they are properly resolved.
Refactored the code so that struct NC
contains as little file-type specific info as possible. It
modifies especially libsrc so that all of the netcdf-3 data
that used to be in struct NC is now kept in a separate chunk
of data pointed to by the struct NC. This makes all of the
current protocols consistent: netcdf-3, netcdf-4, and dap.
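An illustrative sketch of the resulting layout (field names here are not authoritative):
    /* the common NC object keeps only format-neutral state and points at
     * a format-specific chunk owned by the active dispatch layer */
    struct NC_Dispatch;              /* netcdf-3, netcdf-4, or dap table */

    typedef struct NC {
        int   ext_ncid;              /* id handed back to the user */
        char *path;                  /* file name or url */
        const struct NC_Dispatch *dispatch;
        void *dispatchdata;          /* e.g. the old netcdf-3 innards */
    } NC;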
o Corrected the check for curl when building the DLL; for some reason it just assumed curl was missing and no real check was performed.
o Made configuration scripts a little more generic.
o Modified daputil.h to externalize nc__testurl on Windows.
In-memory files can now be made persistent if nc_create is called with
the NC_DISKLESS|NC_WRITE flags set. An initial test case is also included.
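A minimal sketch (the file name is hypothetical):
    #include <netcdf.h>

    static int create_persistent_inmemory(void)
    {
        int ncid;
        /* NC_DISKLESS|NC_WRITE: build the file in memory, then persist
         * it to disk when it is closed */
        int status = nc_create("mem.nc", NC_DISKLESS | NC_WRITE | NC_CLOBBER, &ncid);
        if (status == NC_NOERR)
            status = nc_close(ncid);
        return status;
    }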
- Modified ncio mechanism to support
multiple ncio packages; this is so we
can have posixio and memio operating
at the same time.
- Cleaned up a bunch of lint issues (unused variables, etc.).
- Fix NCF-157 to modify DAP code to support
partial variable retrieval.
- Fix of NCF-154 to solve problem of ncgen
improperly processing data lists for variables
of size greater than 2**18 bytes.
- Fix ncgen processing of char variables that have
multiple unlimited dimensions.
- Partly fix Jira issue: NCF-145 (vlen issues).
- The benchmark programs (nc_test4/tst_ar4_*) require arguments
and should only be invoked inside a shell
script; fixed so that they terminate cleanly
if invoked with no arguments.
- Fix the Doxygen processing so it will work
with make distcheck.
- Begin switchover to using an alternative to ncio.
- Begin support for in-memory (diskless) files.
NCF-42: _Format attribute sometimes being ignored
NCF-43: Fixed unsigned long long parsing.
NCF-47: Make the opendap code properly handle illegal names like "x.y" by
suppressing them.
NCF-49: Check for the uint type.
NCF-50: Properly handle username:pwd embedded in urls.