Some servers do not properly
implement the current DAP2 spec.
It turns out that this server is one of those:
http://nomads.ncep.noaa.gov:9090/
When a reference such as this is made:
http://nomads.ncep.noaa.gov:9090/dods/gens/gens20140123/gep_all_12z?prmslmsl[0][0][0][0:359]
it is returning this:
    Dataset {
        float prmslmsl[ens=1][time=1][lat=1][lon=360];
    } gens%2fgens20140123%2fgep_all_12z;
when it should be returning this:
    Dataset {
        Structure {
            float prmslmsl[ens=1][time=1][lat=1][lon=360];
        } prmslmsl;
    } gens%2fgens20140123%2fgep_all_12z;
The reason is that when picking fields out of a grid,
one must maintain the fully qualified name, so the grid
is converted to an enclosing structure.
It turns out that the problem was that
when creating the new structure node, the code
was improperly linking it into the existing graph.
This caused a null-pointer failure in cleanup code
after an "exit:" label.
The fix is to make sure the relevant field (node->root)
is set.
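The shape of the fix, as a minimal sketch; the node type and its
name/container/root fields are illustrative stand-ins here, not
verbatim copies of the library internals:

    #include <stdlib.h>
    #include <string.h>

    /* Minimal stand-ins for the real node type; illustrative only. */
    typedef enum {NT_Primitive, NT_Structure} NodeType;
    typedef struct CDFnode {
        NodeType nctype;
        char* name;
        struct CDFnode* container;
        struct CDFnode* root;   /* back-pointer to the tree root */
    } CDFnode;

    /* Wrap a grid field in a new enclosing Structure node so that
     * its fully qualified name is preserved. */
    static CDFnode*
    wrapfield(CDFnode* field, CDFnode* grid)
    {
        CDFnode* snode = (CDFnode*)calloc(1,sizeof(CDFnode));
        if(snode == NULL) return NULL;
        snode->nctype = NT_Structure;
        snode->name = strdup(grid->name);
        snode->container = grid->container;
        /* The bug: this back-pointer was left NULL, producing the
         * null-pointer failure during later traversal. */
        snode->root = grid->root;
        field->container = snode;
        return snode;
    }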
Corrected a dozen Coverity errors (mainly allocation issues, along with a few
other things):
711711, 711802, 711803, 711905, 970825, 996123, 996124, 1025787,
1047274, 1130013, 1130014, 1139538
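Most of the allocation fixes follow the usual cleanup-label
pattern; a representative sketch of the kind of change involved
(illustrative names, not the actual flagged code):

    #include <stdlib.h>
    #include <string.h>
    #include "netcdf.h"   /* NC_NOERR, NC_ENOMEM */

    /* Release partially built resources on every error path
     * via a single cleanup label. */
    static int
    build_list(size_t n, char*** listp)
    {
        int stat = NC_NOERR;
        size_t i;
        char** list = (char**)calloc(n,sizeof(char*));
        if(list == NULL) {stat = NC_ENOMEM; goto exit;}
        for(i=0;i<n;i++) {
            list[i] = strdup("item");
            if(list[i] == NULL) {stat = NC_ENOMEM; goto exit;}
        }
        *listp = list;
        return stat;
    exit:
        if(list != NULL) {
            for(i=0;i<n;i++) free(list[i]);
            free(list);
        }
        return stat;
    }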
Refactored the internal fill-value code to correctly handle string
types, and especially to allow NULL pointers and empty strings
(i.e. "") to be distinguished. The code now avoids partially
aliasing the two together (which happened only on the 'write' side
of things and wasn't reflected on the 'read' side, adding to the
previous confusion).
Probably still weak on handling fill-values of variable-length and compound
datatypes.
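For illustration, the NULL-versus-empty-string distinction now
preserved looks roughly like this (a sketch, not the internal
routine):

    #include <stdlib.h>
    #include <string.h>
    #include "netcdf.h"   /* NC_NOERR, NC_ENOMEM */

    /* Keep a NULL pointer and an empty string ("") distinct when
     * copying a string fill value, instead of collapsing one into
     * the other. */
    static int
    copy_string_fill(const char* src, char** dstp)
    {
        if(src == NULL) {
            *dstp = NULL;          /* no fill string at all */
        } else {
            *dstp = strdup(src);   /* "" copies to a distinct "" */
            if(*dstp == NULL) return NC_ENOMEM;
        }
        return NC_NOERR;
    }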
Refactored the recursive metadata reads a bit more, to process HDF5 named
datatypes and datasets immediately, avoiding chewing up memory for those
types of objects, etc.
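The traversal pattern amounts to roughly the following, sketched
against the HDF5 1.8 API; the two read_*_md helpers are
hypothetical placeholders:

    #include "hdf5.h"

    /* Hypothetical placeholders for the immediate metadata reads. */
    static herr_t read_dataset_md(hid_t gid, const char* name, void* st)
    { (void)gid; (void)name; (void)st; return 0; }
    static herr_t read_type_md(hid_t gid, const char* name, void* st)
    { (void)gid; (void)name; (void)st; return 0; }

    /* Process datasets and named datatypes as soon as they are
     * visited; only groups are recursed into, so no list of
     * pending objects accumulates in memory. */
    static herr_t
    visit_link(hid_t gid, const char* name, const H5L_info_t* info,
               void* op_data)
    {
        H5O_info_t oinfo;
        (void)info;
        if(H5Oget_info_by_name(gid, name, &oinfo, H5P_DEFAULT) < 0)
            return -1;
        switch(oinfo.type) {
        case H5O_TYPE_DATASET:
            return read_dataset_md(gid, name, op_data);
        case H5O_TYPE_NAMED_DATATYPE:
            return read_type_md(gid, name, op_data);
        case H5O_TYPE_GROUP: {
            herr_t ret;
            hid_t sub = H5Gopen2(gid, name, H5P_DEFAULT);
            if(sub < 0) return -1;
            ret = H5Literate(sub, H5_INDEX_NAME, H5_ITER_INC, NULL,
                             visit_link, op_data);
            H5Gclose(sub);
            return ret;
        }
        default: return 0;
        }
    }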
Finished uncommenting and updating the nc_test4/tst_fills2.c code (as I'm
proceeding alphabetically through the nc_test4 code files).
Fix HTTP Basic Authorization.
The problem is really in oc2.0.
In order for it to work,
CURLOPT_COOKIEJAR must have
a non-null value. The code
was already there, but was not
being used for some reason.
1. fixed cookiejar code in oc2.0
2. synched oc2.0 with netcdf-c/oc2
3. added a test case
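For reference, the libcurl usage involved, as a minimal standalone
sketch (URL, credentials, and cookie-jar path are placeholders):

    #include <curl/curl.h>

    int main(void)
    {
        CURLcode rc;
        CURL* curl = curl_easy_init();
        if(curl == NULL) return 1;
        curl_easy_setopt(curl, CURLOPT_URL,
                         "http://example.com/dods/dataset.dds");
        curl_easy_setopt(curl, CURLOPT_HTTPAUTH, (long)CURLAUTH_BASIC);
        curl_easy_setopt(curl, CURLOPT_USERPWD, "user:password");
        /* The essential piece: give the cookie jar a non-null
         * value so the cookie code is actually engaged. */
        curl_easy_setopt(curl, CURLOPT_COOKIEJAR, "/tmp/occookies");
        rc = curl_easy_perform(curl);
        curl_easy_cleanup(curl);
        return (rc == CURLE_OK ? 0 : 1);
    }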
1. The prefetch computation was effectively O(n^3);
modified to be O(n^2).
2. If the list of prefetched variables is too long
(something on the order of 400 variables), then
the server may reject it. Modified the code so that
when the set of prefetched variables is in fact
all variables, it does not create a long request
(see the sketch below). This does not actually solve
the problem when the prefetch list is long but not
all-inclusive.
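The shortened-request logic is roughly as follows (names here are
illustrative; in DAP2 an empty projection means "fetch everything"):

    #include <string.h>

    /* Build the projection part of the request URL. When every
     * variable is being prefetched, emit no projection at all,
     * avoiding a request listing hundreds of names that the
     * server may reject. */
    static void
    build_projection(char* buf, size_t buflen, char** vars,
                     size_t nvars, size_t ntotal)
    {
        size_t i;
        buf[0] = '\0';
        if(nvars == ntotal) return; /* all vars: empty projection */
        for(i=0;i<nvars;i++) {
            if(i > 0) strncat(buf,",",buflen-strlen(buf)-1);
            strncat(buf,vars[i],buflen-strlen(buf)-1);
        }
    }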