- Add --has-fortran, in addition to the specific --has-f90, --has-f03
- Add --libdir to print just the libdir
- Also, 'which nf-config' will spit out errors if nf-config is not found; silence these errors.
Modified the provenance code to allocate only the minimal space needed for the _NCProperties attribute in the file. This basically required using malloc in the provenance code and in ncdump. Otherwise it should cause no externally visible effects. Also removed ENABLE_FILEINFO from configure.ac, since the provenance code is no longer optional.
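A minimal sketch of the allocation idea, not the actual provenance code: the function name, arguments, and attribute format here are placeholders; the point is only that the buffer is sized exactly from the pieces being written rather than using a fixed-size array.

```
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: build an attribute string from two known pieces,
 * allocating exactly the space the result needs. The real _NCProperties
 * format differs; this only illustrates sizing the malloc precisely. */
static char *
build_props_string(const char *version, const char *libraries)
{
   size_t len = strlen("version=") + strlen(version)
              + 1                       /* '|' separator */
              + strlen(libraries)
              + 1;                      /* terminating NUL */
   char *props = malloc(len);
   if (props == NULL)
      return NULL;
   snprintf(props, len, "version=%s|%s", version, libraries);
   return props;                        /* caller must free */
}
```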
This modifies the previous change to be more pedantically correct. It should always be an NC_EINVALCOORDS error if `start` exceeds `fdims[d2]`; however, if `start` equals `fdims[d2]`, then it is only an error if `count` is non-zero.
The following code is in nc4hdf.c, function `nc4_put_vara`.
```
/* Check dimension bounds. Remember that unlimited dimensions can
 * put data beyond their current length. */
for (d2 = 0; d2 < var->ndims; d2++)
{
   dim = var->dim[d2];
   assert(dim && dim->dimid == var->dimids[d2]);
   if (!dim->unlimited)
   {
      if (start[d2] >= (hssize_t)fdims[d2])
         BAIL_QUIET(NC_EINVALCOORDS);
      if (start[d2] + count[d2] > fdims[d2])
         BAIL_QUIET(NC_EEDGE);
   }
}
```
There is an issue when the process with the highest rank has zero items to output. As an example, suppose I have 4 MPI processes, each writing the following amount of data:
* rank 0: 0 items
* rank 1: 2548 items
* rank 2: 4352 items
* rank 3: 0 items.
I will define the variable to have a length of 6900 items (0 + 2548 + 4352 + 0). When I am outputting data to the variable, each rank will call nc_put_vara_longlong with the following start and count values:
* rank 0: start = 0, count = 0
* rank 1: start = 0, count = 2548
* rank 2: start = 2548, count = 4352
* rank 3: start = 6900, count = 0.
In each case, the `start` for rank N is equal to the `start` for rank N-1 plus the `count` for rank N-1. This all works until the highest rank is writing 0 items: in that case, the `start` value for that rank equals the total size of the variable, and the check in the code fragment shown above fails, since `start[] == fdims[]`.
This could be fixed in the application code by checking whether the `count` is zero and, if so, setting `start` to 0 as well, but I think that is a kludge that should not be required.
Note that this test appears three times in this file. In one case, the check for non-zero count already exists, but not in the other two. This pull request adds the check to the other two tests.
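Roughly, the adjusted check described above looks like the following. This is a sketch based on the description, not the exact patch text; the NC_EINVALCOORDS error is raised when `start` exceeds `fdims[d2]`, or when it equals `fdims[d2]` and `count` is non-zero.

```
for (d2 = 0; d2 < var->ndims; d2++)
{
   dim = var->dim[d2];
   assert(dim && dim->dimid == var->dimids[d2]);
   if (!dim->unlimited)
   {
      /* start == fdims[d2] is acceptable when nothing is written. */
      if (start[d2] > (hssize_t)fdims[d2] ||
          (start[d2] == (hssize_t)fdims[d2] && count[d2] > 0))
         BAIL_QUIET(NC_EINVALCOORDS);
      if (start[d2] + count[d2] > fdims[d2])
         BAIL_QUIET(NC_EEDGE);
   }
}
```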
As best I can tell, this should be ENABLE_PARALLEL4 instead of ENABLE_PARALLEL. ENABLE_PARALLEL is not used anywhere other than in a couple of documentation files, but ENABLE_PARALLEL4 is set in the top-level CMakeLists.txt file if a parallel HDF5 library is detected.
If H5Aopen_idx on line 1964 fails, then attid will be < 0. The BAIL will goto exit at line 1989, and then the test of "if (attid ...)" at line 1995 will pass (attid != 0), so H5Aclose(attid) is called with a negative attid. There is a similar issue for spaceid.
The result of the function is probably the same, since there is a failure somewhere, but it is more difficult to track down when it looks like the failure is happening in the wrong place.
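A sketch of the kind of guard that avoids this, assuming only that valid HDF5 identifiers are positive; the helper and its arguments are illustrative, not the actual cleanup code:

```
#include <hdf5.h>

/* Only close ids that were actually opened: a failed H5Aopen_idx or
 * dataspace call leaves the id negative, and passing that to
 * H5Aclose/H5Sclose reports an error in the wrong place. */
static void
cleanup_att_handles(hid_t attid, hid_t spaceid)
{
   if (attid > 0)
      H5Aclose(attid);
   if (spaceid > 0)
      H5Sclose(spaceid);
}
```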
The multiple definitions of `typedef struct DCEparsestate` near lines 10 and 42 cause compiler problems with some versions of gcc. Remove the second typedef.
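For illustration only (this is not the actual source), the pattern in question looks like the following; repeating a typedef of the same name is accepted in C11 but rejected as a redefinition by older language modes and gcc versions:

```
typedef struct DCEparsestate DCEparsestate;   /* first declaration (near line 10) */

/* ... intervening declarations ... */

typedef struct DCEparsestate DCEparsestate;   /* duplicate (near line 42): an error in
                                                 pre-C11 modes; the fix drops this one */
```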
The problem is that the name was being updated before the old variable was removed from the hashmap. The removal code checks whether the key and the name of the variable being removed match, but since the name had already been updated, the names did not match and the variable was not removed. This patch removes the variable from the hashmap first, then updates the name, and then adds the variable with the new name to the hashmap.
Similar change for renaming dimensions.
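A sketch of the corrected ordering; the hashmap helpers, types, and field names below are hypothetical stand-ins, not the library's actual API:

```
#include <stdlib.h>
#include <string.h>

/* Hypothetical types and helpers standing in for the real hashmap API. */
struct hashmap;
struct var { char *name; };
int  map_remove(struct hashmap *map, const char *key);   /* returns nonzero on success */
void map_add(struct hashmap *map, const char *key, void *value);

/* Rename a variable: remove it from the map while the stored key still
 * matches the old name, then update the name, then re-insert. */
static int
rename_var(struct hashmap *map, struct var *var, const char *newname)
{
   char *dup = strdup(newname);
   if (dup == NULL)
      return -1;

   /* 1. Remove the entry while var->name still matches the hashmap key. */
   if (!map_remove(map, var->name))
   {
      free(dup);
      return -1;
   }

   /* 2. Only now update the name. */
   free(var->name);
   var->name = dup;

   /* 3. Re-insert the variable under its new name. */
   map_add(map, var->name, var);
   return 0;
}
```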