Clean up doxygen warnings that were being treated as failures.

Ward Fisher 2023-12-21 09:39:44 -07:00
parent f2eef5a262
commit 3361fc5901
4 changed files with 33 additions and 11 deletions


@@ -520,7 +520,7 @@ Modifying the dispatch version requires two steps:
The two should agree in value.
### NC_DISPATCH_VERSION Incompatibility
## NC_DISPATCH_VERSION Incompatibility
When dynamically adding a dispatch table
-- in nc_def_user_format (see libdispatch/dfile.c) --


@@ -65,6 +65,28 @@ The concept of a variable-sized type is defined as follows:
then that compound type is variable-sized.
4. All other types are fixed-size.
## A Warning on Backward Compatibility {#filters_compatibility}
The API defined in this document should accurately reflect the
current state of filters in the netCDF-c library. Be aware that
there was a short period in which the filter code was undergoing
some revision and extension. Those extensions have largely been
reverted. Unfortunately, previously working code may fail to
compile because of these reversions. In that case, please revise
your code to adhere to this document. Apologies are extended for
any inconvenience.
A user may encounter an incompatibility if any of the following appears in user code.
* The function *nc\_inq\_var\_filter* was returning the error value NC\_ENOFILTER if a variable had no associated filters.
It has been reverted to the previous behavior, where it returns NC\_NOERR and sets the returned filter id to zero if the variable has no filters.
* The function *nc\_inq\_var\_filterids* was renamed to *nc\_inq\_var\_filter\_ids*.
* Some auxiliary functions for parsing textual filter specifications have been moved to the file *netcdf\_aux.h*. See [Appendix A](#filters_appendixa).
* All of the "filterx" functions have been removed. This is unlikely to cause problems because they had limited visibility.
For additional information, see [Appendix B](#filters_appendixb).
## Enabling A HDF5 Compression Filter {#filters_enable}
HDF5 supports dynamic loading of compression filters using the


@@ -8,13 +8,13 @@ This document attempts to record important information about
the internal architecture and operation of the netcdf-c library.
It covers the following issues.
* [Including C++ Code in the netcdf-c Library](#intern_c++)
* [Including C++ Code in the netcdf-c Library](#intern_cpp)
* [Managing instances of variable-length data types](#intern_vlens)
* [Inferring File Types](#intern_infer)
* [Adding a Standard Filter](#intern_filters)
* [Test Interference](#intern_isolation)
# 1. Including C++ Code in the netcdf-c Library {#intern_c++}
# 1. Including C++ Code in the netcdf-c Library {#intern_cpp}
The state of C compiler technology has reached the point where
it is possible to include C++ code into the netcdf-c library


@@ -449,16 +449,16 @@ Here are a couple of examples using the _ncgen_ and _ncdump_ utilities.
```
4. Create an nczarr file using S3 as storage and keeping to the pure zarr format.
```
ncgen -4 -lb -o "s3://s3.uswest-1.amazonaws.com/datasetbucket#mode=zarr" dataset.cdl
ncgen -4 -lb -o 's3://s3.uswest-1.amazonaws.com/datasetbucket#mode=zarr' dataset.cdl
```
5. Create an nczarr file using the s3 protocol with a specific profile
```
ncgen -4 -lb -o "s3://datasetbucket/rootkey#mode=nczarr,awsprofile=unidata" dataset.cdl
ncgen -4 -lb -o 's3://datasetbucket/rootkey#mode=nczarr,awsprofile=unidata' dataset.cdl
```
Note that the URL is internally translated to this
````
"https://s3.<region>.amazonaws.com/datasetbucket/rootkey#mode=nczarr,awsprofile=unidata" dataset.cdl
````
```
'https://s3.<region>.amazonaws.com/datasetbucket/rootkey#mode=nczarr,awsprofile=unidata' dataset.cdl
```
# References {#nczarr_bib}
@@ -473,7 +473,7 @@ collections — High-performance dataset datatypes](https://docs.python.org/2/li
<a name="dynamic_filter_loading">[8]</a> [Dynamic Filter Loading](https://support.hdfgroup.org/HDF5/doc/Advanced/DynamicallyLoadedFilters/HDF5DynamicallyLoadedFilters.pdf)<br>
<a name="official_hdf5_filters">[9]</a> [Officially Registered Custom HDF5 Filters](https://portal.hdfgroup.org/display/support/Registered+Filter+Plugins)<br>
<a name="blosc-c-impl">[10]</a> [C-Blosc Compressor Implementation](https://github.com/Blosc/c-blosc)<br>
<a name="ref_awssdk_conda">[11]</a> [Conda-forge / packages / aws-sdk-cpp](https://anaconda.org/conda-forge/aws-sdk-cpp)<br>
<a name="ref_awssdk_conda">[11]</a> [Conda-forge packages / aws-sdk-cpp](https://anaconda.org/conda-forge/aws-sdk-cpp)<br>
<a name="ref_gdal">[12]</a> [GDAL Zarr](https://gdal.org/drivers/raster/zarr.html)<br>
# Appendix A. Building NCZarr Support {#nczarr_build}
@@ -539,7 +539,7 @@ PATH="$PATH:${AWSSDKBIN}"
Then the following options must be specified for cmake.
````
-DAWSSDK_ROOT_DIR=${AWSSDK_ROOT_DIR}
-DAWSSDK_DIR=${AWSSDK_ROOT_DIR}/lib/cmake/AWSSDK"
-DAWSSDK_DIR=${AWSSDK_ROOT_DIR}/lib/cmake/AWSSDK
````
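For concreteness, these options might be combined into a full cmake invocation along the following lines; the SDK install path is illustrative and must be adjusted to wherever the AWS SDK was actually installed.

```shell
# Illustrative: point AWSSDK_ROOT_DIR at the actual SDK install prefix.
AWSSDK_ROOT_DIR=/usr/local/aws-sdk-cpp
cmake .. \
  -DAWSSDK_ROOT_DIR=${AWSSDK_ROOT_DIR} \
  -DAWSSDK_DIR=${AWSSDK_ROOT_DIR}/lib/cmake/AWSSDK
```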
# Appendix B. Amazon S3 Imposed Limits {#nczarr_s3limits}
@@ -578,7 +578,7 @@ can in fact be any legal JSON expression.
This "convention" is currently used routinely to help support various
attributes created by other packages where the attribute is a
complex JSON expression. An example is the GDAL Driver
convention <a href="#ref_gdal">[12]</a>, where the value is a complex
convention <a href='#ref_gdal'>[12]</a>, where the value is a complex
JSON dictionary.
In order for NCZarr to be as consistent as possible with Zarr Version 2,