Merge pull request #187 from Unidata/issue185

The max dimension sizes do not take CDF-5 format into account.
This commit is contained in:
Ward Fisher 2016-01-08 15:35:38 -07:00
commit a4d826e4fe
21 changed files with 538 additions and 84 deletions


@ -20,4 +20,4 @@ before_install:
script:
- docker run --rm -it -h "$CURHOST" -e USEDASH=FALSE -e RUNF=OFF -e RUNCXX=OFF -e RUNP=OFF -e RUNNCO=OFF -e USECMAKE=$USECMAKE -e USEAC=$USEAC -e COPTS=$COPTS -v $(pwd):/netcdf-c $DOCKIMG
- docker run --rm -it -h "$CURHOST" -e USEDASH=FALSE -e RUNF=OFF -e RUNCXX=OFF -e RUNP=OFF -e RUNNCO=OFF -e USECMAKE=$USECMAKE -e USEAC=$USEAC -e COPTS=$COPTS -e CTEST_OUTPUT_ON_FAILURE=1 -v $(pwd):/netcdf-c $DOCKIMG

CONTRIBUTING.html Normal file

@ -0,0 +1,248 @@
<html>
<body>
<img src="http://www.unidata.ucar.edu/images/logos/thredds_tds-150x150.png"/>
<h1>Welcome contributors!</h1>
First off, thank you for your interest in contributing to the THREDDS project!
This repository contains the code for both netCDF-Java and the THREDDS Data Server (TDS) projects.
The other projects held under the THREDDS umbrella are <a href="https://github.com/unidata/rosetta">Rosetta</a> and the latest addition, <a href="https://github.com/unidata/siphon">Siphon</a> (a Python interface to the TDS).
<h2>Process Overview</h2>
<ul>
<li> <a href="#gh-setup">GitHub Setup</a>
<ul>
<li> <a href="#gh-join">Join Github!</a>
<li> <a href="#gh-fork">Fork the Unidata THREDDS project</a>
<li> <a href="#gh-pull-ud-thredds">Pull down local copy of the Unidata THREDDS project</a>
<li> <a href="#gh-pull-personal-thredds">Add and pull down a local copy of your THREDDS project fork</a>
</ul>
<li> <a href="#gh-contrib-workflow">Contribution workflow</a>
<ul>
<li> <a href="#gh-sync-ud">Make sure you have the latest changes from Unidata THREDDS repository</a>
<li> <a href="#gh-branch">Make a new branch for your work and start hacking</a>
<li> <a href="#gh-history-cleanup">Clean up your git commit history</a>
<li> <a href="#gh-final-commit-for-pr">Push changes to your fork to use for the pull request</a>
<li> <a href="#gh-pr">Make the pull request</a>
</ul>
<li> <a href="#gh-now-what">Now what?</a>
</ul>
</ul>
<h2><a name="gh-setup">GitHub Setup</a></h2>
<h3><a name="gh-join">Join Github!</a></h3>
To get started contributing to the THREDDS project, the first thing you should do is <a href="https://github.com/join">sign up for a free account on GitHub</a>.
<h3><a name="gh-fork">Fork the Unidata THREDDS project</a></h3>
Once you have an account, go ahead and <a href="https://github.com/unidata/thredds#fork-destination-box">fork</a> the THREDDS project.
By forking the project, you will have a complete copy of the THREDDS project, history and all, under your personal account.
This will allow you to make pull requests against the Unidata THREDDS repository, which is the primary mechanism used to add new code to the project (even Unidata developers do this!).
<h3><a name="gh-pull-ud-thredds">Pull down local copy of the Unidata THREDDS project</a></h3>
To pull down the source code to your local machine, clone the Unidata repository using git:
<pre>git clone --origin unidata git@github.com:Unidata/thredds.git (for ssh)</pre>
or
<pre>git clone --origin unidata https://github.com/Unidata/thredds.git (for http)</pre>
Note that these commands reference the Unidata repository.
<p>
Normally in git, the remote repository you clone from is automatically named 'origin'.
To help avoid confusion when making pull requests, the commands above name the remote repository 'unidata' instead.
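This renaming can be verified locally. A quick self-contained check (git must be on your PATH; the temporary directory and 'git remote add' stand in for what 'git clone --origin unidata' sets up):

```shell
# Throwaway repository demonstrating the remote naming above:
# 'git clone --origin unidata <url>' is equivalent to cloning and
# then renaming 'origin', so the remote list shows 'unidata'.
tmp=$(mktemp -d)
git init -q "$tmp/thredds-demo"
cd "$tmp/thredds-demo"
git remote add unidata https://github.com/Unidata/thredds.git
git remote -v   # lists each remote with its fetch and push URLs
```

If the output shows 'origin' rather than 'unidata', the '--origin' flag was omitted during the clone.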
<h3><a name="gh-pull-personal-thredds">Add and pull down a local copy of your THREDDS project fork</a></h3>
Next, move into the source directory git has created, and add your personal fork of the THREDDS code as a remote:
<pre>git remote add me git@github.com:<my-github-user-name>/thredds.git (for ssh)</pre>
or
<pre>git remote add me https://github.com/<my-github-user-name>/thredds.git (for http)</pre>
Now you are all set!
<h2><a name="gh-contrib-workflow">Contribution workflow</a></h2>
<h3><a name="gh-sync-ud">Make sure you have the latest changes from Unidata THREDDS repository</a></h3>
First, make sure you have the most recent changes to the THREDDS code by using git pull:
<pre>git pull unidata master</pre>
<h3><a name="gh-branch">Make a new branch for your work and start hacking</a></h3>
Next, make a new branch where you will actually do the hacking:
<pre>git checkout -b mywork</pre>
At this point, the branch 'mywork' is local.
To make this branch part of your personal GitHub Remote repository, use the following command:
<pre>git push -u me mywork</pre>
Now git (on your local machine) is set up so that you can start hacking on the code and commit changes to your personal GitHub repository.
At any point, you may add commits to your local copy of the repository by using:
<pre>git commit</pre>
If you would like these changes to be stored on your personal remote repository, simply use:
<pre>git push me mywork</pre>
Once you are satisfied with your work, there is one last step to complete before submitting the pull request - clean up the history.
<h3><a name="gh-history-cleanup">Clean up your git commit history</a></h3>
Commit history can often be full of temporary commit messages, or commits with code changes that ultimately didn't make the final cut.
<p>
To clean up your history, use the <pre>git rebase -i</pre> command, which will open an editor:
<pre>
sarms@flip: [mywork] git rebase -i
pick 083508e first commit of my really cool feature or bug fix!
pick 9bcba01 Oops missed this one thing. This commit fixes that.
# Rebase 083508e..9bcba01 onto 083508e (2 command(s))
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
# d, drop = remove commit
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out
</pre>
Based on my commit messages, you can see that commit '9bcba01' fixed a mistake from my first commit.
It would be nice to 'squash' those changes into the first commit, so that the official history does not show my mistake..uhhh...this extra commit.
To do so, edit the text to change the second commit's 'pick' to 'squash':
<pre>
pick 083508e first commit of my really cool feature or bug fix!
squash 9bcba01 Oops missed this one thing. This commit fixes that.
# Rebase 083508e..9bcba01 onto 083508e (2 command(s))
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
# d, drop = remove commit
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out
</pre>
Once you have marked the commits to be squashed and exited the editor, you will be prompted to change the commit message for the new, squashed, mega commit:
<pre>
# This is a combination of 2 commits.
# The first commit's message is:
first commit of my really cool feature or bug fix!
# This is the 2nd commit message:
Oops missed this one thing. This commit fixes that.
# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date: Thu Oct 15 09:59:23 2015 -0600
#
# interactive rebase in progress; onto 083508e
# Last commands done (2 commands done):
# pick 09134d5 first commit of my really cool feature or bug fix!
# squash 9bcba01 Oops missed this one thing. This commit fixes that.
# No commands remaining.
# You are currently editing a commit while rebasing branch 'mywork' on '083508e'.
#
# Changes to be committed:
...
</pre>
Edit the two commit messages into a single message that describes the overall change.
Once you have done this and exited the editor, you will have a chance to change the commit message for the new, squashed, mega commit:
<pre>
Really cool feature or bug fix. Addresses the github issue Unidata/thredds#1
# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date: Thu Oct 15 09:59:23 2015 -0600
#
# interactive rebase in progress; onto 083508e
# Last commands done (2 commands done):
# pick 09134d5 first commit of my really cool feature or bug fix!
# squash 9bcba01 Oops missed this one thing. This commit fixes that.
# No commands remaining.
# You are currently editing a commit while rebasing branch 'mywork' on '083508e'.
#
# Changes to be committed:
...
</pre>
Now, when you look at your git commit logs, you will see:
<pre>
commit 805b4723c4a2cbbed240354332cd7af57559a1b9
Author: Sean Arms <sarms@ucar.edu>
Date: Thu Oct 15 09:59:23 2015 -0600
Really cool feature or bug fix. Addresses the github issue Unidata/thredds#1
</pre>
Note that the commit contains the text 'Unidata/thredds#1'.
This is a cool github trick that will allow you to reference GitHub issues within your commit messages.
When viewed on github.com, this will be turned into a hyperlink to the issue.
While not every contribution will address an issue, please use this feature if your contribution does!
<h3><a name="gh-final-commit-for-pr">Push changes to your fork to use for the pull request</a></h3>
Now that you have cleaned up the history, you will need to make a final push to your personal GitHub repository.
However, the rebase has changed the history of your local branch, which means you will need to use the '--force' flag in your push:
<pre>git push --force me mywork</pre>
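Because the rebase rewrote history, a plain force push overwrites whatever the remote branch held. If your git is new enough, '--force-with-lease' is a safer spelling: it refuses the push if someone else updated the branch since you last fetched. A self-contained sketch, with a local bare repository standing in for the GitHub fork (all names and paths here are illustrative):

```shell
# Local bare repository plays the role of the 'me' fork.
tmp=$(mktemp -d)
git init -q --bare "$tmp/fork.git"
git init -q "$tmp/work" && cd "$tmp/work"
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "first commit"
branch=$(git symbolic-ref --short HEAD)
git remote add me "$tmp/fork.git"
git push -q -u me "$branch"
# An interactive rebase rewrites history; --amend simulates that here.
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --amend --allow-empty -m "squashed commit"
# Succeeds because the remote has not moved since our last push.
git push -q --force-with-lease me "$branch"
```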
<h3><a name="gh-pr">Make the pull request</a></h3>
Finally, go to your personal remote repository on github.com and switch to your 'mywork' branch.
Once you are on your work branch, you will see a button that says "Pull request", which will allow you to make a pull request.
The GitHub pull request page will allow you to select which repository and branch you would like to submit the pull request to (the 'base fork', which should be 'Unidata/thredds', and 'base', which should be 'master'), as well as the 'head fork' and 'compare' (which should be '<my-github-user-name>/thredds' and 'mywork', respectively).
Once this is set up, you can make the pull request.
<h2><a name="gh-now-what">Now what?</a></h2>
The Unidata THREDDS project is set up so that automated testing for all pull requests is done via TravisCI.
The status of the tests can be seen on the pull request page.
For example, see <a href="https://github.com/Unidata/thredds/pull/231">Unidata/thredds#231</a> by selecting the 'View Details' button.
This pull request was tested on <a href="https://travis-ci.org/Unidata/thredds/builds/84433104">TravisCI</a> and passed on all versions of Java supported by the current master branch.
We have set up the THREDDS repository so that changes that do not pass these tests cannot be merged.
One of the Unidata THREDDS team members will work with you to make sure your work is ready for merging once the tests have passed on TravisCI.
Note that if changes to your pull request are required, you can simply push those changes to your personal GitHub repository; the pull request will automatically be updated and new TravisCI tests will be initiated.
If your pull request addresses a bug, we kindly ask that you include a test in your pull request.
If you do not know how to write tests in Java, we will be more than happy to work with you!
</body>
</html>

cf

@ -2,9 +2,9 @@
#NB=1
DB=1
#X=-x
FAST=1
#FAST=1
HDF5=1
#HDF5=1
DAP=1
#PNETCDF=1
#PAR4=1
@ -116,7 +116,7 @@ FLAGS="$FLAGS --disable-examples"
#FLAGS="$FLAGS --disable-dap-remote-tests"
FLAGS="$FLAGS --enable-dap-auth-tests"
#FLAGS="$FLAGS --enable-doxygen"
#FLAGS="$FLAGS --enable-logging"
FLAGS="$FLAGS --enable-logging"
#FLAGS="$FLAGS --disable-diskless"
#FLAGS="$FLAGS --enable-mmap"
#FLAGS="$FLAGS --with-udunits"


@ -1,25 +1,62 @@
# Is visual studio being used?
#VS=yes
CYGWIN=yes
if test "x$VS" = x ; then
#CC=mpicc
CC=gcc
fi
export CC
if test "x$VS" != x -a "x$CYGWIN" != x ; then
ZLIB=cygz.dll; H5LIB=cyghdf5.dll; H5LIB_HL=cyghdf5_hl.dll; CURLLIB=cygcurl.dll
elif test "x$VS" = x -a "x$CYGWIN" != x ; then
ZLIB=libz.dll.a; H5LIB=libhdf5.dll.a; H5LIB_HL=libhdf5_hl.dll.a; CURLLIB=libcurl.dll.a
elif test "x$VS" = x -a "x$CYGWIN" == x ; then
ZLIB=libz.so; H5LIB=libhdf5.so; H5LIB_HL=libhdf5_hl.so; CURLLIB=libcurl.so
else
echo "cannot determine library names"
exit 1
fi
for p in /usr/bin /usr/local/bin /usr/local/lib /usr/lib ; do
if test -f $p/$ZLIB ; then ZP=$p; fi
if test -f $p/$H5LIB ; then HP=$p; fi
if test -f $p/$CURLLIB ; then CP=$p; fi
done
if test "x$ZP" = x ; then echo "Cannot find z lib" ; exit 1; fi
if test "x$HP" = x ; then echo "Cannot find hdf5 lib" ; exit 1; fi
if test "x$CP" = x ; then echo "Cannot find curl lib" ; exit 1; fi
if test "x$CYGWIN" != x -a "x$VS" != x; then
ZP=`cygpath -w "$ZP"`
HP=`cygpath -w "$HP"`
CP=`cygpath -w "$CP"`
fi
#if test "x$VS" != x ; then USR=c:/cygwin/usr; else USR=/usr; fi
ZLIB="-DZLIB_LIBRARY=${ZP}/$ZLIB -DZLIB_INCLUDE_DIR=${ZP}/include -DZLIB_INCLUDE_DIRS=${ZP}/include"
HDF5="-DHDF5_LIB=${HP}/$H5LIB -DHDF5_HL_LIB=${HP}/$H5LIB_HL -DHDF5_INCLUDE_DIR=${HP}/include"
CURL="-DCURL_LIBRARY=${CP}/$CURLLIB -DCURL_INCLUDE_DIR=${CP}/include -DCURL_INCLUDE_DIRS=${CP}/include"
FLAGS="$FLAGS -DCMAKE_C_FLAGS='-Wall -Wno-unused-but-set-variable -Wno-unused-variable -Wno-unused-parameter'"
#FLAGS="$FLAGS -DENABLE_DAP=false"
#FLAGS="$FLAGS -DENABLE_NETCDF_4=false"
FLAGS="$FLAGS -DCMAKE_INSTALL_PREFIX=$USR/local"
#FLAGS="-DCMAKE_PREFIX_PATH=$PPATH"
#FLAGS="$FLAGS -DCMAKE_PREFIX_PATH=$PPATH"
FLAGS="$FLAGS -DENABLE_DAP_REMOTE_TESTS=true"
#FLAGS="$FLAGS -DENABLE_DAP_AUTH_TESTS=true"
rm -fr build
mkdir build
cd build
export CC=mpicc
for p in /usr/local/lib /usr/lib ; do
if test -z "$ZP" -a -f $p/libz.so ; then ZP=$p; fi
if test -z "$HP" -a -f $p/libhdf5.so ; then HP=$p; fi
if test -z "$CP" -a -f $p/libcurl.so ; then CP=$p; fi
done
ZLIB="-DZLIB_LIBRARY=${ZP}/libz.so -DZLIB_INCLUDE_DIR=${ZP}/include -DZLIB_INCLUDE_DIRS=${ZP}/include"
HDF5="-DHDF5_LIB=${HP}/libhdf5.so -DHDF5_HL_LIB=${HP}/libhdf5_hl.so -DHDF5_INCLUDE_DIR=${HP}/include"
CURL="-DCURL_LIBRARY=${CP}/libcurl.so -DCURL_INCLUDE_DIR=${CP}/include -DCURL_INCLUDE_DIRS=${CP}/include"
FLAGS="$FLAGS -DCMAKE_INSTALL_PREFIX=/usr/local"
#FLAGS="-DCMAKE_PREFIX_PATH=$PPATH"
#FLAGS="$FLAGS -DCMAKE_PREFIX_PATH=$PPATH"
FLAGS="$FLAGS -DENABLE_DAP_REMOTE_TESTS=true"
FLAGS="$FLAGS -DENABLE_DAP_AUTH_TESTS=true"
cmake $FLAGS ${ZLIB} ${HDF5} ${CURL} ..
cmake --build .
cmake --build . --target test
CTEST_OUTPUT_ON_FAILURE=1 cmake --build . --target test


@ -802,7 +802,7 @@ AC_STRUCT_ST_BLKSIZE
UD_CHECK_IEEE
AC_TYPE_SIZE_T
AC_TYPE_OFF_T
AC_CHECK_TYPES([ssize_t, ptrdiff_t, uchar, longlong, ushort, uint, int64, uint64])
AC_CHECK_TYPES([size_t, ssize_t, ptrdiff_t, uchar, longlong, ushort, uint, int64, uint64])
AC_C_CHAR_UNSIGNED
AC_C_BIGENDIAN


@ -339,12 +339,13 @@ NC3_def_dim(int ncid, const char *name, size_t size, int *dimidp)
if(status != NC_NOERR)
return status;
if ((ncp->flags & NC_64BIT_OFFSET) && sizeof(off_t) > 4) {
/* CDF2 format and LFS */
if(size > X_UINT_MAX - 3) /* "- 3" handles rounded-up size */
if(ncp->flags & NC_64BIT_DATA) {/*CDF-5*/
if((sizeof(size_t) > 4) && (size > X_UINT64_MAX - 3)) /* "- 3" handles rounded-up size */
return NC_EDIMSIZE;
} else {
/* CDF1 format */
} else if(ncp->flags & NC_64BIT_OFFSET) {/* CDF2 format and LFS */
if((sizeof(size_t) > 4) && (size > X_UINT_MAX - 3)) /* "- 3" handles rounded-up size */
return NC_EDIMSIZE;
} else {/*CDF-1*/
if(size > X_INT_MAX - 3)
return NC_EDIMSIZE;
}


@ -54,8 +54,10 @@ ENDIF()
IF(ENABLE_TESTS)
ADD_EXECUTABLE(rewrite-scalar rewrite-scalar.c)
ADD_EXECUTABLE(bom bom.c)
ADD_EXECUTABLE(tst_dimsizes tst_dimsizes.c)
TARGET_LINK_LIBRARIES(rewrite-scalar netcdf)
TARGET_LINK_LIBRARIES(bom netcdf)
TARGET_LINK_LIBRARIES(tst_dimsizes netcdf)
IF(MSVC)
SET_TARGET_PROPERTIES(rewrite-scalar PROPERTIES RUNTIME_OUTPUT_DIRECTORY
@ -71,6 +73,13 @@ IF(MSVC)
${CMAKE_CURRENT_BINARY_DIR})
SET_TARGET_PROPERTIES(bom PROPERTIES RUNTIME_OUTPUT_DIRECTORY_RELEASE
${CMAKE_CURRENT_BINARY_DIR})
SET_TARGET_PROPERTIES(tst_dimsizes PROPERTIES RUNTIME_OUTPUT_DIRECTORY
${CMAKE_CURRENT_BINARY_DIR})
SET_TARGET_PROPERTIES(tst_dimsizes PROPERTIES RUNTIME_OUTPUT_DIRECTORY_DEBUG
${CMAKE_CURRENT_BINARY_DIR})
SET_TARGET_PROPERTIES(tst_dimsizes PROPERTIES RUNTIME_OUTPUT_DIRECTORY_RELEASE
${CMAKE_CURRENT_BINARY_DIR})
ENDIF()
# Base tests
@ -96,10 +105,9 @@ ENDIF()
add_sh_test(ncdump tst_nccopy3)
add_sh_test(ncdump tst_charfill)
add_sh_test(ncdump tst_formatx3)
add_sh_test(ncdump tst_bom)
add_sh_test(ncdump tst_dimsizes)
# The following test script invokes
# gcc directly.

ncdump/Make0 Normal file

@ -0,0 +1,24 @@
# Test c output
T=tst_dimsizes
#CMD=valgrind --leak-check=full
#CMD=gdb --args
#MPI=1
LLP="LD_LIBRARY_PATH=/usr/local/lib"
ifndef MPI
CC=gcc
CFLAGS=-g -O0 -I.. -I../include
LDFLAGS=../liblib/.libs/libnetcdf.a -L/usr/local/lib -lhdf5_hl -lhdf5 -lz -lm -lcurl
else
CC=/usr/local/bin/mpicc
LDFLAGS=../liblib/.libs/libnetcdf.a -L/usr/local/lib -lhdf5_hl -lhdf5 -lz ../liblib/.libs/libnetcdf.a -ldl -lcurl -lpnetcdf -lmpich -lm
endif
all::
export ${LLP}; export CFLAGS; export LDFLAGS; \
${CC} -o $T.exe ${CFLAGS} ${T}.c ${LDFLAGS}; \
${CMD} ./$T.exe
cpp::
${CC} -E ${CFLAGS} ${T}.c > ${T}.txt


@ -28,10 +28,11 @@ man_MANS = ncdump.1 nccopy.1
if BUILD_TESTSETS
#if !BUILD_DLL
# These tests are run for both netCDF-4 and non-netCDF-4 builds.
check_PROGRAMS = rewrite-scalar ctest ctest64 ncdump tst_utf8 bom
check_PROGRAMS = rewrite-scalar ctest ctest64 ncdump tst_utf8 bom tst_dimsizes
TESTS = tst_inttags.sh run_tests.sh tst_64bit.sh ctest ctest64 tst_output.sh \
tst_lengths.sh tst_calendars.sh tst_utf8 run_utf8_tests.sh \
tst_nccopy3.sh tst_charfill.sh tst_iter.sh tst_formatx3.sh tst_bom.sh
tst_nccopy3.sh tst_charfill.sh tst_iter.sh tst_formatx3.sh tst_bom.sh \
tst_dimsizes.sh
if LARGE_FILE_TESTS
TESTS += tst_iter.sh
@ -112,7 +113,8 @@ iter.* \
tst_nc_test_netcdf4_4_0.cdl tst_mud4.nc tst_mud4.cdl tst_mud4-bc.cdl \
tst_ncf213.cdl tst_ncf213.nc tst_h_scalar.cdl tst_h_scalar.nc \
tst_mud4_chars.cdl tst_mud4_chars.nc \
inttags.nc inttags4.nc tst_inttags.cdl tst_inttags4.cdl
inttags.nc inttags4.nc tst_inttags.cdl tst_inttags4.cdl \
tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc
# These files all have to be included with the distribution.
EXTRA_DIST = run_tests.sh tst_64bit.sh tst_output.sh test0.cdl \
@ -139,7 +141,8 @@ ref_tst_ncf213.cdl tst_h_scalar.sh \
run_utf8_nc4_tests.sh \
tst_formatx3.sh tst_formatx4.sh ref_tst_utf8_4.cdl \
tst_inttags.sh tst_inttags4.sh \
CMakeLists.txt XGetopt.c tst_bom.sh tst_inmemory_nc3.sh tst_inmemory_nc4.sh
CMakeLists.txt XGetopt.c tst_bom.sh tst_inmemory_nc3.sh \
tst_dimsizes.sh tst_inmemory_nc4.sh
# CDL files and Expected results
SUBDIRS=cdl expected


@ -109,12 +109,16 @@ chunkspec_parse(int ncid, const char *spec) {
if(ret != NC_NOERR)
return(ret);
chunksize = dimlen;
} else { /* convert nnn string to long integer */
} else { /* convert nnn string to long long integer */
char *ep;
long val = strtol(pp, &ep, 0);
#ifdef HAVE_STRTOLL
long long val = strtoll(pp, &ep, 0);
#else
long long val = strtol(pp, &ep, 0);
#endif
if(ep == pp || errno == ERANGE || val < 1) /* allow chunksize bigger than dimlen */
return (NC_EINVAL);
chunksize = val;
chunksize = (size_t)val;
}
chunkspecs.chunksizes[idim] = chunksize;
idim++;


@ -1561,7 +1561,7 @@ do_ncdump_rec(int ncid, const char *path)
printf ("UNLIMITED ; // (%u currently)\n",
(unsigned int)dims[dimid].size);
} else {
printf ("%u ;\n", (unsigned int)dims[dimid].size);
printf ("%llu ;\n", (unsigned long long)dims[dimid].size);
}
}
#endif /* USE_NETCDF4 */


@ -70,7 +70,8 @@ typedef struct { /* specification for how to format dump */
int nc_kind; /* kind of netCDF file named on
* command line, 1 (classic), 2
* (64-bit offset), 3 (netCDF-4), 4
* (netCDF-4 classic model) */
* (netCDF-4 classic model), 5 (64-bit data)
*/
int nc_extended; /* extended format info fornetCDF file named
* on command line.
*/

ncdump/tst_dimsizes.c Normal file

@ -0,0 +1,80 @@
#include <nc_tests.h>
#include <stdio.h>
#include <stdlib.h>
#include <netcdf.h>
#define FILECLASSIC "tst_dimsize_classic.nc"
#define FILE64OFFSET "tst_dimsize_64offset.nc"
#define FILE64DATA "tst_dimsize_64data.nc"
#define DIMMAXCLASSIC (NC_MAX_INT - 3)
#define DIMMAX64OFFSET (NC_MAX_UINT - 3)
#define DIMMAX64DATA (NC_MAX_UINT64 - 3)
/*
Test that at least the meta-data works
for dimension sizes X modes.
NC_CLASSIC => NC_INT_MAX - 3
NC_64BIT_OFFSET => NC_UINT_MAX - 3
NC_64BIT_DATA => NC_UINT64_MAX - 3
Note that this will not test the last case when
|size_t| == 4.
Also, leave the files around so we can test with ncdump.
*/
int
main(int argc, char **argv)
{
int ncid;
size_t dimsize;
int dimid;
int stat = NC_NOERR;
printf("\n*** Testing Max Dimension Sizes\n");
printf("\n|size_t|=%lu\n",(unsigned long)sizeof(size_t));
printf("\n*** Writing Max Dimension Size For NC_CLASSIC\n");
if ((stat=nc_create(FILECLASSIC, NC_CLOBBER, &ncid))) ERRSTAT(stat);
dimsize = DIMMAXCLASSIC;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_CLASSIC\n");
if ((stat=nc_open(FILECLASSIC, NC_NOCLOBBER, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAXCLASSIC) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Writing Max Dimension Size For NC_64BIT_OFFSET\n");
if ((stat=nc_create(FILE64OFFSET, NC_CLOBBER | NC_64BIT_OFFSET, &ncid))) ERRSTAT(stat);
dimsize = DIMMAX64OFFSET;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_64BIT_OFFSET\n");
if ((stat=nc_open(FILE64OFFSET, NC_NOCLOBBER|NC_64BIT_OFFSET, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAX64OFFSET) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
if(sizeof(size_t) == 8) {
printf("\n*** Writing Max Dimension Size For NC_64BIT_DATA\n");
if ((stat=nc_create(FILE64DATA, NC_CLOBBER | NC_64BIT_DATA, &ncid))) ERRSTAT(stat);
dimsize = (size_t)DIMMAX64DATA;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_64BIT_DATA\n");
if ((stat=nc_open(FILE64DATA, NC_NOCLOBBER|NC_64BIT_DATA, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAX64DATA) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
}
SUMMARIZE_ERR;
FINAL_RESULTS;
}

ncdump/tst_dimsizes.sh Executable file

@ -0,0 +1,57 @@
#!/bin/sh
echo "*** Test Maximum dimension sizes X mode"
set -x
if test "x$SETX" = x1 ; then echo "file=$0"; set -x ; fi
# This shell script tests max dimension sizes X mode
RETURN=0
if test "x$srcdir" = "x"; then
srcdir=`dirname $0`;
fi
# add hack for sunos
export srcdir;
echo ""
rm -f tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc
echo "*** Generate: tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc"
./tst_dimsizes
echo "*** Verify that ncdump can read dimsizes"
rm -fr ./tmp
if ../ncdump/ncdump -h tst_dimsize_classic.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_classic.nc"
else
echo "*** FAIL: ncdump tst_dimsize_classic.nc"
RETURN=1
fi
rm -fr ./tmp
if ../ncdump/ncdump -h tst_dimsize_64offset.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_64offset.nc"
else
echo "*** FAIL: ncdump tst_dimsize_64offset.nc"
RETURN=1
fi
if test -f tst_dimsize_64data.nc ; then
rm -fr ./tmp
if ../ncdump/ncdump -h tst_dimsize_64data.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_64data.nc"
else
echo "*** FAIL: ncdump tst_dimsize_64data.nc"
RETURN=1
fi
fi
# Cleanup
rm -f tmp tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc
exit $RETURN


@ -47,6 +47,11 @@ gen_netcdf(const char *filename)
ngrps = listlength(grpdefs);
#endif /*USE_NETCDF4*/
/* Turn on logging */
#ifdef LOGGING
nc_set_log_level(ncloglevel);
#endif
/* create netCDF file, uses NC_CLOBBER mode */
cmode_modifier |= NC_CLOBBER;
#ifdef USE_NETCDF4
@ -255,7 +260,8 @@ Generate type definitions
static void
genbin_deftype(Symbol* tsym)
{
int i,stat;
unsigned long i;
int stat;
ASSERT(tsym->objectclass == NC_TYPE);
switch (tsym->subclass) {
@ -321,7 +327,7 @@ genbin_deftype(Symbol* tsym)
efield->typ.basetype->ncid);
} else {
int j;
int dimsizes[NC_MAX_VAR_DIMS];
int dimsizes[NC_MAX_VAR_DIMS]; /* int because inside compound */
/* Generate the field dimension constants*/
for(j=0;j<efield->typ.dimset.ndims;j++) {
unsigned int size = efield->typ.dimset.dimsyms[j]->dim.declsize;


@ -159,6 +159,7 @@ extern int cdf5_flag; /* 1 => cdf-5 unsigned types in the parse */
extern int specials_flag; /* 1 => special attributes are present */
extern int usingclassic; /* 1 => k_flag == 1|2|5 */
extern int k_flag;
extern int ncloglevel;
/* Global data */


@ -50,7 +50,7 @@ getfiller(Symbol* tvsym)
static void
fill(Symbol* tsym, Datalist* filler)
{
int i;
unsigned long i;
NCConstant con = nullconstant;
Datalist* sublist;
@ -165,7 +165,7 @@ nc_getfill(NCConstant* value)
case NC_UINT64: value->value.uint64v = NC_FILL_UINT64; break;
case NC_STRING:
value->value.stringv.stringv = nulldup(NC_FILL_STRING);
value->value.stringv.len = strlen(NC_FILL_STRING);
value->value.stringv.len = (int)strlen(NC_FILL_STRING);
/* Exception: if string is null, then make it's length be 1 */
if(value->value.stringv.len == 0)
value->value.stringv.len = 1;


@ -45,8 +45,8 @@ int cdf5_flag; /* 1 => cdf5 | maybe netcdf-4 */
int specials_flag; /* 1=> special attributes are present */
int usingclassic;
int cmode_modifier;
int diskless;
int ncloglevel;
char* binary_ext = ".nc";
@ -236,15 +236,19 @@ main(
enhanced_flag = 0;
cdf5_flag = 0;
specials_flag = 0;
diskless = 0;
#ifdef LOGGING
ncloglevel = NC_TURN_OFF_LOGGING;
#else
ncloglevel = -1;
#endif
#if _CRAYMPP && 0
/* initialize CRAY MPP parallel-I/O library */
(void) par_io_init(32, 32);
#endif
while ((c = getopt(argc, argv, "134567bB:cdD:fhHk:l:M:no:Pv:x")) != EOF)
while ((c = getopt(argc, argv, "134567bB:cdD:fhHk:l:M:no:Pv:xL:")) != EOF)
switch(c) {
case 'd':
debug = 1;
@ -304,6 +308,9 @@ main(
return(1);
}
}; break;
case 'L':
ncloglevel = atoi(optarg);
break;
case 'n': /* old version of -b, uses ".cdf" extension */
if(l_flag != 0) {
fprintf(stderr,"Please specify only one language\n");
@ -465,7 +472,7 @@ main(
return 1;
case '\xEF':
/* skip the BOM */
fread(bom,1,1,fp);
(void)fread(bom,1,1,fp);
break;
default: /* legal printable char, presumably; rewind */
rewind(fp);


@ -159,8 +159,8 @@ NCConstant constant;
INT64_CONST /* long long constant */
UBYTE_CONST /* unsigned byte constant */
USHORT_CONST /* unsigned short constant */
UINT_CONST /* unsigned long long constant */
UINT64_CONST /* unsigned int constant */
UINT_CONST /* unsigned int constant */
UINT64_CONST /* unsigned long long constant */
FLOAT_CONST /* float constant */
DOUBLE_CONST/* double constant */
DIMENSIONS /* keyword starting dimensions section, if any */
@ -438,35 +438,11 @@ dimdeclist: dimdecl
;
dimdecl:
dimd '=' UINT_CONST
dimd '=' UINT64_CONST
{
$1->dim.declsize = (size_t)uint32_val;
$1->dim.declsize = (size_t)uint64_val;
#ifdef GENDEBUG1
fprintf(stderr,"dimension: %s = %lu\n",$1->name,(unsigned long)$1->dim.declsize);
#endif
}
| dimd '=' INT_CONST
{
if(int32_val <= 0) {
derror("dimension size must be positive");
YYABORT;
}
$1->dim.declsize = (size_t)int32_val;
#ifdef GENDEBUG1
fprintf(stderr,"dimension: %s = %lu\n",$1->name,(unsigned long)$1->dim.declsize);
#endif
}
| dimd '=' DOUBLE_CONST
{ /* for rare case where 2^31 < dimsize < 2^32 */
if (double_val <= 0)
yyerror("dimension length must be positive");
if (double_val > MAXFLOATDIM)
yyerror("dimension too large");
if (double_val - (size_t) double_val > 0)
yyerror("dimension length must be an integer");
$1->dim.declsize = (size_t)double_val;
#ifdef GENDEBUG1
fprintf(stderr,"dimension: %s = %lu\n",$1->name,(unsigned long)$1->dim.declsize);
fprintf(stderr,"dimension: %s = %llu\n",$1->name,(unsigned long long)$1->dim.declsize);
#endif
}
| dimd '=' NC_UNLIMITED_K


@ -10,7 +10,7 @@
typedef struct Alignment {
char* typename;
int alignment;
unsigned int alignment;
} Alignment;
/* Define indices for every primitive C type */


@ -174,7 +174,7 @@ Return NULL if symbol is not unique or not found at all.
static Symbol*
uniquetreelocate(Symbol* refsym, Symbol* root)
{
int i;
unsigned long i;
Symbol* sym = NULL;
/* search the root for matching name and major type*/
sym = lookupingroup(refsym->objectclass,refsym->name,root);
@ -200,7 +200,7 @@ Compute the fqn for every top-level definition symbol
static void
computefqns(void)
{
int i,j;
unsigned long i,j;
/* Groups first */
for(i=0;i<listlength(grpdefs);i++) {
Symbol* sym = (Symbol*)listget(grpdefs,i);
@ -259,7 +259,8 @@ computefqns(void)
static void
processtypes(void)
{
int i,j,keep,added;
unsigned long i,j;
int keep,added;
List* sorted = listnew(); /* hold re-ordered type set*/
/* Prime the walk by capturing the set*/
/* of types that are dependent on primitive types*/
@ -352,7 +353,7 @@ static int
tagvlentypes(Symbol* tsym)
{
int tagged = 0;
int j;
unsigned long j;
switch (tsym->subclass) {
case NC_VLEN:
tagged = 1;
@ -385,7 +386,7 @@ filltypecodes(void)
static void
processenums(void)
{
int i,j;
unsigned long i,j;
List* enumids = listnew();
for(i=0;i<listlength(typdefs);i++) {
Symbol* sym = (Symbol*)listget(typdefs,i);
@ -419,7 +420,7 @@ processenums(void)
static void
processeconstrefs(void)
{
int i;
unsigned long i;
/* locate all the datalist and walk them recursively */
for(i=0;i<listlength(attdefs);i++) {
Symbol* att = (Symbol*)listget(attdefs,i);