Fix issues with Github pull request 187

(https://github.com/Unidata/netcdf-c/pull/187)
Primary problem was cmake build errors.
Dennis Heimbigner 2016-01-08 12:55:11 -07:00
parent 5e4cbd2fec
commit 39e9cd0ffe
15 changed files with 374 additions and 70 deletions


@@ -20,4 +20,4 @@ before_install:
script:
- docker run --rm -it -h "$CURHOST" -e USEDASH=FALSE -e RUNF=OFF -e RUNCXX=OFF -e RUNP=OFF -e RUNNCO=OFF -e USECMAKE=$USECMAKE -e USEAC=$USEAC -e COPTS=$COPTS -v $(pwd):/netcdf-c $DOCKIMG
- docker run --rm -it -h "$CURHOST" -e USEDASH=FALSE -e RUNF=OFF -e RUNCXX=OFF -e RUNP=OFF -e RUNNCO=OFF -e USECMAKE=$USECMAKE -e USEAC=$USEAC -e COPTS=$COPTS -e CTEST_OUTPUT_ON_FAILURE=1 -v $(pwd):/netcdf-c $DOCKIMG

CONTRIBUTING.html Normal file

@@ -0,0 +1,248 @@
<html>
<body>
<img src="http://www.unidata.ucar.edu/images/logos/thredds_tds-150x150.png"/>
<h1>Welcome contributors!</h1>
First off, thank you for your interest in contributing to the THREDDS project!
This repository contains the code for both netCDF-Java and the THREDDS Data Server (TDS) projects.
The other projects held under the THREDDS umbrella are <a href="https://github.com/unidata/rosetta">Rosetta</a> and the latest addition, <a href="https://github.com/unidata/siphon">Siphon</a> (a Python interface to the TDS).
<h2>Process Overview</h2>
<ul>
<li> <a href="#gh-setup">GitHub Setup</a>
<ul>
<li> <a href="#gh-join">Join Github!</a>
<li> <a href="#gh-fork">Fork the Unidata THREDDS project</a>
<li> <a href="#gh-pull-ud-thredds">Pull down local copy of the Unidata THREDDS project</a>
<li> <a href="#gh-pull-personal-thredds">Add and pull down a local copy of your THREDDS project fork</a>
</ul>
<li> <a href="#gh-contrib-workflow">Contribution workflow</a>
<ul>
<li> <a href="#gh-sync-ud">Make sure you have the latest changes from Unidata THREDDS repository</a>
<li> <a href="#gh-branch">Make a new branch for your work and start hacking</a>
<li> <a href="#gh-history-cleanup">Clean up your git commit history</a>
<li> <a href="#gh-final-commit-for-pr">Push changes to your fork to use for the pull request</a>
<li> <a href="#gh-pr">Make the pull request</a>
</ul>
<li> <a href="#gh-now-what">Now what?</a>
</ul>
</ul>
<h2><a name="gh-setup">GitHub Setup</a></h2>
<h3><a name="gh-join">Join Github!</a></h3>
To get started contributing to the THREDDS project, the first thing you should do is <a href="https://github.com/join">sign up for a free account on GitHub</a>.
<h3><a name="gh-fork">Fork the Unidata THREDDS project</a></h3>
Once you have an account, go ahead and <a href="https://github.com/unidata/thredds#fork-destination-box">fork</a> the THREDDS project.
By forking the project, you will have a complete copy of the THREDDS project, history and all, under your personal account.
This will allow you to make pull requests against the Unidata THREDDS repository, which is the primary mechanism used to add new code to the project (even Unidata developers do this!).
<h3><a name="gh-pull-ud-thredds">Pull down local copy of the Unidata THREDDS project</a></h3>
You can pull down the Unidata repository's source code to your local machine by cloning it with git:
<pre>git clone --origin unidata git@github.com:Unidata/thredds.git (for ssh)</pre>
or
<pre>git clone --origin unidata https://github.com/Unidata/thredds.git (for http)</pre>
Note that these commands reference the Unidata repository.
<p>
Normally in git, the remote repository you clone from is automatically named 'origin'.
To avoid confusion when making pull requests, the commands above name the remote repository 'unidata' instead.
<h3><a name="gh-pull-personal-thredds">Add and pull down a local copy of your THREDDS project fork</a></h3>
Next, move into the source directory git has created, and add your personal fork of the THREDDS code as a remote:
<pre>git remote add me git@github.com:<my-github-user-name>/thredds.git (for ssh)</pre>
or
<pre>git remote add me https://github.com/<my-github-user-name>/thredds.git (for http)</pre>
Then fetch your fork:
<pre>git fetch me</pre>
Now you are all set!
<h2><a name="gh-contrib-workflow">Contribution workflow</a></h2>
<h3><a name="gh-sync-ud">Make sure you have the latest changes from Unidata THREDDS repository</a></h3>
First, make sure you have the most recent changes to the THREDDS code by using git pull:
<pre>git pull unidata master</pre>
<h3><a name="gh-branch">Make a new branch for your work and start hacking</a></h3>
Next, make a new branch where you will actually do the hacking:
<pre>git checkout -b mywork</pre>
As of this point, the branch 'mywork' is local.
To make this branch part of your personal GitHub Remote repository, use the following command:
<pre>git push -u me mywork</pre>
Now git (on your local machine) is set up so that you can start hacking on the code and push changes to your personal GitHub repository.
At any point, you may add commits to your local copy of the repository by using:
<pre>git commit</pre>
If you would like these changes to be stored on your personal remote repository, simply use:
<pre>git push me mywork</pre>
Once you are satisfied with your work, there is one last step to complete before submitting the pull request: cleaning up the history.
<h3><a name="gh-history-cleanup">Clean up your git commit history</a></h3>
Commit history can often be full of temporary commit messages, or commits whose code changes ultimately didn't make the final cut.
<p>
To clean up your history, use the <code>git rebase -i</code> command, which will open an editor:
<pre>
sarms@flip: [mywork] git rebase -i
pick 083508e first commit of my really cool feature or bug fix!
pick 9bcba01 Oops missed this one thing. This commit fixes that.
# Rebase 083508e..9bcba01 onto 083508e (2 command(s))
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
# d, drop = remove commit
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out
</pre>
Based on my commit messages, you can see that the second commit, '9bcba01', fixed a mistake from my first commit.
It would be nice to 'squash' those changes into the first commit, so that the official history does not show my mistake...uhhh...this extra commit.
To do so, edit the text to change the second commit's 'pick' to 'squash':
<pre>
pick 083508e first commit of my really cool feature or bug fix!
squash 9bcba01 Oops missed this one thing. This commit fixes that.
# Rebase 083508e..9bcba01 onto 083508e (2 command(s))
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
# d, drop = remove commit
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out
</pre>
Once you have marked the commits to be squashed and exited the editor, you will be prompted to change the commit message for the new, squashed, mega commit:
<pre>
# This is a combination of 2 commits.
# The first commit's message is:
first commit of my really cool feature or bug fix!
# This is the 2nd commit message:
Oops missed this one thing. This commit fixes that.
# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date: Thu Oct 15 09:59:23 2015 -0600
#
# interactive rebase in progress; onto 083508e
# Last commands done (2 commands done):
# pick 09134d5 first commit of my really cool feature or bug fix!
# squash 9bcba01 Oops missed this one thing. This commit fixes that.
# No commands remaining.
# You are currently editing a commit while rebasing branch 'mywork' on '083508e'.
#
# Changes to be committed:
...
</pre>
Edit the two commit messages into a single message that describes the overall change:
<pre>
Really cool feature or bug fix. Addresses the github issue Unidata/thredds#1
# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date: Thu Oct 15 09:59:23 2015 -0600
#
# interactive rebase in progress; onto 083508e
# Last commands done (2 commands done):
# pick 09134d5 first commit of my really cool feature or bug fix!
# squash 9bcba01 Oops missed this one thing. This commit fixes that.
# No commands remaining.
# You are currently editing a commit while rebasing branch 'mywork' on '083508e'.
#
# Changes to be committed:
...
</pre>
Now, when you look at your git commit logs, you will see:
<pre>
commit 805b4723c4a2cbbed240354332cd7af57559a1b9
Author: Sean Arms <sarms@ucar.edu>
Date: Thu Oct 15 09:59:23 2015 -0600
Really cool feature or bug fix. Addresses the github issue Unidata/thredds#1
</pre>
Note that the commit contains the text 'Unidata/thredds#1'.
This is a cool github trick that will allow you to reference GitHub issues within your commit messages.
When viewed on github.com, this will be turned into a hyperlink to the issue.
While not every contribution will address an issue, please use this feature if your contribution does!
<h3><a name="gh-final-commit-for-pr">Push changes to your fork to use for the pull request</a></h3>
Now that you have cleaned up the history, you will need to make a final push to your personal GitHub repository.
However, the rebase has changed the history of your local branch, which means you will need to use the '--force' flag in your push:
<pre>git push --force me mywork</pre>
<h3><a name="gh-pr">Make the pull request</a></h3>
Finally, go to your personal remote repository on github.com and switch to your 'mywork' branch.
Once you are on your work branch, you will see a button that says "Pull request", which will allow you to make a pull request.
The GitHub pull request page will allow you to select the repository and branch to which you would like to submit the pull request (the 'base fork', which should be 'Unidata/thredds', and 'base', which should be 'master'), as well as the 'head fork' and 'compare' (which should be '<github-user-name>/thredds' and 'mywork', respectively).
Once this is set up, you can make the pull request.
<h2><a name="gh-now-what">Now what?</a></h2>
The Unidata THREDDS project is set up so that automated testing of all pull requests is done via TravisCI.
The status of the tests can be seen on the pull request page.
For example, see <a href="https://github.com/Unidata/thredds/pull/231">Unidata/thredds#231</a> by selecting the 'View Details' button.
This pull request was tested on <a href="https://travis-ci.org/Unidata/thredds/builds/84433104">TravisCI</a> and passed on all versions of Java supported by the current master branch.
We have set up the THREDDS repository so that changes that do not pass these tests cannot be merged.
One of the Unidata THREDDS team members will work with you to make sure your work is ready for merging once the tests have passed on TravisCI.
If changes to your pull request are required, you can simply push those changes to your personal GitHub repository; the pull request will be updated automatically and new TravisCI tests will be initiated.
If your pull request addresses a bug, we kindly ask that you include a test in your pull request.
If you do not know how to write tests in Java, we will be more than happy to work with you!
</body>
</html>
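The rebase/squash recipe in CONTRIBUTING.html above can be exercised non-interactively, which is an easy way to try it without risking real history. A minimal sketch, assuming git and sed are available; the throwaway repository, user identity, file name, and commit messages are all illustrative:

```shell
# Demo of the pick/squash interactive rebase described above.
# Everything here (repo path, identity, messages) is hypothetical.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo
echo one > file.txt
git add file.txt
git commit -q -m "first commit of my really cool feature or bug fix!"
echo two >> file.txt
git commit -aq -m "Oops missed this one thing. This commit fixes that."
# Instead of editing the todo list by hand, rewrite the second 'pick' to 'squash';
# GIT_EDITOR=true accepts the combined commit message unchanged.
GIT_SEQUENCE_EDITOR="sed -i '2s/^pick/squash/'" GIT_EDITOR=true \
  git rebase -i --root -q
n=$(git rev-list --count HEAD)
echo "commits after squash: $n"
```

After the rebase the two commits have been melded into one, which is exactly the state you want before the final `git push --force`.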

cf

@@ -2,9 +2,9 @@
#NB=1
DB=1
#X=-x
FAST=1
#FAST=1
HDF5=1
#HDF5=1
DAP=1
#PNETCDF=1
#PAR4=1
@@ -116,7 +116,7 @@ FLAGS="$FLAGS --disable-examples"
#FLAGS="$FLAGS --disable-dap-remote-tests"
FLAGS="$FLAGS --enable-dap-auth-tests"
#FLAGS="$FLAGS --enable-doxygen"
#FLAGS="$FLAGS --enable-logging"
FLAGS="$FLAGS --enable-logging"
#FLAGS="$FLAGS --disable-diskless"
#FLAGS="$FLAGS --enable-mmap"
#FLAGS="$FLAGS --with-udunits"


@@ -1,29 +1,62 @@
rm -fr build
mkdir build
cd build
# Is visual studio being used?
#VS=yes
CYGWIN=yes
#export CC=mpicc
export CC=gcc
if test "x$VS" = x ; then
#CC=mpicc
CC=gcc
fi
for p in /usr/local/lib /usr/lib ; do
if test -f $p/libz.so ; then ZP=$p; fi
if test -f $p/libhdf5.so ; then HP=$p; fi
if test -f $p/libcurl.so ; then CP=$p; fi
export CC
if test "x$VS" != x -a "x$CYGWIN" != x ; then
ZLIB=cygz.dll; H5LIB=cyghdf5.dll; H5LIB_HL=cyghdf5_hl.dll; CURLLIB=cygcurl.dll
elif test "x$VS" = x -a "x$CYGWIN" != x ; then
ZLIB=libz.dll.a; H5LIB=libhdf5.dll.a; H5LIB_HL=libhdf5_hl.dll.a; CURLLIB=libcurl.dll.a
elif test "x$VS" = x -a "x$CYGWIN" == x ; then
ZLIB=libz.so; H5LIB=libhdf5.so; H5LIB_HL=libhdf5_hl.so; CURLLIB=libcurl.so
else
echo "cannot determine library names"
exit 1
fi
for p in /usr/bin /usr/local/bin /usr/local/lib /usr/lib ; do
if test -f $p/$ZLIB ; then ZP=$p; fi
if test -f $p/$H5LIB ; then HP=$p; fi
if test -f $p/$CURLLIB ; then CP=$p; fi
done
ZLIB="-DZLIB_LIBRARY=${ZP}/libz.so -DZLIB_INCLUDE_DIR=${ZP}/include -DZLIB_INCLUDE_DIRS=${ZP}/include"
HDF5="-DHDF5_LIB=${HP}/libhdf5.so -DHDF5_HL_LIB=${HP}/libhdf5_hl.so -DHDF5_INCLUDE_DIR=${HP}/include"
CURL="-DCURL_LIBRARY=${CP}/libcurl.so -DCURL_INCLUDE_DIR=${CP}/include -DCURL_INCLUDE_DIRS=${CP}/include"
if test "x$ZP" = x ; then echo "Cannot find z lib" ; exit 1; fi
if test "x$HP" = x ; then echo "Cannot find hdf5 lib" ; exit 1; fi
if test "x$CP" = x ; then echo "Cannot find curl lib" ; exit 1; fi
FLAGS="$FLAGS -DENABLE_DAP=false"
FLAGS="$FLAGS -DENABLE_NETCDF_4=false"
if test "x$CYGWIN" != x -a "x$VS" != x; then
ZP=`cygpath -w "$ZP"`
HP=`cygpath -w "$HP"`
CP=`cygpath -w "$CP"`
fi
FLAGS="$FLAGS -DCMAKE_INSTALL_PREFIX=/usr/local"
#if test "x$VS" != x ; then USR=c:/cygwin/usr; else USR=/usr; fi
ZLIB="-DZLIB_LIBRARY=${ZP}/$ZLIB -DZLIB_INCLUDE_DIR=${ZP}/include -DZLIB_INCLUDE_DIRS=${ZP}/include"
HDF5="-DHDF5_LIB=${HP}/$H5LIB -DHDF5_HL_LIB=${HP}/$H5LIB_HL -DHDF5_INCLUDE_DIR=${HP}/include"
CURL="-DCURL_LIBRARY=${CP}/$CURLLIB -DCURL_INCLUDE_DIR=${CP}/include -DCURL_INCLUDE_DIRS=${CP}/include"
FLAGS="$FLAGS -DCMAKE_C_FLAGS='-Wall -Wno-unused-but-set-variable -Wno-unused-variable -Wno-unused-parameter'"
#FLAGS="$FLAGS -DENABLE_DAP=false"
#FLAGS="$FLAGS -DENABLE_NETCDF_4=false"
FLAGS="$FLAGS -DCMAKE_INSTALL_PREFIX=$USR/local"
#FLAGS="-DCMAKE_PREFIX_PATH=$PPATH"
#FLAGS="$FLAGS -DCMAKE_PREFIX_PATH=$PPATH"
FLAGS="$FLAGS -DENABLE_DAP_REMOTE_TESTS=true"
#FLAGS="$FLAGS -DENABLE_DAP_AUTH_TESTS=true"
rm -fr build
mkdir build
cd build
cmake $FLAGS ${ZLIB} ${HDF5} ${CURL} ..
cmake --build .
cmake --build . --target test
CTEST_OUTPUT_ON_FAILURE=1 cmake --build . --target test
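The rewritten build script above selects platform-specific library file names from the VS and CYGWIN flags before probing the filesystem. That selection logic, isolated as a small POSIX-sh sketch (the function wrapper, echoed result, and restriction to zlib/HDF5 are illustrative, not part of the commit):

```shell
# Sketch of the VS/CYGWIN library-name selection from the script above.
# The real script sets these inline; the function form is for illustration.
select_libnames() {
  VS="$1"; CYGWIN="$2"
  if test "x$VS" != x -a "x$CYGWIN" != x ; then
    ZLIB=cygz.dll;   H5LIB=cyghdf5.dll      # Visual Studio under Cygwin
  elif test "x$VS" = x -a "x$CYGWIN" != x ; then
    ZLIB=libz.dll.a; H5LIB=libhdf5.dll.a    # gcc under Cygwin
  elif test "x$VS" = x -a "x$CYGWIN" = x ; then
    ZLIB=libz.so;    H5LIB=libhdf5.so       # plain Unix
  else
    echo "cannot determine library names" >&2
    return 1
  fi
  echo "$ZLIB $H5LIB"
}
select_libnames "" ""        # plain Unix case → libz.so libhdf5.so
```

Making the file names explicit per platform is what lets the later search loop find Cygwin import libraries (`.dll.a`) that the old hard-coded `.so` probes always missed.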


@@ -107,7 +107,7 @@ ENDIF()
add_sh_test(ncdump tst_formatx3)
add_sh_test(ncdump tst_bom)
add_bin_test(ncdump tst_dimsizes)
add_sh_test(ncdump tst_dimsizes)
# The following test script invokes
# gcc directly.


@@ -1,9 +1,10 @@
# Test c output
T=tst_dimsizes
#CMD=valgrind --leak-check=full
CMD=gdb --args
#CMD=gdb --args
#MPI=1
LLP="LD_LIBRARY_PATH=/usr/local/lib"
ifndef MPI
CC=gcc
@@ -15,7 +16,7 @@ LDFLAGS=../liblib/.libs/libnetcdf.a -L/usr/local/lib -lhdf5_hl -lhdf5 -lz ../li
endif
all::
export CFLAGS; export LDFLAGS; \
export ${LLP}; export CFLAGS; export LDFLAGS; \
${CC} -o $T.exe ${CFLAGS} ${T}.c ${LDFLAGS}; \
${CMD} ./$T.exe


@@ -141,7 +141,8 @@ ref_tst_ncf213.cdl tst_h_scalar.sh \
run_utf8_nc4_tests.sh \
tst_formatx3.sh tst_formatx4.sh ref_tst_utf8_4.cdl \
tst_inttags.sh tst_inttags4.sh \
CMakeLists.txt XGetopt.c tst_bom.sh tst_inmemory_nc3.sh tst_inmemory_nc4.sh
CMakeLists.txt XGetopt.c tst_bom.sh tst_inmemory_nc3.sh \
tst_dimsizes.sh tst_inmemory_nc4.sh
# CDL files and Expected results
SUBDIRS=cdl expected


@@ -32,47 +32,47 @@ main(int argc, char **argv)
printf("\n*** Testing Max Dimension Sizes\n");
printf("\n|size_t|=%d\n",sizeof(size_t));
printf("\n|size_t|=%lu\n",(unsigned long)sizeof(size_t));
printf("\n*** Writing Max Dimension Size For NC_CLASSIC\n");
if ((stat=nc_create(FILECLASSIC, NC_CLOBBER, &ncid))) ERR;
if ((stat=nc_create(FILECLASSIC, NC_CLOBBER, &ncid))) ERRSTAT(stat);
dimsize = DIMMAXCLASSIC;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_CLASSIC\n");
if ((stat=nc_open(FILECLASSIC, NC_NOCLOBBER, &ncid))) ERR;
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERR;
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERR;
if ((stat=nc_open(FILECLASSIC, NC_NOCLOBBER, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAXCLASSIC) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Writing Max Dimension Size For NC_64BIT_OFFSET\n");
if ((stat=nc_create(FILE64OFFSET, NC_CLOBBER | NC_64BIT_OFFSET, &ncid))) ERR;
if ((stat=nc_create(FILE64OFFSET, NC_CLOBBER | NC_64BIT_OFFSET, &ncid))) ERRSTAT(stat);
dimsize = DIMMAX64OFFSET;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_64BIT_OFFSET\n");
if ((stat=nc_open(FILE64OFFSET, NC_NOCLOBBER|NC_64BIT_OFFSET, &ncid))) ERR;
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERR;
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERR;
if ((stat=nc_open(FILE64OFFSET, NC_NOCLOBBER|NC_64BIT_OFFSET, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAX64OFFSET) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
if(sizeof(size_t) == 8) {
printf("\n*** Writing Max Dimension Size For NC_64BIT_DATA\n");
if ((stat=nc_create(FILE64DATA, NC_CLOBBER | NC_64BIT_DATA, &ncid))) ERR;
if ((stat=nc_create(FILE64DATA, NC_CLOBBER | NC_64BIT_DATA, &ncid))) ERRSTAT(stat);
dimsize = (size_t)DIMMAX64DATA;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_def_dim(ncid, "testdim", dimsize, &dimid))) ERRSTAT(stat);
if ((stat=nc_close(ncid))) ERRSTAT(stat);
printf("\n*** Reading Max Dimension Size For NC_64BIT_DATA\n");
if ((stat=nc_open(FILE64DATA, NC_NOCLOBBER|NC_64BIT_DATA, &ncid))) ERR;
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERR;
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERR;
if ((stat=nc_open(FILE64DATA, NC_NOCLOBBER|NC_64BIT_DATA, &ncid))) ERRSTAT(stat);
if ((stat=nc_inq_dimid(ncid, "testdim", &dimid))) ERRSTAT(stat);
if ((stat=nc_inq_dimlen(ncid, dimid, &dimsize))) ERRSTAT(stat);
if(dimsize != DIMMAX64DATA) ERR;
if ((stat=nc_close(ncid))) ERR;
if ((stat=nc_close(ncid))) ERRSTAT(stat);
}
SUMMARIZE_ERR;

ncdump/tst_dimsizes.sh Normal file → Executable file

@@ -1,5 +1,9 @@
#!/bin/sh
echo "*** Test Maximum dimension sizes X mode"
set -x
if test "x$SETX" = x1 ; then echo "file=$0"; set -x ; fi
# This shell script tests max dimension sizes X mode
@@ -21,7 +25,7 @@ echo "*** Generate: tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_6
echo "*** Verify that ncdump can read dimsizes"
rm -fr ./tmp
if test ../ncdump/ncdump -h tst_dimsize_classic.nc > ./tmp ; then
if ../ncdump/ncdump -h tst_dimsize_classic.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_classic.nc"
else
echo "*** FAIL: ncdump tst_dimsize_classic.nc"
@@ -29,23 +33,25 @@ RETURN=1
fi
rm -fr ./tmp
if test ../ncdump/ncdump -h tst_dimsize_64offset.nc > ./tmp ; then
if ../ncdump/ncdump -h tst_dimsize_64offset.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_64offset.nc"
else
echo "*** FAIL: ncdump tst_dimsize_64offset.nc"
RETURN=1
fi
rm -fr ./tmp
if test ../ncdump/ncdump -h tst_dimsize_64data.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_64data.nc"
else
echo "*** FAIL: ncdump tst_dimsize_64data.nc"
RETURN=1
if test -f tst_dimsize_64data.nc ; then
rm -fr ./tmp
if ../ncdump/ncdump -h tst_dimsize_64data.nc > ./tmp ; then
echo "*** PASS: ncdump tst_dimsize_64data.nc"
else
echo "*** FAIL: ncdump tst_dimsize_64data.nc"
RETURN=1
fi
fi
# Cleanup
rm -f tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc
rm -f tmp tst_dimsize_classic.nc tst_dimsize_64offset.nc tst_dimsize_64data.nc
exit $RETURN
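The updated script above only checks the CDF-5 output file when it was actually produced (the generator skips it unless sizeof(size_t) is 8). A sketch of that guard pattern in isolation, with `cat` standing in for the real `../ncdump/ncdump -h` call and hypothetical file names:

```shell
# Guard pattern from tst_dimsizes.sh: validate an output file only when it
# exists, so a legitimately skipped product is not reported as a failure.
# 'cat' and the file names below are stand-ins for illustration.
RETURN=0
check_dump() {
  f="$1"
  if test -f "$f" ; then
    rm -fr ./tmp
    if cat "$f" > ./tmp ; then      # stand-in for: ../ncdump/ncdump -h "$f"
      echo "*** PASS: ncdump $f"
    else
      echo "*** FAIL: ncdump $f"
      RETURN=1
    fi
  fi
}
echo data > present.nc
check_dump present.nc    # runs and passes
check_dump missing.nc    # silently skipped, not a failure
rm -f tmp present.nc
echo "RETURN=$RETURN"
```

Note the functions run in the current shell, so `RETURN` survives for the final `exit $RETURN`, just as in the real script.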


@@ -47,6 +47,11 @@ gen_netcdf(const char *filename)
ngrps = listlength(grpdefs);
#endif /*USE_NETCDF4*/
/* Turn on logging */
#ifdef LOGGING
nc_set_log_level(ncloglevel);
#endif
/* create netCDF file, uses NC_CLOBBER mode */
cmode_modifier |= NC_CLOBBER;
#ifdef USE_NETCDF4
@@ -255,7 +260,8 @@ Generate type definitions
static void
genbin_deftype(Symbol* tsym)
{
int i,stat;
unsigned long i;
int stat;
ASSERT(tsym->objectclass == NC_TYPE);
switch (tsym->subclass) {
@@ -321,7 +327,7 @@ genbin_deftype(Symbol* tsym)
efield->typ.basetype->ncid);
} else {
int j;
int dimsizes[NC_MAX_VAR_DIMS];
int dimsizes[NC_MAX_VAR_DIMS]; /* int because inside compound */
/* Generate the field dimension constants*/
for(j=0;j<efield->typ.dimset.ndims;j++) {
unsigned int size = efield->typ.dimset.dimsyms[j]->dim.declsize;


@@ -159,6 +159,7 @@ extern int cdf5_flag; /* 1 => cdf-5 unsigned types in the parse */
extern int specials_flag; /* 1 => special attributes are present */
extern int usingclassic; /* 1 => k_flag == 1|2|5 */
extern int k_flag;
extern int ncloglevel;
/* Global data */


@@ -50,7 +50,7 @@ getfiller(Symbol* tvsym)
static void
fill(Symbol* tsym, Datalist* filler)
{
int i;
unsigned long i;
NCConstant con = nullconstant;
Datalist* sublist;
@@ -165,7 +165,7 @@ nc_getfill(NCConstant* value)
case NC_UINT64: value->value.uint64v = NC_FILL_UINT64; break;
case NC_STRING:
value->value.stringv.stringv = nulldup(NC_FILL_STRING);
value->value.stringv.len = strlen(NC_FILL_STRING);
value->value.stringv.len = (int)strlen(NC_FILL_STRING);
/* Exception: if string is null, then make it's length be 1 */
if(value->value.stringv.len == 0)
value->value.stringv.len = 1;


@@ -45,8 +45,8 @@ int cdf5_flag; /* 1 => cdf5 | maybe netcdf-4 */
int specials_flag; /* 1=> special attributes are present */
int usingclassic;
int cmode_modifier;
int diskless;
int ncloglevel;
char* binary_ext = ".nc";
@@ -236,15 +236,19 @@ main(
enhanced_flag = 0;
cdf5_flag = 0;
specials_flag = 0;
diskless = 0;
#ifdef LOGGING
ncloglevel = NC_TURN_OFF_LOGGING;
#else
ncloglevel = -1;
#endif
#if _CRAYMPP && 0
/* initialize CRAY MPP parallel-I/O library */
(void) par_io_init(32, 32);
#endif
while ((c = getopt(argc, argv, "134567bB:cdD:fhHk:l:M:no:Pv:x")) != EOF)
while ((c = getopt(argc, argv, "134567bB:cdD:fhHk:l:M:no:Pv:xL:")) != EOF)
switch(c) {
case 'd':
debug = 1;
@@ -304,6 +308,9 @@ main(
return(1);
}
}; break;
case 'L':
ncloglevel = atoi(optarg);
break;
case 'n': /* old version of -b, uses ".cdf" extension */
if(l_flag != 0) {
fprintf(stderr,"Please specify only one language\n");
@@ -465,7 +472,7 @@ main(
return 1;
case '\xEF':
/* skip the BOM */
fread(bom,1,1,fp);
(void)fread(bom,1,1,fp);
break;
default: /* legal printable char, presumably; rewind */
rewind(fp);


@@ -10,7 +10,7 @@
typedef struct Alignment {
char* typename;
int alignment;
unsigned int alignment;
} Alignment;
/* Define indices for every primitive C type */


@@ -174,7 +174,7 @@ Return NULL if symbol is not unique or not found at all.
static Symbol*
uniquetreelocate(Symbol* refsym, Symbol* root)
{
int i;
unsigned long i;
Symbol* sym = NULL;
/* search the root for matching name and major type*/
sym = lookupingroup(refsym->objectclass,refsym->name,root);
@@ -200,7 +200,7 @@ Compute the fqn for every top-level definition symbol
static void
computefqns(void)
{
int i,j;
unsigned long i,j;
/* Groups first */
for(i=0;i<listlength(grpdefs);i++) {
Symbol* sym = (Symbol*)listget(grpdefs,i);
@@ -259,7 +259,8 @@ computefqns(void)
static void
processtypes(void)
{
int i,j,keep,added;
unsigned long i,j;
int keep,added;
List* sorted = listnew(); /* hold re-ordered type set*/
/* Prime the walk by capturing the set*/
/* of types that are dependent on primitive types*/
@@ -352,7 +353,7 @@ static int
tagvlentypes(Symbol* tsym)
{
int tagged = 0;
int j;
unsigned long j;
switch (tsym->subclass) {
case NC_VLEN:
tagged = 1;
@@ -385,7 +386,7 @@ filltypecodes(void)
static void
processenums(void)
{
int i,j;
unsigned long i,j;
List* enumids = listnew();
for(i=0;i<listlength(typdefs);i++) {
Symbol* sym = (Symbol*)listget(typdefs,i);
@@ -419,7 +420,7 @@ processenums(void)
static void
processeconstrefs(void)
{
int i;
unsigned long i;
/* locate all the datalist and walk them recursively */
for(i=0;i<listlength(attdefs);i++) {
Symbol* att = (Symbol*)listget(attdefs,i);