ftp --> resources, part II #2924

Merged: 1 commit, May 14, 2024
13 changes: 7 additions & 6 deletions INSTALL.md
@@ -103,9 +103,10 @@ libraries. (And, optionally, the szlib library). Versions required are
 at least HDF5 1.8.9, zlib 1.2.5, and curl 7.18.0 or later.
 (Optionally, if building with szlib, get szip 2.0 or later.)

-HDF5 1.8.9 and zlib 1.2.7 packages are available from the <a
-href="ftp://ftp.unidata.ucar.edu/pub/netcdf/netcdf-4">netCDF-4 ftp
-site</a>. If you wish to use the remote data client code, then you
+These packages are available at:
+https://resources.unidata.ucar.edu/netcdf/netcdf-4/
+
+If you wish to use the remote data client code, then you
 will also need libcurl, which can be obtained from the <a
 href="http://curl.haxx.se/download.html">curl website</a>.

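A minimal sketch of fetching the dependencies from the relocated directory: the archive names below are assumptions (check the directory listing for the actual file names); only the base URL comes from the change above.

```sh
# Hypothetical tarball names under the new resources directory.
curl -fLO https://resources.unidata.ucar.edu/netcdf/netcdf-4/hdf5-1.8.9.tar.gz
curl -fLO https://resources.unidata.ucar.edu/netcdf/netcdf-4/zlib-1.2.7.tar.gz
```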
@@ -314,7 +315,7 @@ $ make check install
 If parallel I/O access to netCDF classic, 64-bit offset, CDF-5 files is
 also needed, the PnetCDF library should also be installed.
 (Note: the previously recommended <a
-href=ftp://ftp.unidata.ucar.edu/pub/netcdf/contrib/pnetcdf.h>replacement
+href="https://resources.unidata.ucar.edu/netcdf/contrib/pnetcdf.h">replacement
 pnetcdf.h</a> should no longer be used.) Then configure netCDF with the
 "--enable-pnetcdf" option.

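For reference, a hedged sketch of the configure step this note points at; the PnetCDF install prefix is a placeholder, and only --enable-pnetcdf is taken from the text above.

```sh
# /opt/pnetcdf is a placeholder; point CPPFLAGS/LDFLAGS at your real
# PnetCDF installation before configuring.
CC=mpicc CPPFLAGS=-I/opt/pnetcdf/include LDFLAGS=-L/opt/pnetcdf/lib \
  ./configure --enable-pnetcdf
make check install
```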
@@ -361,7 +362,7 @@ Note: --disable prefix indicates that the option is normally enabled.
 <tr><td>--enable-netcdf-4<td>build with netcdf-4<td>HDF5 and zlib
 <tr><td>--enable-netcdf4<td>synonym for enable-netcdf-4
 <tr><td>--enable-hdf4<td>build netcdf-4 with HDF4 read capability<td>HDF4, HDF5 and zlib
-<tr><td>--enable-hdf4-file-tests<td>test ability to read HDF4 files<td>selected HDF4 files from Unidata ftp site
+<tr><td>--enable-hdf4-file-tests<td>test ability to read HDF4 files<td>selected HDF4 files from Unidata resources site
 <tr><td>--enable-pnetcdf<td>build netcdf-4 with parallel I/O for classic, 64-bit offset, and CDF-5 files using PnetCDF
 <tr><td>--enable-extra-example-tests<td>Run extra example tests<td>--enable-netcdf-4,GNU sed
 <tr><td>--enable-parallel-tests <td>run extra parallel IO tests<td>--enable-netcdf-4, parallel IO support
@@ -384,7 +385,7 @@ Note: --disable prefix indicates that the option is normally enabled.
 The benchmarks are a
 bunch of extra tests, which are timed. We use these
 tests to check netCDF performance.
-<td>sample data files from the Unidata ftp site
+<td>sample data files from the Unidata resources site
 <tr><td>--disable-extreme-numbers
 <td>don't use extreme numbers during testing, such as MAX_INT - 1<td>
 <tr><td>--enable-dll<td>build a win32 DLL<td>mingw compiler
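A hedged example combining the table options that pull sample data from the resources site; the exact flag combination is illustrative, but each option shown appears in the table above, and the downloads happen at test time.

```sh
# Sketch only: HDF4 read support, the HDF4 sample-file tests, and the
# timed benchmarks; sample files are fetched over HTTPS during `make check`.
./configure --enable-hdf4 --enable-hdf4-file-tests --enable-benchmarks
make
make check
```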
2 changes: 1 addition & 1 deletion cmake/dependencies.cmake
@@ -69,7 +69,7 @@ if(NETCDF_ENABLE_HDF4)
 message(STATUS "Found JPEG libraries: ${JPEG_LIB}")

 # Option to enable HDF4 file tests.
-option(NETCDF_ENABLE_HDF4_FILE_TESTS "Run HDF4 file tests. This fetches sample HDF4 files from the Unidata ftp site to test with (requires curl)." ON)
+option(NETCDF_ENABLE_HDF4_FILE_TESTS "Run HDF4 file tests. This fetches sample HDF4 files from the Unidata resources site to test with (requires curl)." ON)
 if(NETCDF_ENABLE_HDF4_FILE_TESTS)
 find_program(PROG_CURL NAMES curl)
 if(PROG_CURL)
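The corresponding CMake invocation might look like the sketch below; the out-of-source build layout and ctest usage are assumptions, while the two option names come from the hunk above (the file-test option needs curl on PATH).

```sh
# Assumed layout: configure into ./build, then build and run the tests,
# which download the sample HDF4 files via curl.
cmake -S . -B build -DNETCDF_ENABLE_HDF4=ON -DNETCDF_ENABLE_HDF4_FILE_TESTS=ON
cmake --build build
ctest --test-dir build
```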
6 changes: 3 additions & 3 deletions configure.ac
@@ -354,9 +354,9 @@ AC_MSG_RESULT([$enable_dynamic_loading])


 # Does the user want to turn on extra HDF4 file tests?
-AC_MSG_CHECKING([whether to fetch some sample HDF4 files from Unidata ftp site to test HDF4 reading (requires wget)])
+AC_MSG_CHECKING([whether to fetch some sample HDF4 files from Unidata resources site to test HDF4 reading (requires wget)])
 AC_ARG_ENABLE([hdf4-file-tests], [AS_HELP_STRING([--enable-hdf4-file-tests],
-[get some HDF4 files from Unidata ftp site and test that they can be read])])
+[get some HDF4 files from Unidata resources site and test that they can be read])])
 test "x$enable_hdf4" = xyes -a "x$enable_hdf4_file_tests" = xyes || enable_hdf4_file_tests=no
 if test "x$enable_hdf4_file_tests" = xyes; then
 AC_DEFINE([USE_HDF4_FILE_TESTS], 1, [If true, use use wget to fetch some sample HDF4 data, and then test against it.])
@@ -1096,7 +1096,7 @@ fi
 AC_MSG_CHECKING([whether benchmarks should be run])
 AC_ARG_ENABLE([benchmarks],
 [AS_HELP_STRING([--enable-benchmarks],
-[Run benchmarks. This will cause sample data files from the Unidata ftp
+[Run benchmarks. This will cause sample data files from the Unidata resources
 site to be fetched. The benchmarks are a bunch of extra tests, which
 are timed. We use these tests to check netCDF performance.])])
 test "x$enable_benchmarks" = xyes || enable_benchmarks=no
2 changes: 1 addition & 1 deletion docs/attribute_conventions.md
@@ -115,7 +115,7 @@ It is strongly recommended that applicable conventions be followed unless there

 `Conventions`

-> If present, 'Conventions' is a global attribute that is a character array for the name of the conventions followed by the dataset. Originally, these conventions were named by a string that was interpreted as a directory name relative to the directory /pub/netcdf/Conventions/ on the now defunct host ftp.unidata.ucar.edu. The web page https://www.unidata.ucar.edu/netcdf/conventions.html is now the preferred and authoritative location for registering a URI reference to a set of conventions maintained elsewhere. Authors of new conventions should submit a request to support-netcdf@unidata.ucar.edu for listing on the Unidata conventions web page.
+> If present, 'Conventions' is a global attribute that is a character array for the name of the conventions followed by the dataset. Originally, these conventions were named by a string that was interpreted as a directory name relative to the directory /pub/netcdf/Conventions/ on the now defunct ftp host. The web page https://www.unidata.ucar.edu/netcdf/conventions.html is now the preferred and authoritative location for registering a URI reference to a set of conventions maintained elsewhere. Authors of new conventions should submit a request to support-netcdf@unidata.ucar.edu for listing on the Unidata conventions web page.

 <p>

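To illustrate the attribute being documented, a Conventions global attribute can be declared in CDL and written with ncgen; the file name and the "CF-1.8" value are examples only, not something this page prescribes.

```sh
# example.cdl and CF-1.8 are illustrative choices.
cat > example.cdl <<'EOF'
netcdf example {
dimensions:
    time = UNLIMITED ;
variables:
    double time(time) ;

// global attributes:
    :Conventions = "CF-1.8" ;
}
EOF
ncgen -o example.nc example.cdl
```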
2 changes: 1 addition & 1 deletion docs/static-pages/usage.html
@@ -28,7 +28,7 @@ <h1>Where is NetCDF Used?</h1>
 recently made some use of netCDF, based on
 <ol>
 <li>
-downloads from the Unidata site (ftp and http)
+downloads from the Unidata downloads site
 </li>
 <li>
 subscribers and posters to netCDF mailing lists
4 changes: 2 additions & 2 deletions hdf4_test/run_get_hdf4_files.sh
@@ -14,7 +14,7 @@ if test "x$srcdir" = x ; then srcdir=`pwd`; fi

 # Get a file from the resources site; retry several times
 getfile() {
-DATAFILE="https://resources.unidata.ucar.edu/sample_data/hdf4/$1.gz"
+DATAFILE="https://resources.unidata.ucar.edu/netcdf/sample_data/hdf4/$1.gz"

 for try in 1 2 3 4 ; do # try 4 times

@@ -30,7 +30,7 @@ getfile() {

 set -e
 echo ""
-echo "Getting HDF4 sample files from Unidata FTP site..."
+echo "Getting HDF4 sample files from Unidata resources site..."

 file_list="AMSR_E_L2_Rain_V10_200905312326_A.hdf AMSR_E_L3_DailyLand_V06_20020619.hdf \
 MYD29.A2009152.0000.005.2009153124331.hdf MYD29.A2002185.0000.005.2007160150627.hdf \
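The hunks above only show the top of getfile(), so here is a minimal sketch of the retry pattern it implements; the download command, back-off, and error handling are assumptions, since the loop body is elided in the diff.

```sh
# Sketch, not the script's actual body: fetch $1.gz from the resources
# site, retrying up to four times before giving up.
getfile() {
    DATAFILE="https://resources.unidata.ucar.edu/netcdf/sample_data/hdf4/$1.gz"
    for try in 1 2 3 4 ; do            # try 4 times
        if curl -fsSL -o "$1.gz" "$DATAFILE" ; then
            return 0
        fi
        sleep 5                        # assumed back-off between attempts
    done
    echo "could not fetch $1.gz" >&2
    return 1
}
```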
2 changes: 1 addition & 1 deletion nc_perf/run_knmi_bm.sh
@@ -15,7 +15,7 @@ echo "Getting KNMI test files $file_list"
 for f1 in $file_list
 do
 if ! test -f $f1; then
-wget https://resources.unidata.ucar.edu/sample_data/$f1.gz
+wget https://resources.unidata.ucar.edu/netcdf/sample_data/$f1.gz
 gunzip $f1.gz
 fi
 done