Merge pull request #43 from nationalparkservice/irma_api_v6_dev
Irma api v6 dev
RobLBaker authored Sep 5, 2023
2 parents bd1a1aa + 78ea476 commit 7fc4479
Showing 33 changed files with 181 additions and 127 deletions.
6 changes: 3 additions & 3 deletions DESCRIPTION
@@ -1,7 +1,7 @@
Package: NPSutils
Type: Package
Title: Collection of Functions to read/write information from the NPS Data Store
Version: 0.2.0.3
Version: 0.3.0
Authors@R: c(
person(given = "Robert", family = "Baker", email = "robert_baker@nps.gov",
role = c("aut", "cre"),
@@ -36,5 +36,5 @@ Suggests:
knitr,
rmarkdown
VignetteBuilder: knitr
URL: https://github.com/RobLBaker/NPSutils
BugReports: https://github.com/RobLBaker/NPSutils/issues
URL: https://github.com/nationalparkservice/NPSutils
BugReports: https://github.com/nationalparkservice/NPSutils/issues
24 changes: 14 additions & 10 deletions NEWS.md
@@ -1,16 +1,20 @@
* added new functionality to `get_data_packages()`: it will now check whether a DataStore reference ID is valid and whether the reference is a data package. Substantial feedback is reported to the user when `force = FALSE`.
* added a new function `rm_local_packages()` which can delete one or more (or all) packages within a folder called "data" (where your packages should be if you downloaded them with `get_data_packages()`). This only deletes local copies and does not delete any data from DataStore.
* changed function name: `load_metadata()` is now `load_pkg_metadata()` so as not to conflict with `DPchecker::load_metadata()`
* updated all function names to be snake_case
* updated all file names to match function names
* updated `get_data_packages()` to (when `force = FALSE`) check for newer versions of requested data packages and prompt the user to download the newest version if they so choose.
* updated `get_data_packages()` to:
# NPSutils 0.3.0

* updated all DataStore API requests from v4/v5 to v6 (the units service remains at v2)
* added global variables for the base DataStore API URLs, plus helper functions in utils.R to access them
* added new functionality to `get_data_packages()`: it will now check whether a DataStore reference ID is valid and whether the reference is a data package. Substantial feedback is reported to the user when `force = FALSE`.
* added a new function `rm_local_packages()` which can delete one or more (or all) packages within a folder called "data" (where your packages should be if you downloaded them with `get_data_packages()`). This only deletes local copies and does not delete any data from DataStore.
* changed function name: `load_metadata()` is now `load_pkg_metadata()` so as not to conflict with `DPchecker::load_metadata()`
* updated all function names to be snake_case
* updated all file names to match function names
* updated `get_data_packages()` to (when `force = FALSE`) check for newer versions of requested data packages and prompt the user to download the newest version if they so choose.
* updated `get_data_packages()` to:
* include a `force` option. `force` defaults to `FALSE`, giving a verbose, interactive function; setting `force = TRUE` removes the interactive portions and suppresses all informative messages except critical errors.
* `get_data_packages()` now inspects the location the data packages are being written to. If a folder with the data package reference ID already exists, the function will ask whether to re-download and overwrite the existing data package (when `force = FALSE`)
* `get_data_packages()` now returns the full path to the data package folders, including the "/data" directory they are all in.
* update `get_park_taxon_refs()` to hit v5/rest services
* update documentation: make it clear that the taxon_code parameter in `get_park_taxon_refs()` is the IRMA taxon code, not the ITIS TSN.
* update documentation: make it explicit when `get_park_taxon_refs()` and `get_park_taxon_citations()` are hitting secure servers and warn users that the results may also need to be restricted.
* update `get_park_taxon_refs()` to hit v5/rest services
* update documentation: make it clear that the taxon_code parameter in `get_park_taxon_refs()` is the IRMA taxon code, not the ITIS TSN.
* update documentation: make it explicit when `get_park_taxon_refs()` and `get_park_taxon_citations()` are hitting secure servers and warn users that the results may also need to be restricted.
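The interactive-versus-scripted behavior described in the bullets above can be sketched as follows. This is an illustrative call, not code from the commit: the reference IDs are placeholders, and a real call needs the NPSutils package plus network access to DataStore (and, for secure references, an NPS VPN connection). The sketch is wrapped in `if (FALSE)` so it is not executed on source.

```r
# Illustrative sketch only; reference IDs are placeholders and the calls
# require NPSutils, network access to DataStore, and (for secure refs) VPN.
if (FALSE) {
  library(NPSutils)

  # Interactive mode (default): validates each reference ID, checks that it
  # points at a data package, offers any newer version, and asks before
  # overwriting an existing ./data/<reference_id>/ folder.
  paths <- get_data_packages(c(2272461, 2300498), force = FALSE)

  # Scripted mode: no prompts; informative messages are suppressed except
  # for critical errors.
  paths <- get_data_packages(c(2272461, 2300498), force = TRUE)

  # Either way, the return value is the full path to each package folder,
  # including the enclosing "data" directory.
  print(paths)
}
```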

# NPSutils 0.2.0.3
* added `map_wkt()` function to map points, polygons, or both from Well-Known Text (WKT) coordinates. WKT is used in place of GPS coordinates when sensitive species locations have been "fuzzed". In this case, providing a polygon rather than an exact (albeit fuzzed) point is preferable, as it makes clear that the location is not exact. WKT is an efficient way to store geographic shapes such as polygons in flat files such as .csv.
Expand Down
17 changes: 8 additions & 9 deletions R/getReferenceInfo.R
@@ -18,11 +18,11 @@
#' get_park_taxon_refs("APIS", 126749)
#' }
get_park_taxon_refs <- function(park_code, taxon_code) {
url <- paste(
"https://irmaservices.nps.gov/datastore-secure/v5/rest/UnitSpeciesSearch/",
park_code, "/", taxon_code,
sep = ""
)
url <- paste0(.ds_secure_api(),
"UnitSpeciesSearch/",
park_code,
"/",
taxon_code)
DSReference <- httr::content(httr::GET(
url, httr::authenticate(":", ":", "ntlm")
)) %>%
@@ -97,10 +97,9 @@ get_park_taxon_url <- function(park_code, taxon_code) {
#' get_ref_info(2266196, "Title")
#' }
get_ref_info <- function(holding_id, field) {
url <- paste0(
"https://irmaservices.nps.gov/datastore/v4/rest/Profile/",
holding_id
)
url <- paste0(.ds_api(),
"Profile/",
holding_id)
DSReference <- httr::content(httr::GET(
url, httr::authenticate(":", ":", "ntlm")
))
43 changes: 22 additions & 21 deletions R/get_data_packages.R
@@ -2,7 +2,7 @@
#'
#' @description `get_data_packages()` creates a directory called "data" in the current working directory (or user-specified location, see the `path` option). For each data package, it writes a new sub-directory within the "data" directory named with the corresponding data package reference ID. All the data package files are then copied to the directory specific to the data package.
#'
#' @detail In the default mode (force = FALSE), `get_data_packages()` will inform a user if the directory for that data package already exists and give the user the option to overwrite it (or skip downloading). In the default mode (force = FALSE), `get_data_packages()` will also check to see if there are newer versions of the requested data package and if there are newer versions, `get_data_packages()` will inform the user of when the requested data package was issued, when the newest data package was issued, and will then ask the user if they would like to download the newest version instead of the requested version. In the default mode (force = FALSE), `get_data_packages()` will warn a user if the reference ID supplied was not found (does not exist or requires VPN) and if a reference ID refers to a product that is not a data package, `get_data_packages()` will ask if the user wants to attempt to download it anyway.
#' @details In the default mode (force = FALSE), `get_data_packages()` will inform a user if the directory for that data package already exists and give the user the option to overwrite it (or skip downloading). In the default mode (force = FALSE), `get_data_packages()` will also check to see if there are newer versions of the requested data package and if there are newer versions, `get_data_packages()` will inform the user of when the requested data package was issued, when the newest data package was issued, and will then ask the user if they would like to download the newest version instead of the requested version. In the default mode (force = FALSE), `get_data_packages()` will warn a user if the reference ID supplied was not found (does not exist or requires VPN) and if a reference ID refers to a product that is not a data package, `get_data_packages()` will ask if the user wants to attempt to download it anyway.
#'
#' Although `get_data_packages()` is designed to handle the data package reference type on DataStore, it should in theory work on any reference type and download all files associated with a reference (it ignores web links/web resources associated with a reference). If the reference includes a .zip file, the file will be downloaded and the contents extracted. The original .zip archive file will then be deleted. If the .zip contains files with the same name as files in the parent directory, the parent directory files will be over-written by the contents of the .zip archive.
#'
@@ -52,9 +52,7 @@ get_data_packages <- function(reference_id,
#check for newer version:
if(force == FALSE){
cat("Working on: ", crayon::bold$green(reference_id[i]), ".\n", sep="")
url <- paste0(
"https://irmaservices.nps.gov/datastore-secure/v5/rest/ReferenceCodeSearch?q=",
reference_id[i])
url <- paste0(.ds_secure_api(), "ReferenceCodeSearch?q=", reference_id[i])
#api call to see if ref exists
test_req <- httr::GET(url, httr::authenticate(":", ":", "ntlm"))
status_code <- httr::stop_for_status(test_req)$status_code
@@ -89,8 +87,9 @@ get_data_packages <- function(reference_id,
#check for a newer version:
version <-ref_data$mostRecentVersion
if(!is.na(version)){
newest_url <- paste0(
"https://irmaservices.nps.gov/datastore-secure/v5/rest/ReferenceCodeSearch?q=", version)
newest_url <- paste0(.ds_secure_api(),
"ReferenceCodeSearch?q=",
version)
new_req <- httr::GET(newest_url, httr::authenticate(":", ":", "ntlm"))
new_status <- httr::stop_for_status(new_req)$status_code
if(!new_status == 200){
@@ -142,9 +141,10 @@ get_data_packages <- function(reference_id,
}

#get HoldingID from the ReferenceID - defaults to the first holding
rest_holding_info_url <- paste0(
"https://irmaservices.nps.gov/datastore-secure/v4/rest/reference/",
reference_id[i], "/DigitalFiles")
rest_holding_info_url <- paste0(.ds_secure_api(),
"reference/",
reference_id[i],
"/DigitalFiles")
xml <- suppressMessages(httr::content(httr::GET(rest_holding_info_url,
httr::authenticate(":", ":", "ntlm"))))

@@ -156,9 +156,9 @@ get_data_packages <- function(reference_id,
for(j in seq_along(xml)){
#get file URL
tryCatch(
{rest_download_url <- paste0(
"https://irmaservices.nps.gov/datastore-secure",
"/v4/rest/DownloadFile/",xml[[j]]$resourceId)},
{rest_download_url <- paste0(.ds_secure_api(),
"DownloadFile/",
xml[[j]]$resourceId)},
error = function(e){
cat(crayon::red$bold(
"ERROR: You do not have permissions to access ",
@@ -234,9 +234,9 @@ get_data_packages <- function(reference_id,
#check for newer version:
if(force == FALSE){
cat("Working on: ", crayon::bold$green(reference_id[i]), ".\n", sep="")
url <- paste0(
"https://irmaservices.nps.gov/datastore/v5/rest/ReferenceCodeSearch?q=",
reference_id[i])
url <- paste0(.ds_api(),
"ReferenceCodeSearch?q=",
reference_id[i])
#api call to see if ref exists
test_req <- httr::GET(url, httr::authenticate(":", ":", "ntlm"))
status_code <- httr::stop_for_status(test_req)$status_code
@@ -279,9 +279,9 @@ get_data_packages <- function(reference_id,
#Look for a newer version:
version <-ref_data$mostRecentVersion
if(!is.na(version)){
newest_url <- paste0(
"https://irmaservices.nps.gov/datastore/v5/rest/ReferenceCodeSearch?q=",
version)
newest_url <- paste0(.ds_api(),
"ReferenceCodeSearch?q=",
version)
new_req <- httr::GET(newest_url, httr::authenticate(":", ":", "ntlm"))
new_status <- httr::stop_for_status(new_req)$status_code
if(!new_status == 200){
@@ -333,9 +333,10 @@ get_data_packages <- function(reference_id,
dir.create(destination_dir)
}
# get the HoldingID from the ReferenceID
rest_holding_info_url <- paste0(
"https://irmaservices.nps.gov/datastore/v4/rest/reference/",
reference_id[i], "/DigitalFiles")
rest_holding_info_url <- paste0(.ds_api(),
"reference/",
reference_id[i],
"/DigitalFiles")
xml <- httr::content(httr::GET(rest_holding_info_url))

#test whether requires secure=TRUE & VPN; alert user:
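The newer-version check that these hunks refactor can be sketched as below. This is a simplified reconstruction using the public v6 endpoint shown in the diff, not the exact function body; the reference ID is a placeholder, it requires network access to irmaservices.nps.gov, and the exact shape of the JSON response is assumed from how `ref_data$mostRecentVersion` is used above.

```r
# Simplified, hypothetical reconstruction of the version check in
# get_data_packages(); requires the httr package and network access.
reference_id <- 2266196  # placeholder reference ID

base <- "https://irmaservices.nps.gov/datastore/v6/rest/"
url <- paste0(base, "ReferenceCodeSearch?q=", reference_id)

req <- httr::GET(url)
httr::stop_for_status(req)          # abort on any non-2xx response
ref_data <- httr::content(req)[[1]] # assumed: first (only) matching reference

# Assumed: mostRecentVersion is NULL/NA when this is already the newest version.
version <- ref_data$mostRecentVersion
if (!is.null(version) && !is.na(version)) {
  message("A newer version exists: reference ", version)
}
```

Centralizing this check behind `force = FALSE`, as the diff does, keeps the scripted (`force = TRUE`) path free of prompts while still protecting interactive users from stale downloads.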
18 changes: 18 additions & 0 deletions R/utils.R
@@ -0,0 +1,18 @@
## assign global package variables

#initiate new environment accessible from within package:
.pkgglobalenv <- new.env(parent=emptyenv())

#data_store API base URL:
assign("ds_api", "https://irmaservices.nps.gov/datastore/v6/rest/", envir=.pkgglobalenv)

#data_store secure API base URL:
assign("ds_secure_api", "https://irmaservices.nps.gov/datastore-secure/v6/rest/", envir=.pkgglobalenv)

.ds_api <- function(x){
get("ds_api", envir = .pkgglobalenv)
}

.ds_secure_api <- function(x){
get("ds_secure_api", envir = .pkgglobalenv)
}
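A self-contained sketch of how these accessors compose into request URLs, mirroring the `Profile` call in `getReferenceInfo.R` above. The `x` argument of the helpers is unused in the package code, so it is dropped here; the holding ID comes from the `get_ref_info()` example in this diff.

```r
# Package-level constants stored in a private environment, plus accessor
# helpers, then URL assembly as in get_ref_info() above.
.pkgglobalenv <- new.env(parent = emptyenv())
assign("ds_api",
       "https://irmaservices.nps.gov/datastore/v6/rest/",
       envir = .pkgglobalenv)
assign("ds_secure_api",
       "https://irmaservices.nps.gov/datastore-secure/v6/rest/",
       envir = .pkgglobalenv)

.ds_api        <- function() get("ds_api", envir = .pkgglobalenv)
.ds_secure_api <- function() get("ds_secure_api", envir = .pkgglobalenv)

holding_id <- 2266196  # example ID from the get_ref_info() documentation
url <- paste0(.ds_api(), "Profile/", holding_id)
# url is "https://irmaservices.nps.gov/datastore/v6/rest/Profile/2266196"
```

Centralizing the base URLs this way means the next API version bump touches one file instead of every request site, which is exactly what this commit's v4/v5-to-v6 migration required.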
3 changes: 2 additions & 1 deletion README.Rmd
Original file line number Diff line number Diff line change
@@ -18,7 +18,8 @@ knitr::opts_chunk$set(
[![CodeFactor](https://www.codefactor.io/repository/github/roblbaker/npsutils/badge)](https://www.codefactor.io/repository/github/roblbaker/npsutils)
<!-- badges: end -->

# NPS Data Store Utilities - NPSutils
# NPS Data Store Utilities - NPSutils
## v0.3.0

This package is a collection of functions to acquire metadata and data from the [National Park Service Data Store](https://irma.nps.gov/DataStore/). This is an early version of this package and many features will be added in 2023. Please request enhancements and bug fixes through [Issues](https://github.com/nationalparkservice/NPSutils/issues).
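A typical installation sketch to accompany the paragraph above. The diff does not state an installation method; this assumes the common GitHub-install workflow via the `remotes` package and is wrapped in `if (FALSE)` so it is not run on source.

```r
# Hypothetical install-and-load sketch (not part of the commit); assumes the
# remotes package and network access to GitHub.
if (FALSE) {
  install.packages("remotes")
  remotes::install_github("nationalparkservice/NPSutils")
  library(NPSutils)
}
```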

4 changes: 2 additions & 2 deletions docs/404.html

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions docs/LICENSE-text.html

4 changes: 2 additions & 2 deletions docs/LICENSE.html

0 comments on commit 7fc4479