Update develop-ref after #2846 (#2854)
* #2673 Moved variable declaration after include

* #2673 Moved the namespace declaration down below the include statements

* Feature #2395 wdir (#2820)

* Per #2395, add new columns to VL1L2, VAL1L2, and VCNT line types for wind direction statistics. Work still in progress.

* Per #2395, write the new VCNT columns to the output and document the additions to the VL1L2, VAL1L2, and VCNT columns.

* Per #2395, add the definition of new statistics to Appendix G.

* Per #2395, update file version history.

* Per #2395, tweak warning message about zero wind vectors and update grid-stat and point-stat to log calls to the do_vl1l2() function.

* Per #2395, refine the weights for wind direction stats, ignoring the undefined directions.
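
As an aside, the sketch below illustrates the kind of wind direction handling these commits describe: direction is undefined for zero (calm) wind vectors, and forecast-minus-observation direction errors are wrapped into [-180, 180] degrees. It is illustrative only; the function names and conventions are assumptions, not MET source.

#include <cmath>
#include <optional>

// Wind direction (degrees the wind blows from) is undefined for a zero wind
// vector; such calm pairs would be skipped so they carry no weight in the
// wind direction statistics.
std::optional<double> wind_direction_deg(double u, double v) {
   if (u == 0.0 && v == 0.0) return std::nullopt;
   double dir = std::atan2(-u, -v) * 180.0 / M_PI;   // meteorological convention
   return (dir < 0.0 ? dir + 360.0 : dir);
}

// Smallest signed angular difference between forecast and observed direction,
// wrapped into [-180, 180] degrees before averaging into error statistics.
double dir_error_deg(double fdir, double odir) {
   double d = std::fmod(fdir - odir, 360.0);
   if (d >  180.0) d -= 360.0;
   if (d < -180.0) d += 360.0;
   return d;
}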

* Update src/tools/core/stat_analysis/aggr_stat_line.cc

* Update src/tools/core/stat_analysis/parse_stat_line.cc

* Update src/tools/core/stat_analysis/aggr_stat_line.cc

* Recent changes to branch protection rules for the develop branch have broken the logic of the update_truth.yml GHA workflow. Instead of submitting a PR to merge develop into develop-ref directly, use an intermediate update_truth_for_develop branch.

* Feature #2280 ens_prob (#2823)

* Per #2280, update to support probability threshold strings like ==8, where 8 is the number of ensemble members, to create probability bins centered on n/8 for n = 0 ... 8.
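
For illustration only, a small self-contained sketch of how probability bins centered on n/N could be constructed for an "==N" threshold string is shown below; the function name and the handling of the outer half-bins are assumptions, not MET source.

#include <iostream>
#include <vector>

// Bin edges for probabilities centered on n/N, n = 0..N: interior edges fall
// halfway between consecutive centers; the outer edges are clamped to 0 and 1.
std::vector<double> prob_bin_edges(int n_members) {
   std::vector<double> edges;
   const double half = 0.5 / n_members;
   edges.push_back(0.0);
   for (int n = 1; n <= n_members; ++n) {
      edges.push_back(static_cast<double>(n) / n_members - half);
   }
   edges.push_back(1.0);
   return edges;
}

int main() {
   for (double e : prob_bin_edges(8)) std::cout << e << "\n";   // the ==8 case
   return 0;
}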

* Per #2280, update docs about probability threshold settings.

* Per #2280, use a loose tolerance when checking for consistent bin widths.

* Per #2280, add a new unit test for grid_stat to demonstrate processing the output from gen_ens_prod.

* Per #2280, when verifying NMEP probability forecasts, smooth the obs data first.

* Per #2280, only request STAT output for the PCT line type to match unit_grid_stat.xml and minimize the new output files.

* Per #2280, update config option docs.

* Per #2280, update config option docs.

* #2673 Change 0 to nullptr

* #2673 Change 0 to nullptr

* #2673 Change 0 to nullptr

* #2673 Change 0 to nullptr

* #2673 Change 0 to nullptr

* #2673 Removed the redundant parentheses with return

* #2673 Removed the redundant parentheses with return

* #2673 Removed the redundant parentheses with return

* #2673 Removed the redundant parentheses with return

* #2673 Removed the redundant parentheses with return

* #2673 restored return statement

* #2673 Added std namespace

* #2673 Moved down 'using namespace' statement. Removed trailing spaces

* #2673 Moved down 'using namespace' statement.

* #2673 Moved down 'using namespace' statement.

* #2673 Moved down 'using namespace' statement.

* #2673 Moved down 'using namespace' statement.

* #2673 Added std namespace

* #2673 Added std namespace

* #2673 Added std namespace

* #2673 Changed literal 1 to boolean value, true

* Feature #2673 enum_to_string (#2835)

* Feature #2583 ecnt (#2825)

* Unrelated to #2583, fix typo in code comments.

* Per #2583, add hooks to write 3 new ECNT columns for observation error data.

* Per #2583, make error messages about mis-matched array lengths more informative.

* Per #2583, switch to more concise variable naming conventions of ign_oerr_cnv, ign_oerr_cor, and dawid_seb.

* Per #2583, fix typo to enable compilation

* Per #2583, define the 5 new ECNT column names.

* Per #2583, add 5 new columns to the ECNT table in the Ensemble-Stat chapter

* Per #2583, update stat_columns.cc to write these 5 new ECNT columns

* Per #2583, update ECNTInfo class to compute the 5 new ECNT statistics.

* Per #2583, update stat-analysis to parse the 5 new ECNT columns.

* Per #2583, update aggregate_stat logic for 5 new ECNT columns.

* Per #2583, update PairDataEnsemble logic for 5 new ECNT columns

* Per #2583, update vx_statistics library with obs_error handling logic for the 5 new ECNT columns

* Per #2583, changes to make it compile

* Per #2583, changes to make it compile

* Per #2583, switch to a consistent ECNT column naming convention with OERR at the end. Using IGN_CONV_OERR and IGN_CORR_OERR.

* Per #2583, define ObsErrorEntry::variance() with a call to the dist_var() utility function.

* Per #2583, update PairDataEnsemble::compute_pair_vals() to compute the 5 new stats with the correct inputs.

* Per #2583, add DEBUG(10) log messages about computing these new stats.

* Per #2583, update Stat-Analysis to compute these 5 new stats from the ORANK line type.

* Per #2583, whitespace and comments.

* Per #2583, update the User's Guide.

* Per #2583, remove the DS_ADD_OERR and DS_MULT_OERR ECNT columns and rename DS_OERR as DSS, since observation error is not actually involved in its computation.

* Per #2583, minor update to Appendix C

* Per #2583, rename ECNT line type statistic DSS to IDSS.

* Per #2583, fix a couple of typos

* Per #2583, more error checking.

* Per #2583, remove the ECNT IDSS column since it's just 2*pi*IGN, the existing ignorance score, and only provides meaningful information when combined with the other Dawid-Sebastiani statistics that have already been removed.

* Per #2583, add Eric's documentation of these new stats to Appendix C. Along the way, update the DOI links in the references based on this APA style guide: https://apastyle.apa.org/style-grammar-guidelines/references/dois-urls#:~:text=Include%20a%20DOI%20for%20all,URL%2C%20include%20only%20the%20DOI.

* Per #2583, fix new equations with embedded underscores for PDF by defining both html and pdf formatting options.

* Per #2583, update the ign_conv_oerr equation to include a 2*pi multiplier for consistency with the existing ignorance score. Also, fix the documented equations.

* Per #2583, remove log file that was inadvertently added on this branch.

* Per #2583, simplify ObsErrorEntry::variance() implementation. For the distribution type of NONE, return a variance of 0.0 rather than bad data, as discussed with @michelleharrold and @JeffBeck-NOAA on 3/8/2024.

---------

Co-authored-by: MET Tools Test Account <met_test@seneca.rap.ucar.edu>

* Revert #2825 since more documentation and testing is needed (#2837)

This reverts commit 108a895.

* Feature #2583 ecnt fix IGN_OERR_CORR (#2838)

* Unrelated to #2583, fix typo in code comments.

* Per #2583, add hooks to write 3 new ECNT columns for observation error data.

* Per #2583, make error messages about mis-matched array lengths more informative.

* Per #2583, switch to more concise variable naming conventions of ign_oerr_cnv, ign_oerr_cor, and dawid_seb.

* Per #2583, fix typo to enable compilation

* Per #2583, define the 5 new ECNT column names.

* Per #2583, add 5 new columns to the ECNT table in the Ensemble-Stat chapter

* Per #2583, update stat_columns.cc to write these 5 new ECNT columns

* Per #2583, update ECNTInfo class to compute the 5 new ECNT statistics.

* Per #2583, update stat-analysis to parse the 5 new ECNT columns.

* Per #2583, update aggregate_stat logic for 5 new ECNT columns.

* Per #2583, update PairDataEnsemble logic for 5 new ECNT columns

* Per #2583, update vx_statistics library with obs_error handling logic for the 5 new ECNT columns

* Per #2583, changes to make it compile

* Per #2583, changes to make it compile

* Per #2583, switch to a consistent ECNT column naming convention with OERR at the end. Using IGN_CONV_OERR and IGN_CORR_OERR.

* Per #2583, define ObsErrorEntry::variance() with a call to the dist_var() utility function.

* Per #2583, update PairDataEnsemble::compute_pair_vals() to compute the 5 new stats with the correct inputs.

* Per #2583, add DEBUG(10) log messages about computing these new stats.

* Per #2583, update Stat-Analysis to compute these 5 new stats from the ORANK line type.

* Per #2583, whitespace and comments.

* Per #2583, update the User's Guide.

* Per #2583, remove the DS_ADD_OERR and DS_MULT_OERR ECNT columns and rename DS_OERR as DSS, since observation error is not actually involved in its computation.

* Per #2583, minor update to Appendix C

* Per #2583, rename ECNT line type statistic DSS to IDSS.

* Per #2583, fix a couple of typos

* Per #2583, more error checking.

* Per #2583, remove the ECNT IDSS column since it's just 2*pi*IGN, the existing ignorance score, and only provides meaningful information when combined with the other Dawid-Sebastiani statistics that have already been removed.

* Per #2583, add Eric's documentation of these new stats to Appendix C. Along the way, update the DOI links in the references based on this APA style guide: https://apastyle.apa.org/style-grammar-guidelines/references/dois-urls#:~:text=Include%20a%20DOI%20for%20all,URL%2C%20include%20only%20the%20DOI.

* Per #2583, fix new equations with embedded underscores for PDF by defining both html and pdf formatting options.

* Per #2583, update the ign_conv_oerr equation to include a 2*pi multiplier for consistency with the existing ignorance score. Also, fix the documented equations.
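
For reference, assuming a Gaussian ensemble predictive distribution with mean mu and variance sigma^2, and Gaussian observation error with variance sigma_o^2, the ignorance score of the convolved distribution evaluated at an observation y takes the form below; this is a plausible sketch consistent with the 2*pi multiplier mentioned above, not necessarily MET's exact definition of IGN_CONV_OERR.

\mathrm{IGN\_CONV\_OERR}
   = \frac{1}{2}\,\ln\!\bigl(2\pi\,(\sigma^{2} + \sigma_{o}^{2})\bigr)
   + \frac{(y - \mu)^{2}}{2\,(\sigma^{2} + \sigma_{o}^{2})}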

* Per #2583, remove log file that was inadvertently added on this branch.

* Per #2583, simplify ObsErrorEntry::variance() implementation. For the distribution type of NONE, return a variance of 0.0 rather than bad data, as discussed with @michelleharrold and @JeffBeck-NOAA on 3/8/2024.

* Per #2583, updates to ensemble-stat.rst recommended by @michelleharrold and @JeffBeck-NOAA.

* Per #2583, implement corrections to the IGN_CORR_OERR statistic as directed by @ericgilleland.
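
A minimal, self-contained sketch of the variance() behavior described above is shown below; the type names, members, and the dist_var()-style helper are stand-ins, not MET source.

enum class DistType { None, Normal, Uniform };   // stand-in for MET's distribution types

// Stand-in for a dist_var()-style utility returning the variance of a distribution.
double dist_var(DistType type, double p1, double p2) {
   switch (type) {
      case DistType::Normal:  return p1 * p1;                      // p1 = sigma
      case DistType::Uniform: return (p2 - p1) * (p2 - p1) / 12.0; // variance of U(p1, p2)
      default:                return 0.0;
   }
}

struct ObsErrorEntry {
   DistType dist_type = DistType::None;
   double parm1 = 0.0;
   double parm2 = 0.0;

   // For a distribution type of NONE, report zero variance rather than a
   // bad-data value, so downstream statistics remain well defined.
   double variance() const {
      if (dist_type == DistType::None) return 0.0;
      return dist_var(dist_type, parm1, parm2);
   }
};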

---------

Co-authored-by: MET Tools Test Account <met_test@seneca.rap.ucar.edu>

* Update the pull request template to include a question about expected impacts to existing METplus Use Cases.

* #2830 Changed enum Builtin to enum class

* #2830 Converted enum to enum class at config_constants.h

* Feature #2830 bootstrap enum (#2843)

* Bugfix #2833 develop azimuth (#2840)

* Per #2833, fix n-1 bug when defining the azimuth delta for range/azimuth grids.

* Per #2833, when defining TcrmwData::range_max_km, divide by n_range - 1 since the range values start at 0.

* Per #2833, remove max_range_km from the TC-RMW config file. Set the default rmw_scale to NA so that it's not used by default, and update the documentation. The logic of the code still needs to be updated to work as intended.

* Per #2833, update tc_rmw to define the range as either a function of rmw or using explicit spacing in km.

* Per #2833, update the TCRMW Config files to remove the max_range_km entry, and update the unit test for one call to use RMW ranges and the other to use ranges defined in kilometers.

* Per #2833, just correct code comments.

* Per #2833, divide by n - 1 when computing the range delta, rather than n.

* Per #2833, correct the handling of the maximum range in the tc-rmw tool. For fixed delta km, need to define the max range when setting up the grid at the beginning.
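
The off-by-one fix described above amounts to dividing by n_range - 1 rather than n_range when computing the range spacing, since the range values start at 0; a minimal illustrative sketch (not the actual TC-RMW code) follows.

#include <vector>

// With ranges starting at 0 km, n_range points span range_max_km in
// (n_range - 1) steps, so the spacing divides by n_range - 1, not n_range.
std::vector<double> range_values_km(int n_range, double range_max_km) {
   const double delta_range_km = range_max_km / (n_range - 1);
   std::vector<double> ranges(n_range);
   for (int i = 0; i < n_range; ++i) ranges[i] = i * delta_range_km;
   return ranges;   // ranges.front() == 0.0, ranges.back() == range_max_km
}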

---------

Co-authored-by: MET Tools Test Account <met_test@seneca.rap.ucar.edu>

* #2830 Changed enum PadSize to enum class

* #2830 Removed redundant parentheses

* #2830 Removed commented-out code

* #2830 Use auto

* #2830 Changed enum to enum class for DistType, InterpMthd, GridTemplates, and NormalizeType

* #2830 Moved enum_class_as_integer from header file to cc files

* #2830 Added enum_as_int.hpp

* #2830 Added enum_as_int.hpp

* Renamed enum_class_as_integer to enum_class_as_int

* Removed redundant parentheses

* #2830 Changed enum to enum class

* #2830 Changed enum_class_as_integer to enum_class_as_int
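
To illustrate the pattern behind these commits, the sketch below converts a plain enum to an enum class and uses a small helper to recover the integer value where one is still needed; the names here (PadSize, enum_class_as_int) mirror those mentioned above, but the definitions are assumptions, not the actual MET declarations.

#include <iostream>
#include <type_traits>

enum class PadSize { None, Small, Large };   // was: enum PadSize { ... };

// Assumed shape of an enum_class_as_int()-style helper for scoped enums.
template <typename E>
constexpr int enum_class_as_int(E e) {
   return static_cast<int>(static_cast<std::underlying_type_t<E>>(e));
}

int main() {
   PadSize pad = PadSize::Small;
   std::cout << enum_class_as_int(pad) << "\n";   // prints 1
   return 0;
}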

* Feature #2379 sonarqube gha (#2847)

* Per #2379, testing initial GHA SonarQube setup.

* Per #2379, switch to only analyzing the src directory.

* Per #2379, move more config logic from sonar-project.properties into the workflow. #ci-skip-all

* Per #2379, try removing + symbols

* Per #2379, move projectKey into xml workflow and remove sonar-project.properties.

* Per #2379, try following the instructions at https://github.com/sonarsource-cfamily-examples/linux-autotools-gh-actions-sq/blob/main/.github/workflows/build.yml ci-skip-all

* Per #2379, see details of progress described in this issue comment: #2379 (comment)

* Unrelated to #2379, just removing spurious space that gets flagged as a diff when re-running enum_to_string on seneca.

* Per #2379, try running SonarQube through GitHub.

* Per #2379, remove empty env section and also disable the testing workflow temporarily during sonarqube development.

* Per #2379, fix docker image name.

* Per #2379, delete unneeded script.

* Per #2379, update GHA to scan Python code and push to the correct SonarQube projects.

* Per #2379, update GHA SonarQube project names

* Per #2379, update the build job name

* Per #2379, update the compile step name

* Per #2379, switch to consistent SONAR variable names.

* Per #2379, fix typo in sed expressions.

* Per #2379, just rename the log artifact

* Per #2379, use time_command wrapper instead of run_command.

* Per #2379, fix bad env var name

* Per #2379, switch from egrep to grep.

* Per #2379, just try cat-ting the logfile

* Per #2379, test whether cat-ting the log file actually works.

* Per #2379, revert back

* Per #2379, mention SonarQube in the PR template. Make workflow name more succinct.

* Per #2379, add SONAR_REFERENCE_BRANCH setting to define the sonar.newCode.referenceBranch property. The goal is to define the comparison reference branch for each SonarQube scan.

* Per #2379, have the sonarqube.yml job print the reference branch it's using

* Per #2379, intentionally introduce a new code smell to see if SonarQube correctly flags it as appearing in new code.

* Per #2379, try adding the SonarQube quality gate check.

* Per #2379, add logic for using the report-task.txt output files to check the quality gate status for both the python and cxx scans.

* Per #2379, must use unique GHA IDs

* Per #2379, working on syntax for quality gate checks

* Per #2379, try again.

* Per #2379, try again

* Per #2379, try again

* Per #2379, try again

* Per #2379, try again

* Per #2379, try again

* Per #2379, try yet again

* Per #2379

* Per #2379, add more debug

* Per #2379, remove -it option from docker run commands

* Per #2379, again

* Per #2379, now that the scan works as expected, remove the intentional SonarQube code smell as well as debug logging.

* Hotfix related to #2379. The sonar.newCode.referenceBranch and sonar.branch.name cannot be set to the same string! Only add the newCode definition when they differ.

* #2830 Changed enum STATJobType to enum class

* #2830 Changed STATLineType to enum class

* #2830 Changed Action to enum class

* #2830 Changed ModeDataType to enum class

* #2830 Changed StepCase to enum class

* #2830 Changed enum to enum class

* #2830 Changed GenesisPairCategory to enum class

* #2830 Removed redundant parentheses

* #2830 Consolidated duplicate if checks

* #2830 Cleanup

* #2830 Use empty() instead of length checking

* #2830 Adjusted indentations

* Feature #2379 develop sonarqube updates (#2850)

* Per #2379, move rgb2ctable.py into the python utility scripts directory for better organization and to enable convenient SonarQube scanning.

* Per #2379, remove point.py from the vx_python3_utils directory, which clearly was inadvertently added during development 4 years ago. As far as I can tell it isn't being called by any other code and doesn't belong in the repository. Note that scripts/python/met/point.py has the same name but is entirely different.

* Per #2379, update the GHA SonarQube scan to do a single one with Python and C++ combined. The nightly build script is still doing 2 separate scans for now. If this all works well, they could also be combined into a single one.

* Per #2379, eliminate MET_CONFIG_OPTIONS from the SonarQube workflow since it doesn't need to be and probably shouldn't be configurable.

* Per #2379, trying to copy report-task.txt out of the image

* Per #2379, update build_met_sonarqube.sh to check the scan return status

* Per #2379, fix bash assignment syntax

* Per #2379, remove unused SCRIPT_DIR envvar

* Per #2379, switch to a single SonarQube scan for MET's nightly build as well

* Feature 2654 ascii2nc polar buoy support (#2846)

* Added IABP data type, and modified file_handler to filter based on a time range, which was added as a command line option

* handle time using input year, hour, min, and doy

* cleanup and switch to position day of year for time computations
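
As a sketch of the day-of-year time handling mentioned above (illustrative only; the function name and the use of the POSIX timegm() extension are assumptions, not the actual iabp_handler code):

#include <ctime>

// Build a UTC valid time from the year, day-of-year, hour, and minute fields
// of an IABP record; out-of-range tm_mday values are normalized by timegm().
time_t iabp_valid_time(int year, int doy, int hour, int minute) {
   std::tm t = {};
   t.tm_year = year - 1900;
   t.tm_mon  = 0;        // January
   t.tm_mday = doy;      // day-of-year, normalized into month/day
   t.tm_hour = hour;
   t.tm_min  = minute;
   return timegm(&t);    // POSIX/glibc extension: interpret fields as UTC
}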

* Added an ascii2nc unit test for iabp data

* Added utility scripts to pull iabp data from the web and find files in a time range

* Modified iabp_handler to always output a placeholder 'location' observation with value 1

* added description of IABP data python utility scripts

* Fixed syntax error

* Fixed another syntax error.

* Slight reformat of documentation

* Per #2654, update the Makefiles in scripts/python/utility to include all the python scripts that should be installed.

* Per #2654, remove unused code from get_iabp_from_web.py that is getting flagged as a bug by SonarQube.

* Per #2654, fix typo in docs

---------

Co-authored-by: John Halley Gotway <johnhg@ucar.edu>
Co-authored-by: MET Tools Test Account <met_test@seneca.rap.ucar.edu>

---------

Co-authored-by: Howard Soh <hsoh@seneca.rap.ucar.edu>
Co-authored-by: John Halley Gotway <johnhg@ucar.edu>
Co-authored-by: Howard Soh <hsoh@ucar.edu>
Co-authored-by: MET Tools Test Account <met_test@seneca.rap.ucar.edu>
Co-authored-by: davidalbo <dave@ucar.edu>
Co-authored-by: metplus-bot <97135045+metplus-bot@users.noreply.github.com>
7 people committed Apr 10, 2024
1 parent f787c00 commit b5c883c
Showing 152 changed files with 4,435 additions and 2,978 deletions.
2 changes: 1 addition & 1 deletion .github/jobs/build_docker_image.sh
@@ -15,7 +15,7 @@ time_command docker build -t ${DOCKERHUB_TAG} \
--build-arg MET_CONFIG_OPTS \
-f $DOCKERFILE_PATH ${GITHUB_WORKSPACE}
if [ $? != 0 ]; then
cat ${GITHUB_WORKSPACE}/docker_build.log
cat ${CMD_LOGFILE}
exit 1
fi

45 changes: 45 additions & 0 deletions .github/jobs/build_sonarqube_image.sh
@@ -0,0 +1,45 @@
#! /bin/bash

source ${GITHUB_WORKSPACE}/.github/jobs/bash_functions.sh

DOCKERHUB_TAG=met-sonarqube-gha

DOCKERFILE_PATH=${GITHUB_WORKSPACE}/internal/scripts/docker/Dockerfile.sonarqube

CMD_LOGFILE=${GITHUB_WORKSPACE}/sonarqube_build.log

#
# Define the $SONAR_REFERENCE_BRANCH as the
# - Target branch of any pull requests
# - Manual setting for workflow dispatch
# - Source branch for any pushes (e.g. develop)
#
if [ "${GITHUB_EVENT_NAME}" == "pull_request" ]; then
export SONAR_REFERENCE_BRANCH=${GITHUB_BASE_REF}
elif [ "${GITHUB_EVENT_NAME}" == "workflow_dispatch" ]; then
export SONAR_REFERENCE_BRANCH=${WD_REFERENCE_BRANCH}
else
export SONAR_REFERENCE_BRANCH=${SOURCE_BRANCH}
fi

echo SONAR_REFERENCE_BRANCH=${SONAR_REFERENCE_BRANCH}

time_command docker build -t ${DOCKERHUB_TAG} \
--build-arg MET_BASE_REPO \
--build-arg MET_BASE_TAG \
--build-arg SOURCE_BRANCH \
--build-arg SONAR_SCANNER_VERSION \
--build-arg SONAR_HOST_URL \
--build-arg SONAR_TOKEN \
--build-arg SONAR_REFERENCE_BRANCH \
-f $DOCKERFILE_PATH ${GITHUB_WORKSPACE}
if [ $? != 0 ]; then
cat ${CMD_LOGFILE}
exit 1
fi

# Copy the .scannerwork directory from the image
id=$(docker create ${DOCKERHUB_TAG})
time_command mkdir -p /tmp/scannerwork
time_command docker cp $id:/met/.scannerwork/report-task.txt /tmp/scannerwork/report-task.txt
docker rm -v $id
3 changes: 3 additions & 0 deletions .github/pull_request_template.md
@@ -22,6 +22,9 @@ If **yes**, describe the new output and/or changes to the existing output:</br>
- [ ] Will this PR result in changes to existing METplus Use Cases? **[Yes or No]**</br>
If **yes**, create a new **Update Truth** [METplus issue](https://github.com/dtcenter/METplus/issues/new/choose) to describe them.

- [ ] Do these changes introduce new SonarQube findings? **[Yes or No]**</br>
If **yes**, please describe:

- [ ] Please complete this pull request review by **[Fill in date]**.</br>

## Pull Request Checklist ##
91 changes: 91 additions & 0 deletions .github/workflows/sonarqube.yml
@@ -0,0 +1,91 @@
name: SonarQube Scan

# Run SonarQube for Pull Requests and changes to the develop and main_vX.Y branches

on:

# Trigger analysis for pushes to develop and main_vX.Y branches
push:
branches:
- develop
- 'main_v**'
paths-ignore:
- 'docs/**'
- '.github/pull_request_template.md'
- '.github/ISSUE_TEMPLATE/**'
- '.github/labels/**'
- '**/README.md'
- '**/LICENSE.md'

# Trigger analysis for pull requests to develop and main_vX.Y branches
pull_request:
types: [opened, synchronize, reopened]
branches:
- develop
- 'main_v**'
paths-ignore:
- 'docs/**'
- '.github/pull_request_template.md'
- '.github/ISSUE_TEMPLATE/**'
- '.github/labels/**'
- '**/README.md'
- '**/LICENSE.md'

workflow_dispatch:
inputs:
reference_branch:
description: 'Reference Branch'
default: develop
type: string

jobs:
build:
name: SonarQube Scan
runs-on: ubuntu-latest

steps:

- uses: actions/checkout@v4
with:
# Disable shallow clones for better analysis
fetch-depth: 0

- name: Create output directories
run: mkdir -p ${RUNNER_WORKSPACE}/logs

- name: Get branch name
id: get_branch_name
run: echo branch_name=${GITHUB_REF#refs/heads/} >> $GITHUB_OUTPUT

- name: SonarQube Scan in Docker
run: .github/jobs/build_sonarqube_image.sh
env:
MET_BASE_REPO: met-base
MET_BASE_TAG: v3.2
SOURCE_BRANCH: ${{ steps.get_branch_name.outputs.branch_name }}
WD_REFERENCE_BRANCH: ${{ github.event.inputs.reference_branch }}
SONAR_SCANNER_VERSION: 5.0.1.3006
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

- name: SonarQube Quality Gate check
id: sonarqube-quality-gate-check
uses: sonarsource/sonarqube-quality-gate-action@master
with:
scanMetadataReportFile: /tmp/scannerwork/report-task.txt
timeout-minutes: 5
env:
SONAR_HOST_URL: ${{ secrets.SONAR_HOST_URL }}
SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

- name: Copy log files into logs directory
if: always()
run: cp ${GITHUB_WORKSPACE}/*.log ${RUNNER_WORKSPACE}/logs/

- name: Upload logs as artifact
if: always()
uses: actions/upload-artifact@v4
with:
name: logs_sonarqube
path: ${{ runner.workspace }}/logs
if-no-files-found: ignore
53 changes: 46 additions & 7 deletions docs/Users_Guide/reformat_point.rst
@@ -458,6 +458,8 @@ While initial versions of the ASCII2NC tool only supported a simple 11 column AS

• `International Soil Moisture Network (ISMN) Data format <https://ismn.bafg.de/en/>`_.

• `International Arctic Buoy Programme (IABP) Data format <https://iabp.apl.uw.edu/>`_.

• `AErosol RObotic NEtwork (AERONET) versions 2 and 3 format <http://aeronet.gsfc.nasa.gov/>`_

• Python embedding of point observations, as described in :numref:`pyembed-point-obs-data`. See example below in :numref:`ascii2nc-pyembed`.
@@ -522,6 +524,8 @@ Once the ASCII point observations have been formatted as expected, the ASCII fil
netcdf_file
[-format ASCII_format]
[-config file]
[-valid_beg time]
[-valid_end time]
[-mask_grid string]
[-mask_poly file]
[-mask_sid file|list]
@@ -541,21 +545,25 @@ Required Arguments for ascii2nc
Optional Arguments for ascii2nc
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

3. The **-format ASCII_format** option may be set to "met_point", "little_r", "surfrad", "wwsis", "airnowhourlyaqobs", "airnowhourly", "airnowdaily_v2", "ndbc_standard", "ismn", "aeronet", "aeronetv2", "aeronetv3", or "python". If passing in ISIS data, use the "surfrad" format flag.
3. The **-format ASCII_format** option may be set to "met_point", "little_r", "surfrad", "wwsis", "airnowhourlyaqobs", "airnowhourly", "airnowdaily_v2", "ndbc_standard", "ismn", "iabp", "aeronet", "aeronetv2", "aeronetv3", or "python". If passing in ISIS data, use the "surfrad" format flag.

4. The **-config file** option is the configuration file for generating time summaries.

5. The **-mask_grid** string option is a named grid or a gridded data file to filter the point observations spatially.
5. The **-valid_beg** time option in YYYYMMDD[_HH[MMSS]] format sets the beginning of the retention time window.

6. The **-mask_poly** file option is a polyline masking file to filter the point observations spatially.
6. The **-valid_end** time option in YYYYMMDD[_HH[MMSS]] format sets the end of the retention time window.

7. The **-mask_sid** file|list option is a station ID masking file or a comma-separated list of station ID's to filter the point observations spatially. See the description of the "sid" entry in :numref:`config_options`.
7. The **-mask_grid** string option is a named grid or a gridded data file to filter the point observations spatially.

8. The **-log file** option directs output and errors to the specified log file. All messages will be written to that file as well as standard out and error. Thus, users can save the messages without having to redirect the output on the command line. The default behavior is no log file.
8. The **-mask_poly** file option is a polyline masking file to filter the point observations spatially.

9. The **-v level** option indicates the desired level of verbosity. The value of "level" will override the default setting of 2. Setting the verbosity to 0 will make the tool run with no log messages, while increasing the verbosity above 1 will increase the amount of logging.
9. The **-mask_sid** file|list option is a station ID masking file or a comma-separated list of station ID's to filter the point observations spatially. See the description of the "sid" entry in :numref:`config_options`.

10. The **-compress level** option indicates the desired level of compression (deflate level) for NetCDF variables. The valid level is between 0 and 9. The value of "level" will override the default setting of 0 from the configuration file or the environment variable MET_NC_COMPRESS. Setting the compression level to 0 will make no compression for the NetCDF output. Lower number is for fast compression and higher number is for better compression.
10. The **-log file** option directs output and errors to the specified log file. All messages will be written to that file as well as standard out and error. Thus, users can save the messages without having to redirect the output on the command line. The default behavior is no log file.

11. The **-v level** option indicates the desired level of verbosity. The value of "level" will override the default setting of 2. Setting the verbosity to 0 will make the tool run with no log messages, while increasing the verbosity above 1 will increase the amount of logging.

12. The **-compress level** option indicates the desired level of compression (deflate level) for NetCDF variables. The valid level is between 0 and 9. The value of "level" will override the default setting of 0 from the configuration file or the environment variable MET_NC_COMPRESS. Setting the compression level to 0 will make no compression for the NetCDF output. Lower number is for fast compression and higher number is for better compression.

An example of the ascii2nc calling sequence is shown below:

@@ -1203,3 +1211,34 @@ For how to use the script, issue the command:
.. code-block:: none

   python3 MET_BASE/python/utility/print_pointnc2ascii.py -h

IABP retrieval Python Utilities
====================================

`International Arctic Buoy Programme (IABP) Data <https://iabp.apl.uw.edu/>`_ is one of the data types supported by ascii2nc. A utility script called get_iabp_from_web.py is included that pulls all of this data from the web and stores it locally. The script accesses the appropriate webpage and downloads the ASCII files for all buoys. It is straightforward, but can be time intensive, since the archive of this data is extensive and files are downloaded one at a time.

The script can be found at:

.. code-block:: none

   MET_BASE/python/utility/get_iabp_from_web.py

For how to use the script, issue the command:

.. code-block:: none

   python3 MET_BASE/python/utility/get_iabp_from_web.py -h

Another IABP utility script, find_iabp_in_timerange.py, is included for users and is meant to be run after all files have been downloaded using get_iabp_from_web.py. It examines all the downloaded files and lists those that contain entries falling within a user-specified range of days.

The script can be found at:

.. code-block:: none

   MET_BASE/python/utility/find_iabp_in_timerange.py

For how to use the script, issue the command:

.. code-block:: none

   python3 MET_BASE/python/utility/find_iabp_in_timerange.py -h
94 changes: 94 additions & 0 deletions internal/scripts/docker/Dockerfile.sonarqube
Original file line number Diff line number Diff line change
@@ -0,0 +1,94 @@
ARG MET_BASE_REPO=met-base
ARG MET_BASE_TAG=v3.2

FROM dtcenter/${MET_BASE_REPO}:${MET_BASE_TAG}
MAINTAINER John Halley Gotway <johnhg@ucar.edu>

#
# This Dockerfile checks out MET from GitHub and runs the
# SonarQube static code analysis on the specified branch or tag.
# https://docs.sonarqube.org/latest/analysis/scan/sonarscanner/
#
ARG SONAR_SCANNER_VERSION=5.0.1.3006
ARG SONAR_HOST_URL
ARG SONAR_TOKEN
ARG SOURCE_BRANCH
ARG SONAR_REFERENCE_BRANCH

#
# SONAR_HOST_URL is required.
#
RUN if [ "x${SONAR_HOST_URL}" = "x" ]; then \
echo "ERROR: SONAR_HOST_URL undefined! Rebuild with \"--build-arg SONAR_HOST_URL={url}\""; \
exit 1; \
fi

#
# SONAR_TOKEN is required.
#
RUN if [ "x${SONAR_TOKEN}" = "x" ]; then \
echo "ERROR: SONAR_TOKEN undefined! Rebuild with \"--build-arg SONAR_TOKEN={token}\""; \
exit 1; \
fi

#
# SOURCE_BRANCH is the branch name of the MET source code.
#
RUN if [ "x${SOURCE_BRANCH}" = "x" ]; then \
echo "ERROR: SOURCE_BRANCH undefined! Rebuild with \"--build-arg SOURCE_BRANCH={branch name}\""; \
exit 1; \
else \
echo "Build Argument SOURCE_BRANCH=${SOURCE_BRANCH}"; \
fi

#
# SONAR_REFERENCE_BRANCH defines the version against which this scan should be compared.
#
RUN if [ "x${SONAR_REFERENCE_BRANCH}" = "x" ]; then \
echo "ERROR: SONAR_REFERENCE_BRANCH undefined! Rebuild with \"--build-arg SONAR_REFERENCE_BRANCH={branch name}\""; \
exit 1; \
else \
echo "Build Argument SONAR_REFERENCE_BRANCH=${SONAR_REFERENCE_BRANCH}"; \
fi

ENV MET_GIT_NAME ${SOURCE_BRANCH}
ENV MET_REPO_DIR /met/MET-${MET_GIT_NAME}
ENV MET_GIT_URL https://github.com/dtcenter/MET

#
# Download and install the Sonar software.
#
RUN echo "Installing SonarQube into $HOME/.sonar" \
&& mkdir -p $HOME/.sonar \
&& curl -sSLo $HOME/.sonar/sonar-scanner.zip https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-${SONAR_SCANNER_VERSION}-linux.zip \
&& unzip -o $HOME/.sonar/sonar-scanner.zip -d $HOME/.sonar/ \
&& echo export PATH="$HOME/.sonar/sonar-scanner-${SONAR_SCANNER_VERSION}-linux/bin:\$PATH" >> $HOME/.bashrc \
&& curl -sSLo $HOME/.sonar/build-wrapper-linux-x86.zip ${SONAR_HOST_URL}/static/cpp/build-wrapper-linux-x86.zip \
&& unzip -o $HOME/.sonar/build-wrapper-linux-x86.zip -d $HOME/.sonar/ \
&& echo export PATH="$HOME/.sonar/build-wrapper-linux-x86:\$PATH" >> $HOME/.bashrc

#
# Update the OS, as needed.
#
RUN apt update

#
# Set the working directory.
#
WORKDIR /met

#
# Copy MET into the image and install it.
#
RUN echo "Copying MET into ${MET_REPO_DIR}" \
&& mkdir -p ${MET_REPO_DIR}

COPY . ${MET_REPO_DIR}

RUN if [ ! -e "${MET_REPO_DIR}/configure.ac" ]; then \
echo "ERROR: docker build must be run from the MET directory: `ls`"; \
exit 1; \
fi

RUN cd ${MET_REPO_DIR} \
&& internal/scripts/docker/build_met_sonarqube.sh