Add bin/json-to-envvars and bin/yaml-to-envvars
Extracts the "Set environment variables" step of the pathogen-repo-ci
workflow¹ into two separate scripts, since we will use the same method to
set environment variables in other reusable workflows.

I decided to create `json-to-envvars` as a separate script that
`yaml-to-envvars` calls because I will be using it to set secrets as
environment variables as well. The `--varnames` option will allow us to
be explicit about which secrets we want to set as environment variables.

¹ https://github.com/nextstrain/.github/blob/cc6f4385a45bd6ed114ab4840416fd90cc46cd1b/.github/workflows/pathogen-repo-ci.yaml#L183-L198
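
A minimal sketch of how a calling workflow might use json-to-envvars to set
secrets as environment variables (the step layout, script path, and secret
names here are illustrative assumptions, not part of this commit):

    - name: Set secrets as environment variables
      env:
        secrets: ${{ toJSON(secrets) }}
      run: |
        echo "$secrets" | ./bin/json-to-envvars --varnames="FOO_TOKEN BAR_TOKEN" >> "$GITHUB_ENV"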
joverlee521 committed May 25, 2023
1 parent c3b0431 commit 936de61
Showing 3 changed files with 97 additions and 0 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -96,3 +96,5 @@ See also GitHub's [documentation on starter workflows](https://docs.github.com/e
Executable scripts that are used in our workflows.

- [write-envdir](bin/write-envdir)
- [json-to-envvars](bin/json-to-envvars)
- [yaml-to-envvars](bin/yaml-to-envvars)
55 changes: 55 additions & 0 deletions bin/json-to-envvars
@@ -0,0 +1,55 @@
#!/bin/bash
# usage: json-to-envvars [--varnames=<varnames>]
#
# Transforms JSON object from stdin like this:
#
# {
# "ENV1": "ABC",
# "ENV2": "DEF"
# }
#
# into text like this:
#
# ENV1<<__EOF__
# ABC
# __EOF__
# ENV2<<__EOF__
# DEF
# __EOF__
#
# which is suitable for appending to the $GITHUB_ENV file in order to set
# the environment variables for subsequent steps.
#
# Only the specified <varnames> will be included in the output.
# If no <varnames> are provided, all key/value pairs will be included.
# <varnames> should be a string of variable names separated by spaces, e.g. "ENV1 ENV2"
#
# See the GitHub docs for more info on this heredoc-like syntax, which is
# used here to avoid quoting issues in arbitrary env var values:
# https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#multiline-strings
#
# Modified from the pathogen-repo-ci workflow:
# https://github.com/nextstrain/.github/blob/cc6f4385a45bd6ed114ab4840416fd90cc46cd1b/.github/workflows/pathogen-repo-ci.yaml#L145-L196
#
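# Example (hypothetical variable and secret names, not taken from this
# commit): a workflow step with a JSON object in $secrets could run
#
#   echo "$secrets" | json-to-envvars --varnames="ENV1" >> "$GITHUB_ENV"
#
# to expose only ENV1 to subsequent steps.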
set -euo pipefail

varnames=""

for arg; do
case "$arg" in
--varnames=*)
varnames="${arg#*=}"
shift;;
*)
break;;
esac
done

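# Convert the space-separated <varnames> string into a JSON array for jq.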
varnames=$(jq --raw-input 'split(" ")' <<< "$varnames")

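# Emit each selected key/value pair in $GITHUB_ENV's heredoc-style multiline
# syntax; when no varnames were given, include every entry.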
jq --raw-output --argjson varnames "$varnames" '
to_entries
| if ($varnames | length) > 0 then map(select(.key|IN($varnames[]))) else . end
| map("\(.key)<<__EOF__\n\(.value)\n__EOF__")
| join("\n")
'
40 changes: 40 additions & 0 deletions bin/yaml-to-envvars
@@ -0,0 +1,40 @@
#!/bin/bash
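# usage: yaml-to-envvars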
#
# Transforms YAML from stdin like this:
#
# FOO: bar
# I_CANT_BELIEVE: "it's not YAML"
# would_you_believe: |
# it's
# not
# yaml
#
# first into the equivalent JSON (with yq) and then into text (with jq)
# like this:
#
# FOO<<__EOF__
# bar
# __EOF__
# I_CANT_BELIEVE<<__EOF__
# it's not YAML
# __EOF__
# would_you_believe<<__EOF__
# it's
# not
# yaml
# __EOF__
#
# which is suitable for appending to the $GITHUB_ENV file in order to set
# the environment variables for subsequent steps.
#
# See the GitHub docs for more info on this heredoc-like syntax, which is
# used here to avoid quoting issues in arbitrary env var values:
# https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#multiline-strings
#
# Modified from the pathogen-repo-ci workflow:
# https://github.com/nextstrain/.github/blob/cc6f4385a45bd6ed114ab4840416fd90cc46cd1b/.github/workflows/pathogen-repo-ci.yaml#L145-L196
#
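# Example (hypothetical variable name, not taken from this commit): a workflow
# step with YAML in $envs could run
#
#   echo "$envs" | yaml-to-envvars >> "$GITHUB_ENV"
#
# to make each top-level key available as an environment variable in
# subsequent steps.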
set -euo pipefail
bin="$(dirname "$0")"

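# Convert the YAML on stdin to JSON with yq, then reuse json-to-envvars to
# produce the $GITHUB_ENV heredoc output.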
yq --output-format json . | "$bin/json-to-envvars"
