Some task stanzas do not allow for interpolated HCL2 variables #10375

Closed
picatz opened this issue Apr 13, 2021 · 3 comments
picatz (Contributor) commented Apr 13, 2021

Nomad version

$ nomad version
Nomad v1.0.3 (08741d9f2003ec26e44c72a2c0e27cdf0eadb6ee)

Operating system and Environment details

n/a

Issue

I had written a fairly verbose job file to support a CockroachDB deployment connected with Consul. I then wanted to better handle the complexity of maintaining the job file, but ran into an unexpected issue when using a variable in the Docker task driver's config stanza. It seems the variable is interpolated as the literal string "${var.name}" instead of the actual var.name value, causing the job to fail.

Reproduction steps

Use a variable within a config or env stanza for a task and observe that it isn't properly interpolated.
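
For example, a minimal jobspec along these lines (the variable name and values here are just illustrative) should reproduce it:

variable "listen_port" {
  type    = number
  default = 26258
}

job "repro" {
  datacenters = ["dc1"]

  group "repro" {
    task "repro" {
      driver = "docker"

      config {
        image = "cockroachdb/cockroach:latest"
        # expected: --listen-addr=localhost:26258
        # actual:   --listen-addr=localhost:${var.listen_port}
        args = ["start", "--insecure", "--listen-addr=localhost:${var.listen_port}"]
      }

      env {
        # expected: "26258", actual: the literal string "${var.listen_port}"
        LISTEN_PORT = var.listen_port
      }
    }
  }
}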

Expected Result

HCL2 variables would be interpolated to their actual values.

Actual Result

HCL2 variables within the config stanza (or env stanza) for a task don't seem to interpolate correctly. For example, this task definition:

task "cockroach" {
        env {
          VAR = instance.value
        }
        ...
}

produces the following plan diff:
+/- Job: "..."
+/- Task Group: "..." (1 create)
  +/- Task: "cockroach" (forces create/destroy update)
    + Env[VAR]: "${instance.value}"

Job file (if appropriate)

variable "datacenters" {
  type    = list(string)
  default = ["dc1"]
}

variable "instances" {
  type    = number 
  default = 3
}

locals {
  start_port = 26257 
  instances = {for i in range(1, var.instances+1) : format("cockroach-%d", i) => local.start_port + i }
  //
  // { 
  //   "cockroach-1" => 262578,
  //   "cockroach-2" => 262579,
  //   "cockroach-3" => 262580,
  // }
  //
  instances_upstreams = {for i in range(1, var.instances+1) : format("cockroach-%d", i) => { for k, v in local.instances : k => v if k != format("cockroach-%d", i) } }
  //
  // { 
  //   "cockroach-1" => { "cockroach-2" => 262579, "cockroach-3" => 262580 },
  //   "cockroach-2" => { "cockroach-1" => 262578, "cockroach-3" => 262580 },
  //   "cockroach-3" => { "cockroach-1" => 262578, "cockroach-2" => 262579 },
  // }
  //
  instances_args = {for i in range(1, var.instances+1) : format("cockroach-%d", i) => ["start", "--insecure", "--advertise-addr=localhost:${local.start_port + i}", "--listen-addr=localhost:${local.start_port + i}", "--http-addr=0.0.0.0:8080", "--join=localhost:26258,localhost:26259,localhost:26260", "--logtostderr=WARNING"]}
  //
  // {
  //   "cockroach-1" => ["start", "--insecure", "--advertise-addr=localhost:26258", "--listen-addr=localhost:26258", "--http-addr=0.0.0.0:8080", "--join=localhost:26258,localhost:26259,localhost:26260", "--logtostderr=WARNING"],
  //   "cockroach-2" => ["start", "--insecure", "--advertise-addr=localhost:26259", "--listen-addr=localhost:26259", "--http-addr=0.0.0.0:8080", "--join=localhost:26258,localhost:26259,localhost:26260", "--logtostderr=WARNING"],
  //   "cockroach-3" => ["start", "--insecure", "--advertise-addr=localhost:26260", "--listen-addr=localhost:26260", "--http-addr=0.0.0.0:8080", "--join=localhost:26258,localhost:26259,localhost:26260", "--logtostderr=WARNING"],
  // }
  //
}
job "cockroach" {
  datacenters = var.datacenters

  type = "service"

  update {
    max_parallel     = 1
    stagger          = "30s"
    min_healthy_time = "30s"
    healthy_deadline = "3m"
  }

  constraint {
    distinct_hosts = true
  }

  dynamic "group" {
    for_each = local.instances
    iterator = instance
    labels = [instance.key]
    content {
      network {
        mode = "bridge"
        port "metrics" {}
      }
      service {
        name = "cockroach-metrics"
        port = "metrics"
        connect {
          sidecar_service {
            proxy {
              expose {
                path {
                  path = "/_status/vars"
                  protocol = "http"
                  listener_port = "metrics"
                  local_path_port = 8080
                }
              }
            }
          }
        }
      }

      service {
        name = "cockroach"
        port = instance.value
        connect {
          sidecar_service {}
        }
      }

      service {
        name = instance.key
        port = instance.value
        connect {
          sidecar_service {
            proxy {
              dynamic "upstreams" {
                for_each = local.instances_upstreams[instance.key]
                content {
                  destination_name = upstreams.key
                  local_bind_port  = upstreams.value
                }
              }
            }
          }
        }
      }

      ephemeral_disk {
        migrate = true
        sticky  = true
        size    = 5000 # 5GB
      }

      task "cockroach" {
        driver = "docker"
        config {
          image = "cockroachdb/cockroach:latest"
          args = [
            "start",
            "--insecure",
            "--advertise-addr=localhost:${instance.value}",
            "--listen-addr=localhost:${instance.value}",
            "--http-addr=0.0.0.0:8080",
            "--join=localhost:26258,localhost:26259,localhost:26260",
            "--logtostderr=WARNING",
          ]
        }
      }
    }
  }
}

☝️ The args within the config stanza for the task don't interpolate the way I would expect: ${instance.value} is treated as just a literal string. I would expect it to resolve to the port value from the iterated local.instances entry.
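
For concreteness, with start_port = 26257 the group labelled "cockroach-1" gets instance.value = 26258, so I would expect the rendered task config for that group to come out roughly as:

config {
  image = "cockroachdb/cockroach:latest"
  args = [
    "start",
    "--insecure",
    "--advertise-addr=localhost:26258",
    "--listen-addr=localhost:26258",
    "--http-addr=0.0.0.0:8080",
    "--join=localhost:26258,localhost:26259,localhost:26260",
    "--logtostderr=WARNING",
  ]
}

instead of the literal "localhost:${instance.value}" strings that show up in the plan.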

I've tried a bunch of different variations, including more local variables and dynamic blocks, and there doesn't seem to be a way to properly access these variables from within the config stanza. Is this expected behavior?

notnoop (Contributor) commented Apr 13, 2021

@picatz Can you try again with 1.0.4? I believe this got fixed with #9921.

picatz (Contributor, Author) commented Apr 13, 2021

Thank you @notnoop, that totally fixed it! 😁

picatz closed this as completed Apr 13, 2021
github-actions (bot) commented
I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions bot locked as resolved and limited conversation to collaborators Oct 20, 2022