
Batch nomad variables #16077

Closed
issue-account opened this issue Feb 7, 2023 · 6 comments
Labels
stage/accepted Confirmed, and intend to work on. No timeline commitment though. theme/security theme/variables Variables feature type/bug

Comments

issue-account commented Feb 7, 2023

Nomad version

1.4.3

Operating system and Environment details

Issue

The ACL policy is not working for a batch job. The job name is created after startup.

ERROR:
[screenshot: error output]

Job name:
[screenshot: job name]

jrasell (Member) commented Feb 7, 2023

Hi @Hanmask21 and thanks for raising this issue. I have not been able to reproduce this locally, therefore would you be able to provide more information, such as the job spec and CLI outputs when listing Nomad variables?

I tested using current main and the 1.4.3 release binary.

The variable output shows the variable written to the correct location for access via workload identity (WI):

$ nomad var get nomad/jobs/example
Namespace   = default
Path        = nomad/jobs/example
Create Time = 2023-02-07T11:45:53Z
Check Index = 12

Items
foo = bar

Job spec being used.

job "example" {
  datacenters = ["dc1"]
  type        = "batch"

  periodic {
    cron             = "*/1 * * * * *"
    prohibit_overlap = true
  }

  group "test" {

    task "busybox" {
      driver = "docker"

      config {
        image   = "busybox"
        command = "sleep"
        args    = ["3000"]
      }
      template {
        data = <<EOH
{{ with nomadVar "nomad/jobs/example" }}{{ .foo }}{{ end }}
EOH
        destination = "local/var.txt"
      }
    }
  }
}

The status of the successfully running periodic job instance.

$ nomad status example/periodic-1675770540
ID            = example/periodic-1675770540
Name          = example/periodic-1675770540
Submit Date   = 2023-02-07T11:49:00Z
Type          = batch
Priority      = 50
Datacenters   = dc1
Namespace     = default
Status        = running
Periodic      = false
Parameterized = false

Summary
Task Group  Queued  Starting  Running  Failed  Complete  Lost  Unknown
test        0       0         1        0       0         0     0

Allocations
ID        Node ID   Task Group  Version  Desired  Status   Created  Modified
8b3b3f80  92ef6b78  test        0        run      running  14s ago  12s ago

The file containing the variable has been successfully read and rendered:

$ nomad alloc exec 8b3b3f80 cat local/var.txt
bar

@jrasell jrasell self-assigned this Feb 7, 2023
issue-account (Author) commented Feb 7, 2023

@jrasell
Created the variable nomad/jobs/test1:

$ nomad var get nomad/jobs/test1
Namespace   = default
Path        = nomad/jobs/test1
Create Time = 2023-02-07T14:58:09+03:00
Check Index = 57194

Items
foo = bar

Created this policy:

namespace "default" {
  variables {
    path "nomad/jobs/test1" {
      capabilities = ["read"]
    }
  }
}

Then tried reading the variable in job "example":

job "example" {
  datacenters = ["dc1"]
  type        = "batch"

  periodic {
    cron             = "*/1 * * * * *"
    prohibit_overlap = true
  }

  group "test" {

    task "busybox" {
      driver = "docker"

      config {
        image   = "busybox"
        command = "sleep"
        args    = ["3000"]
      }
      template {
        data = <<EOH
{{ with nomadVar "nomad/jobs/test1" }}{{ .foo }}{{ end }}
EOH
        destination = "local/var.txt"
      }
    }
  }
}
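For context: the thread does not show how the policy above was attached to the job's workload identity. A plausible way to do it (an assumption on the editor's part, not shown by the reporter) is Nomad's job-scoped ACL policy mechanism; the policy name and file name below are placeholders:

```shell
# Hypothetical example (not from the thread): attach the policy to the
# "example" job so its workload identity inherits it. "read-test1" and
# "policy.hcl" are placeholder names.
nomad acl policy apply -namespace default -job example read-test1 policy.hcl
```

With a job-scoped policy in place, tasks in the job should be able to read nomad/jobs/test1 via their workload identity token; on the affected 1.4.x versions, child instances of a periodic job still hit this bug.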

@jrasell jrasell added theme/security stage/accepted Confirmed, and intend to work on. No timeline commitment though. theme/variables Variables feature and removed stage/waiting-reply labels Feb 7, 2023
jrasell (Member) commented Feb 7, 2023

Hi @Hanmask21, and thanks for providing the additional details. I have been able to reproduce this locally; it appears to be an issue with non-implicit policies assigned to parent jobs that can spawn child jobs.
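Until a fix lands, one possible workaround (an editor's assumption, not confirmed anywhere in this thread) is to widen the policy's variable path with a wildcard, since Nomad's variable ACL path patterns support `*`. Whether this matches the child job's identity claims on the affected versions is untested:

```hcl
# Hypothetical workaround policy: a wildcard path so reads are not tied to a
# single exact job path. Untested against the affected 1.4.x releases.
namespace "default" {
  variables {
    path "nomad/jobs/*" {
      capabilities = ["read"]
    }
  }
}
```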

@jrasell jrasell removed their assignment Feb 8, 2023
mr-karan (Contributor) commented

Hey, are there any updates on this? I'm also unable to access Nomad variables in batch jobs when the path is not nomad/jobs/<jobname>. So effectively no path for a shared secret works with batch jobs.

icyleaf commented May 26, 2023

I'm also having this problem; is there any progress? Since I enabled ACLs, many of my data backup jobs no longer work.

tgross (Member) commented May 30, 2023

This issue should be closed by #17018, which will ship in the next version of Nomad (with backports).

@tgross tgross closed this as completed May 30, 2023