
CLI options to follow/monitor deployments to completion #6818

Closed
schmichael opened this issue Dec 6, 2019 · 7 comments · Fixed by #10661

Comments

@schmichael (Member) commented Dec 6, 2019

Nomad version

Nomad 0.10.2

Issue

nomad deployment status lacks the ability to block until a deployment is complete, similar to log following/tailing or drain monitoring.

nomad job run monitors by default, but it does not wait until the deployment is complete.

Proposal

Add a -monitor option to nomad deployment status that blocks until the deployment completes. nomad drain -monitor has proven inadequate for fully understanding the drain process, so it might be better to simply print the deployment's status output every time an update occurs.

A stretch goal would be detecting a TTY and updating the output in place rather than printing new output on each update, although that would also require a flag to disable it (or proper detection of piping).

Another stretch goal would be adding a -monitor flag to nomad job run that effectively switches from the current job run output to the new deployment status -monitor output once the normal job run output completes.

Prior art

@krishicks (Contributor) commented Dec 6, 2019

> simply print the deployment's status output every time an update occurs.

I agree with this, though a CLI user might want updates even when nothing has changed, just to show that the process isn't hanging. I implemented this status checking without going the extra step of printing something when the status doesn't change, and it doesn't feel right.

I still think we should limit the API calls, but hide that fact behind some constant progress updates. (I believe this is also what kubectl rollout status does.)

This is my implementation: https://github.com/hashicorp/cloud-sre/blob/master/cli/lib/remote/command_watch.sh

@krishicks (Contributor)

I've added a --watch flag to Cloud's deploy command, which includes output that mimics Nomad's own:

==> Monitoring evaluation "e7bef548"
    Evaluation triggered by job "cadence-worker"
    Evaluation within deployment: "502d4768"
    Evaluation status changed: "pending" -> "complete"
==> Evaluation "e7bef548" finished with status "complete"
==> Monitoring deployment "502d4768"
    Deployment updated: status is "running"
    Deployment updated: status is "running"
    Deployment updated: status is "running"
    Deployment updated: status is "running"
    Deployment updated: status is "running"
    Deployment updated: status is "running"
    Deployment updated: status is "successful"

@tgross tgross modified the milestones: near-term, unscheduled Jan 9, 2020
@OneCricketeer

Is this exposed via the API at all?

@tgross (Member) commented Mar 27, 2020

Hi @Cricket007! The existing nomad node drain -monitor works by polling the v1/node/:node_id endpoint (see api/nodes.go#L216 for the implementation). Any deployment monitoring would likely work the same way.

@tgross tgross added this to Needs Roadmapping in Nomad - Community Issues Triage Feb 12, 2021
@tgross tgross removed this from the unscheduled milestone Feb 12, 2021
@tgross tgross removed this from Needs Roadmapping in Nomad - Community Issues Triage Mar 4, 2021
@tcurdt commented Apr 11, 2021

Is there any chance to also see the allocations?

Either way, I would love to see this get released. Are there any plans for a release yet?

@tgross (Member) commented May 5, 2021

Hi @tcurdt! Unfortunately we don't yet have a public roadmap for Nomad, so I can't promise you when this will land. But it's definitely something we want to do.

@github-actions (bot)

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Oct 18, 2022