nomad status <LONG_JOBID> hits go-memdb's limitation that UUIDs have 36 characters #3134

Closed
nak3 opened this issue Aug 30, 2017 · 2 comments · Fixed by #3138

Comments

nak3 (Contributor) commented on Aug 30, 2017

Nomad version

Output from nomad version

$ nomad version
Nomad v0.6.3-dev

commit: aa9bb33

Operating system and Environment details

  • OS: Linux (Fedora 26)

Issue

Reproduction steps

1. Create a job whose ID (${LONG_JOBID}) is longer than 36 characters.

For example:

$ nomad status
ID                                                                      Type     Priority  Status   Submit Date
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa  service  50        pending  08/30/17 19:14:47 JST

2. Run nomad status ${LONG_JOBID}; it fails with the following error (a minimal go-memdb reproduction sketch follows the output):

$ nomad status aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa 
Error querying search with id: "Unexpected response code: 500 (alloc lookup failed: index error: Invalid UUID length. UUID have 36 characters; got 70)"
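
For reference, here is a minimal go-memdb sketch (not Nomad source) that reproduces the same indexer behaviour; it assumes the allocation table is backed by a memdb.UUIDFieldIndex, which is what the error message suggests.

package main

import (
	"fmt"
	"strings"

	memdb "github.com/hashicorp/go-memdb"
)

func main() {
	// Hypothetical single-table schema standing in for Nomad's alloc table.
	schema := &memdb.DBSchema{
		Tables: map[string]*memdb.TableSchema{
			"allocs": {
				Name: "allocs",
				Indexes: map[string]*memdb.IndexSchema{
					"id": {
						Name:    "id",
						Unique:  true,
						Indexer: &memdb.UUIDFieldIndex{Field: "ID"},
					},
				},
			},
		},
	}

	db, err := memdb.NewMemDB(schema)
	if err != nil {
		panic(err)
	}

	// The prefix is rejected before any data is consulted: the UUID indexer
	// refuses prefixes longer than the 36 characters of a canonical UUID.
	txn := db.Txn(false)
	_, err = txn.Get("allocs", "id_prefix", strings.Repeat("a", 70))
	fmt.Println(err) // UUID-length error, e.g. "... UUID have 36 characters; got 70" (wording may vary by version)
}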

Additional info

  • This issue appeared after nomad status started using the /v1/search endpoint. Previously nomad status used a GET against the /v1/jobs prefix listing (client.query), which did not have this problem, so nomad job status <LONG_JOB_ID> can be used as a workaround (a sketch of that path follows the Spark example below).

  • A job ID longer than 36 characters may sound unusual, but Spark jobs often get such long IDs assigned automatically:

$ spark-submit   --class org.apache.spark.examples.JavaSparkPi   --master nomad   --deploy-mode cluster   --conf spark.executor.instances=4   --conf spark.nomad.sparkDistribution=https://s3.amazonaws.com/nomad-spark/spark-2.1.0-bin-nomad.tgz   https://s3.amazonaws.com/nomad-spark/spark-examples_2.11-2.1.0-SNAPSHOT.jar 100

$ nomad status
ID                                                              Type     Priority  Status   Submit Date
org.apache.spark.examples.JavaSparkPi-2017-08-30T07:51:04.567Z  batch    40        pending  08/30/17 16:51:05 JST

$ nomad status org.apache.spark.examples.JavaSparkPi-2017-08-30T07:51:04.567Z
Error querying search with id: "Unexpected response code: 500 (alloc lookup failed: index error: Invalid UUID length. UUID have 36 characters; got 58)"
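
The workaround path can also be exercised directly against the HTTP API: the jobs list endpoint accepts a prefix query parameter of any length and, per the report above, does not hit this error. A hedged sketch, assuming a local agent on the default 127.0.0.1:4646 address:

package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	// Long Spark-style job ID taken from the example above.
	prefix := "org.apache.spark.examples.JavaSparkPi-2017-08-30T07:51:04.567Z"

	// GET /v1/jobs?prefix=... lists jobs whose ID starts with the given prefix.
	resp, err := http.Get("http://127.0.0.1:4646/v1/jobs?prefix=" + url.QueryEscape(prefix))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body)) // JSON list of matching job stubs
}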
dadgar (Contributor) commented on Aug 30, 2017

Hey, in the meantime please use nomad job status. This will be fixed in 0.6.3. Thanks for the report!

dadgar added a commit that referenced this issue Aug 30, 2017
This PR fixes an issue in which the Search endpoint would error if the
prefix was longer than the allowed 36 characters of a UUID.

Fixes #3134
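
The PR text above describes the constraint but not the mechanism, so the following is only a sketch of one way to avoid the error, not the actual #3138 change: clamp the user-supplied prefix to the 36-character UUID string length before handing it to the UUID-indexed lookup. The helper name is hypothetical.

package main

import "fmt"

// uuidStringLen is the length of a canonical UUID string:
// 8-4-4-4-12 hex groups joined by four hyphens.
const uuidStringLen = 36

// truncateUUIDPrefix is a hypothetical helper that clamps an over-long
// prefix so a UUID index prefix lookup no longer rejects it outright.
func truncateUUIDPrefix(prefix string) string {
	if len(prefix) > uuidStringLen {
		return prefix[:uuidStringLen]
	}
	return prefix
}

func main() {
	long := "org.apache.spark.examples.JavaSparkPi-2017-08-30T07:51:04.567Z"
	fmt.Println(truncateUUIDPrefix(long)) // first 36 characters only
}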
github-actions bot commented on Dec 8, 2022

I'm going to lock this issue because it has been closed for 120 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions bot locked as resolved and limited conversation to collaborators on Dec 8, 2022