
aqua:apache/spark doesn't work #3990

Closed
jdx opened this issue Jan 8, 2025 · 1 comment · Fixed by #3995
Comments

@jdx (Owner) commented Jan 8, 2025

```
$ m x aqua:apache/spark -- which pyspark
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.54s
     Running `target/debug/mise x 'aqua:apache/spark' -- which pyspark`
Error:
   0: failed to install aqua:apache/spark@latest
   1: HTTP status client error (404 Not Found) for url (https://dlcdn.apache.org/spark/spark-latest/spark-latest-bin-hadoop3.tgz)
   2: HTTP status client error (404 Not Found) for url (https://dlcdn.apache.org/spark/spark-latest/spark-latest-bin-hadoop3.tgz)
```
@roele (Contributor) commented Jan 8, 2025

It seems `latest` is not resolved, which is not surprising since `mise latest` does not return anything. The current code relies on `AquaPackage.version_source=github_tag` to list versions, or falls back to GitHub releases. The apache/spark registry entry neither sets `version_source` nor has GitHub releases, so the literal string `latest` ends up in the download URL.

The `_list_remote_versions` function should probably fall back to tags if there are no releases.
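A minimal sketch of that fallback, using stubbed data rather than real GitHub API calls. The function names (`list_releases`, `list_tags`, `list_remote_versions`) and the example tag values are illustrative assumptions, not mise's actual internals:

```rust
// Stub: apache/spark publishes git tags but no GitHub releases,
// so its release list comes back empty. Tag values are made up
// for illustration.
fn list_releases(repo: &str) -> Vec<String> {
    match repo {
        "apache/spark" => vec![],
        _ => vec!["1.0.0".to_string()],
    }
}

fn list_tags(repo: &str) -> Vec<String> {
    match repo {
        "apache/spark" => vec!["v9.9.9".to_string(), "v9.9.8".to_string()],
        _ => vec![],
    }
}

// Proposed behavior: prefer releases, but fall back to tags
// when the repo has no releases at all.
fn list_remote_versions(repo: &str) -> Vec<String> {
    let releases = list_releases(repo);
    if !releases.is_empty() {
        return releases;
    }
    list_tags(repo)
}

fn main() {
    // For a repo with no releases, versions now come from tags,
    // so `latest` can resolve to a concrete version.
    println!("{:?}", list_remote_versions("apache/spark"));
}
```

With that fallback in place, `latest` would resolve to the newest tag instead of being substituted literally into the download URL.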
