[FEATURE] Publish docker artifacts for multiple Spark versions with each release #1982

Closed
peter-mcclonski opened this issue Apr 16, 2024 · 5 comments
Labels
enhancement, lifecycle/stale

Comments

@peter-mcclonski
Contributor

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

What is the outcome that you are trying to reach?

Currently, each release of Spark Operator is associated with exactly one version of Spark, generally the latest minor/patch version. Our organization is using Spark 3.4.2, released in November 2023. We would like to use the most recent fixes and features in Spark Operator without being forced onto a single, specific version of Spark.

Describe the solution you would like

In particular, we would like to see releases of Spark Operator include support for multiple versions of Spark. The support window is likely a point for debate, but we would propose supporting the most recent patch release of each of the three most recent minor versions.

Describe alternatives you have considered

Alternatives include either forcing a project to align its Spark version with the one bundled in the spark-operator image, or forcing it to build and maintain its own version of the spark-operator Docker image, as sketched below.

Additional context

@peter-mcclonski peter-mcclonski added the enhancement label Apr 16, 2024
@peter-mcclonski
Contributor Author

I would be happy to work on this, if the feature is desired.

@peter-mcclonski
Contributor Author

Key questions before starting work:

  • What criteria should be used to determine which Spark version(s) to support with a given release?
  • What criteria should be used to determine the "default" Spark version in the Helm chart? (A values-file sketch follows below.)

peter-mcclonski added a commit to peter-mcclonski/spark-on-k8s-operator that referenced this issue May 10, 2024
…ons on release

Signed-off-by: Peter Jablonski <mcclonski.peter@gmail.com>
Signed-off-by: Peter McClonski <mcclonski.peter@gmail.com>
@github-actions github-actions bot

This issue has been automatically marked as stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 30 days. Thank you for your contributions.

@github-actions github-actions bot

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@github-actions github-actions bot

This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.

@github-actions github-actions bot closed this as not planned Nov 11, 2024