[FEATURE] Publish docker artifacts for multiple Spark versions with each release #1982
I would be happy to work on this, if the feature is desired.
Key questions prior to working:
…ons on release Signed-off-by: Peter Jablonski <mcclonski.peter@gmail.com> Signed-off-by: Peter McClonski <mcclonski.peter@gmail.com>
This issue has been automatically marked as stale because it has been open 60 days with no activity. Remove the stale label or comment, or this will be closed in 30 days. Thank you for your contributions.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it has not had recent activity. Please comment "/reopen" to reopen it.
Community Note
What is the outcome that you are trying to reach?
Currently, each release of Spark Operator is associated with exactly one version of Spark, generally the latest minor/patch release. Our organization uses Spark 3.4.2, released in November 2023. We would like to use the most recent fixes and features in Spark Operator without being forced onto a single, specific version of Spark.
Describe the solution you would like
In particular, we would like releases of Spark Operator to include support for multiple versions of Spark. The support window is open to debate, but we would propose supporting the most recent patch version of each of the three most recent minor versions.
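To make the proposal concrete, the release job could loop over a small list of supported Spark versions and publish one operator image per version. The sketch below is a hypothetical illustration only: the version list, the `OPERATOR_VERSION` value, and the `<operator>-<spark>` tag format are assumptions, not the project's actual release scheme (the real build/push commands are left commented out).

```shell
#!/usr/bin/env sh
# Hypothetical release sketch: one spark-operator image per supported Spark version.
# Versions and tag layout below are illustrative assumptions.

OPERATOR_VERSION="v1beta2-1.4.0"      # assumed operator release tag
SPARK_VERSIONS="3.3.4 3.4.3 3.5.1"    # e.g. latest patch of the three most recent minors

for spark in $SPARK_VERSIONS; do
  tag="spark-operator:${OPERATOR_VERSION}-${spark}"
  echo "would build and push ${tag}"
  # docker build --build-arg SPARK_VERSION="${spark}" -t "${tag}" .
  # docker push "${tag}"
done
```

A CI matrix (for example, a GitHub Actions `strategy.matrix` over the Spark versions) would achieve the same fan-out without an explicit loop, and keeps each per-version build isolated and cacheable.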
Describe alternatives you have considered
Alternatives include either forcing a project to align its Spark version with the one bundled in Spark Operator, or forcing the project to build and maintain its own version of the spark-operator Docker image.
Additional context