Community Note

- Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
- Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request.
- If you are interested in working on this issue or have submitted a pull request, please leave a comment.
What is the outcome that you are trying to reach?
PySpark is used in many AI data-cleaning scenarios, and in most of them the --archives option or the spark.archives configuration is used to manage Python package dependencies. Therefore, we should consider adding an archives field to the deps field of the SparkApplication CRD to support this frequently used Spark parameter.
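For illustration, a minimal sketch of what the proposed field could look like in a SparkApplication manifest. The archives field name and its placement under spec.deps (mirroring the existing jars/files/pyFiles entries) are assumptions about the proposal, not the current API:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pyspark-cleaning-job
spec:
  type: Python
  mode: cluster
  mainApplicationFile: local:///opt/spark/app/clean.py
  deps:
    # jars/files/pyFiles already map to their spark-submit
    # counterparts; archives is the proposed addition.
    archives:
      - s3a://my-bucket/pyspark_env.tar.gz#environment
```

The operator would then translate this entry into `--archives s3a://my-bucket/pyspark_env.tar.gz#environment` on the spark-submit command line, so the packed environment is unpacked under the `environment` directory in each executor's working directory.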
Describe the solution you would like
- Update the CRD definition by adding an archives field under deps.
- Add the --archives argument when the operator executes the spark-submit command.
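A rough sketch of the operator-side change in Go. The Dependencies struct and the argument-building pattern below are simplified stand-ins for the operator's actual types, shown only to illustrate how the proposed Archives field would map onto the spark-submit flag:

```go
package main

import (
	"fmt"
	"strings"
)

// Dependencies is a simplified stand-in for the deps section of the
// SparkApplication spec; Archives is the proposed new field.
type Dependencies struct {
	Jars     []string
	Files    []string
	PyFiles  []string
	Archives []string // proposed: maps to spark-submit --archives
}

// archivesArgs returns the extra spark-submit arguments for
// deps.Archives, following the same comma-joined pattern Spark
// expects for --jars and --files.
func archivesArgs(deps Dependencies) []string {
	if len(deps.Archives) == 0 {
		return nil
	}
	return []string{"--archives", strings.Join(deps.Archives, ",")}
}

func main() {
	deps := Dependencies{
		Archives: []string{
			"s3a://my-bucket/pyspark_env.tar.gz#environment",
			"local:///opt/data/models.zip",
		},
	}
	// Appended to the spark-submit command line built by the operator.
	fmt.Println(strings.Join(archivesArgs(deps), " "))
}
```

When Archives is empty, no flag is emitted, matching how the existing optional deps fields behave.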
Describe alternatives you have considered
Additional context
If needed, I can submit a PR
kaka-zb changed the title from "[FEATURE] Brief description of the feature" to "[FEATURE] Support --archives option in SparkApplication CRD and spark-submit" on Oct 14, 2024.