Workflows in existing clusters #797

Answered by sidharthbolar
gdonoso94 asked this question in Q&A
Yes, there is. Above the `spark_python_task`, add the following:

existing_cluster_id: ACTUAL_CLUSTER_ID
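A minimal sketch of where that line could sit in a deployment file (the workflow name and file path here are hypothetical; only the placement of `existing_cluster_id` above `spark_python_task` comes from the answer itself):

```yaml
environments:
  default:
    workflows:
      - name: "example-workflow"  # hypothetical workflow name
        # Reuse an already-running cluster instead of defining a new one;
        # replace the placeholder with your real cluster ID.
        existing_cluster_id: "ACTUAL_CLUSTER_ID"
        spark_python_task:
          python_file: "file://path/to/entrypoint.py"  # hypothetical path
```

With `existing_cluster_id` set, the `new_cluster` block that would otherwise define cluster specs is omitted.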

Answer selected by gdonoso94