
[QST] Number of Tasks per Executor question #5334

Answered by jlowe
eyalhir74 asked this question in General
The Spark scheduler is hanging because it cannot allocate the resources the executor requires. The configs ask for each executor to have a GPU and each task to have 1/4 of a GPU, but no config was specified telling Spark how to locate any GPUs (i.e., a GPU resource discovery script). See the Apache Spark documentation on custom resource scheduling for details.
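As a rough sketch of what that looks like (the script path is an assumption; Spark ships a similar example at examples/src/main/scripts/getGpusResources.sh), a discovery script just prints the available GPU addresses as a JSON ResourceInformation object, and submit-time configs point Spark at it:

```shell
#!/bin/bash
# Illustrative GPU discovery script. Spark executes it and expects a JSON
# ResourceInformation object on stdout, e.g. {"name": "gpu", "addresses": ["0", "1"]}.
ADDRS=$(nvidia-smi --query-gpu=index --format=csv,noheader \
        | sed 's/.*/"&"/' | paste -sd, -)
echo "{\"name\": \"gpu\", \"addresses\": [${ADDRS}]}"

# Then reference the script alongside the resource amounts at submit time:
#   --conf spark.executor.resource.gpu.amount=1 \
#   --conf spark.task.resource.gpu.amount=0.25 \
#   --conf spark.executor.resource.gpu.discoveryScript=/path/to/getGpusResources.sh
```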

If you only have one GPU then, since you're running in local mode, I would recommend removing the spark.executor.resource.gpu.amount and spark.task.resource.gpu.amount configs. If you have more than one GPU, you'll want to specify a GPU resource discovery script so the Spark scheduler can locate the requested GPU for th…
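For the single-GPU local-mode case, a minimal launch might look like the following sketch (the jar path and version are placeholders; spark.plugins=com.nvidia.spark.SQLPlugin is the RAPIDS Accelerator plugin class):

```shell
# Single GPU, local mode: drop the executor/task GPU resource configs
# entirely and let the plugin use the default (only) GPU.
# Jar path and version below are placeholders, not a real artifact location.
spark-shell --master 'local[*]' \
  --jars /path/to/rapids-4-spark_2.12-<version>.jar \
  --conf spark.plugins=com.nvidia.spark.SQLPlugin
```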

Replies: 5 comments

Answer selected by sameerz
Labels: question (Further information is requested)
3 participants
Converted from issue: this discussion was converted from issue #4873 on April 27, 2022 15:40.