SPARK-2641: Passing num executors to spark arguments from properties file

Since we can already set the Spark executor memory and executor cores through a properties file, we should also be able to set the number of executor instances there.
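For illustration, the relevant settings in a properties file (e.g. `conf/spark-defaults.conf`, or a file passed via `--properties-file`; the values below are made-up examples) might look like:

```
spark.executor.memory     4g
spark.executor.cores      2
spark.executor.instances  10
```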

Author: Kanwaljit Singh <kanwaljit.singh@guavus.com>

Closes #1657 from kjsingh/branch-1.0 and squashes the following commits:

d8a5a12 [Kanwaljit Singh] SPARK-2641: Fixing how spark arguments are loaded from properties file for num executors
Kanwaljit Singh authored and Andrew Or committed Dec 20, 2014
1 parent e0fc0c5 commit 44719e6
Showing 1 changed file with 2 additions and 0 deletions.
@@ -105,6 +105,8 @@ private[spark] class SparkSubmitArguments(args: Seq[String]) {
       .getOrElse(defaultProperties.get("spark.cores.max").orNull)
     name = Option(name).getOrElse(defaultProperties.get("spark.app.name").orNull)
     jars = Option(jars).getOrElse(defaultProperties.get("spark.jars").orNull)
+    numExecutors = Option(numExecutors)
+      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

     // This supports env vars in older versions of Spark
     master = Option(master).getOrElse(System.getenv("MASTER"))
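The added lines follow the same fallback pattern as the surrounding fields: a value given on the command line wins; otherwise the value from the properties file is used; otherwise the field stays `null`. A minimal, self-contained sketch of that pattern (hypothetical object and variable names, not the actual `SparkSubmitArguments` class):

```scala
// Sketch of the command-line-then-properties fallback used in the diff above.
object FallbackSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical parsed properties file, standing in for defaultProperties.
    val defaultProperties = Map("spark.executor.instances" -> "4")

    // null means the flag was not given on the command line.
    var numExecutors: String = null

    // Option(x) is None when x is null, so getOrElse falls through to the
    // properties-file lookup; orNull turns a missing key back into null,
    // matching the null-sentinel convention of the fields in the class.
    numExecutors = Option(numExecutors)
      .getOrElse(defaultProperties.get("spark.executor.instances").orNull)

    println(numExecutors) // prints "4"
  }
}
```

If `numExecutors` had already been set (e.g. via `--num-executors`), `Option(numExecutors)` would be `Some(...)` and the properties-file value would be ignored.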
