[Docs] Minor typo fixes
Author: Nicholas Chammas <nicholas.chammas@gmail.com>

Closes #3772 from nchammas/patch-1 and squashes the following commits:

b7d9083 [Nicholas Chammas] [Docs] Minor typo fixes
nchammas authored and pwendell committed Dec 23, 2014
1 parent a96b727 commit 0e532cc
Showing 1 changed file with 3 additions and 3 deletions: docs/submitting-applications.md
@@ -10,7 +10,7 @@ through a uniform interface so you don't have to configure your application spec
 # Bundling Your Application's Dependencies
 If your code depends on other projects, you will need to package them alongside
 your application in order to distribute the code to a Spark cluster. To do this,
-to create an assembly jar (or "uber" jar) containing your code and its dependencies. Both
+create an assembly jar (or "uber" jar) containing your code and its dependencies. Both
 [sbt](https://github.com/sbt/sbt-assembly) and
 [Maven](http://maven.apache.org/plugins/maven-shade-plugin/)
 have assembly plugins. When creating assembly jars, list Spark and Hadoop
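
The hunk above describes bundling an application into an assembly jar. As a minimal sketch of that workflow, assuming an sbt project with the sbt-assembly plugin already declared in project/plugins.sbt (the class name, master URL, and paths below are illustrative):

```bash
# Build a single "uber" jar containing the application and its dependencies.
# Spark and Hadoop should be listed as "provided" dependencies in the build
# so they are not bundled; the cluster supplies them at runtime.
sbt assembly

# Submit the resulting assembly jar to a cluster (names are hypothetical):
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://master-host:7077 \
  target/scala-2.10/my-app-assembly-1.0.jar
```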
@@ -59,7 +59,7 @@ for applications that involve the REPL (e.g. Spark shell).
 Alternatively, if your application is submitted from a machine far from the worker machines (e.g.
 locally on your laptop), it is common to use `cluster` mode to minimize network latency between
 the drivers and the executors. Note that `cluster` mode is currently not supported for standalone
-clusters, Mesos clusters, or python applications.
+clusters, Mesos clusters, or Python applications.
 
 For Python applications, simply pass a `.py` file in the place of `<application-jar>` instead of a JAR,
 and add Python `.zip`, `.egg` or `.py` files to the search path with `--py-files`.
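
The Python submission described in this hunk looks roughly like the following sketch (file names are illustrative):

```bash
# Submit a Python application: the main .py file takes the place of
# <application-jar>, and extra modules are shipped to the search path
# on executors via --py-files.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --py-files deps.zip,helpers.py \
  my_app.py
```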
@@ -174,7 +174,7 @@ This can use up a significant amount of space over time and will need to be clea
 is handled automatically, and with Spark standalone, automatic cleanup can be configured with the
 `spark.worker.cleanup.appDataTtl` property.
 
-For python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries
+For Python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries
 to executors.
 
 # More Information
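
As a footnote to the last hunk: the `spark.worker.cleanup.appDataTtl` property it mentions is set on standalone workers. A minimal sketch, assuming `conf/spark-env.sh` on each worker (the TTL value is illustrative and is expressed in seconds):

```bash
# Enable periodic cleanup of finished applications' work directories on a
# standalone worker, keeping application data for one day before deletion.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.appDataTtl=86400"
```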