Minor rework of Python structure in READMEs
EnricoMi committed Mar 28, 2023 · 1 parent 85a5406 · commit 032ecc6
Showing 2 changed files with 14 additions and 14 deletions.
26 changes: 13 additions & 13 deletions README.md
@@ -123,19 +123,7 @@ spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3 [sc

Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.

-### Your favorite Data Science notebook
-
-There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
-add **a jar dependency** to your notebook using these **Maven coordinates**:
-
-    uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
-
-Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
-on a filesystem where it is accessible by the notebook, and reference that jar file directly.
-
-Check the documentation of your favorite notebook to learn how to add jars to your Spark environment.
-
-### PyPi package (local Spark cluster only)
+#### PyPi package (local Spark cluster only)

You may want to install the `pyspark-extension` Python package from PyPI into your development environment.
This provides code completion, type hints, and testing support during development.
@@ -149,6 +137,18 @@ pip install pyspark-extension==2.5.0.3.3

Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
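The note above reflects the package's version scheme: the pin appends the target Spark major and minor version to the package version (e.g. `2.5.0.3.3` for Spark 3.3). As an illustrative sketch, the matching pin could be derived from the installed PySpark version — the helper below is hypothetical and not part of the package:

```python
# Hypothetical helper (not part of pyspark-extension): build the matching
# pip pin from a PySpark version string, assuming the documented
# "<package-version>.<spark-major>.<spark-minor>" scheme.
def matching_pin(pyspark_version: str, package_version: str = "2.5.0") -> str:
    major, minor = pyspark_version.split(".")[:2]
    return f"pyspark-extension=={package_version}.{major}.{minor}"

print(matching_pin("3.3.2"))  # -> pyspark-extension==2.5.0.3.3
```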

+### Your favorite Data Science notebook
+
+There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
+add **a jar dependency** to your notebook using these **Maven coordinates**:
+
+    uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
+
+Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
+on a filesystem where it is accessible by the notebook, and reference that jar file directly.
+
+Check the documentation of your favorite notebook to learn how to add jars to your Spark environment.
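
In PySpark-based notebooks, one common way to attach the jar is via the standard `spark.jars.packages` configuration key, which makes Spark resolve the dependency from Maven at session startup. A minimal sketch, assuming a standard PySpark notebook environment (the coordinates are the ones given above):

```python
# Sketch for a PySpark notebook: Spark downloads the jar from Maven at
# session startup when it is listed under spark.jars.packages.
# Coordinates taken from this README.
conf = {
    "spark.jars.packages": "uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3",
}

# In the notebook, apply the configuration while building the session:
#   from pyspark.sql import SparkSession
#   builder = SparkSession.builder
#   for key, value in conf.items():
#       builder = builder.config(key, value)
#   spark = builder.getOrCreate()
```

Notebooks that manage Spark themselves may expose their own mechanism for adding jars instead, which is why the paragraph above defers to the notebook's documentation.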

## Build

You can build this project against different versions of Spark and Scala.
2 changes: 1 addition & 1 deletion python/README.md
@@ -13,7 +13,7 @@ For details, see the [README.md](https://github.com/G-Research/spark-extension#s

## Using Spark Extension

-### PyPi package (local Spark cluster only)
+#### PyPi package (local Spark cluster only)

You may want to install the `pyspark-extension` Python package from PyPI into your development environment.
This provides code completion, type hints, and testing support during development.
