From 032ecc6dd10f19168f9a904665435fe910e1c7a0 Mon Sep 17 00:00:00 2001
From: Enrico Minack
Date: Tue, 28 Mar 2023 14:21:36 +0200
Subject: [PATCH] Minor rework of Python structure in READMEs

---
 README.md        | 26 +++++++++++++-------------
 python/README.md |  2 +-
 2 files changed, 14 insertions(+), 14 deletions(-)

diff --git a/README.md b/README.md
index 3b2b69ef..5acd9171 100644
--- a/README.md
+++ b/README.md
@@ -123,19 +123,7 @@ spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3 [sc
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.3) depending on your Spark version.
 
-### Your favorite Data Science notebook
-
-There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
-add **a jar dependency** to your notebook using these **Maven coordinates**:
-
-    uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
-
-Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
-on a filesystem where it is accessible by the notebook, and reference that jar file directly.
-
-Check the documentation of your favorite notebook to learn how to add jars to your Spark environment.
-
-### PyPi package (local Spark cluster only)
+#### PyPi package (local Spark cluster only)
 
 You may want to install the `pyspark-extension` python package from PyPi into your development environment.
 This provides you code completion, typing and test capabilities during your development phase.
@@ -149,6 +137,18 @@ pip install pyspark-extension==2.5.0.3.3
 
 Note: Pick the right Spark version (here 3.3) depending on your PySpark version.
 
+### Your favorite Data Science notebook
+
+There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
+add **a jar dependency** to your notebook using these **Maven coordinates**:
+
+    uk.co.gresearch.spark:spark-extension_2.12:2.5.0-3.3
+
+Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
+on a filesystem where it is accessible by the notebook, and reference that jar file directly.
+
+Check the documentation of your favorite notebook to learn how to add jars to your Spark environment.
+
 ## Build
 
 You can build this project against different versions of Spark and Scala.
diff --git a/python/README.md b/python/README.md
index 9ad2cd65..3b9162a9 100644
--- a/python/README.md
+++ b/python/README.md
@@ -13,7 +13,7 @@ For details, see the [README.md](https://github.com/G-Research/spark-extension#s
 
 ## Using Spark Extension
 
-### PyPi package (local Spark cluster only)
+#### PyPi package (local Spark cluster only)
 
 You may want to install the `pyspark-extension` python package from PyPi into your development environment.
 This provides you code completion, typing and test capabilities during your development phase.