Releasing 2.11.0
EnricoMi committed Jan 4, 2024
1 parent e63a488 commit b65bbeb
Showing 5 changed files with 18 additions and 18 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md

```diff
@@ -3,7 +3,7 @@ All notable changes to this project will be documented in this file.
 
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
 
-## [UNRELEASED] - YYYY-MM-DD
+## [2.11.0] - 2024-01-04
 
 ### Added
```
20 changes: 10 additions & 10 deletions README.md

````diff
@@ -198,7 +198,7 @@ The package version has the following semantics: `spark-extension_{SCALA_COMPAT_
 Add this line to your `build.sbt` file:
 
 ```sbt
-libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.10.0-3.5"
+libraryDependencies += "uk.co.gresearch.spark" %% "spark-extension" % "2.11.0-3.5"
 ```
 
 ### Maven
@@ -209,7 +209,7 @@ Add this dependency to your `pom.xml` file:
 <dependency>
   <groupId>uk.co.gresearch.spark</groupId>
   <artifactId>spark-extension_2.12</artifactId>
-  <version>2.10.0-3.5</version>
+  <version>2.11.0-3.5</version>
 </dependency>
 ```
 
@@ -219,7 +219,7 @@ Add this dependency to your `build.gradle` file:
 
 ```groovy
 dependencies {
-    implementation "uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5"
+    implementation "uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5"
 }
 ```
 
@@ -228,7 +228,7 @@ dependencies {
 Submit your Spark app with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5 [jar]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5 [jar]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depending on your Spark version.
@@ -238,7 +238,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depe
 Launch a Spark Shell with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5
+spark-shell --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depending on your Spark Shell version.
@@ -254,7 +254,7 @@ from pyspark.sql import SparkSession
 
 spark = SparkSession \
     .builder \
-    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5") \
+    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5") \
     .getOrCreate()
 ```
 
@@ -265,7 +265,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depe
 Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depending on your PySpark version.
@@ -275,7 +275,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depe
 Run your Python scripts that use PySpark via `spark-submit`:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5 [script.py]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.5) depending on your Spark version.
@@ -289,7 +289,7 @@ Running your Python application on a Spark cluster will still require one of the
 to add the Scala package to the Spark environment.
 
 ```shell script
-pip install pyspark-extension==2.10.0.3.5
+pip install pyspark-extension==2.11.0.3.5
 ```
 
 Note: Pick the right Spark version (here 3.5) depending on your PySpark version.
@@ -299,7 +299,7 @@ Note: Pick the right Spark version (here 3.5) depending on your PySpark version.
 There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
 add **a jar dependency** to your notebook using these **Maven coordinates**:
 
-    uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.5
+    uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5
 
 Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
 on a filesystem where it is accessible by the notebook, and reference that jar file directly.
````
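Every coordinate bumped in this file follows the same `{group}:{artifact}_{scala_compat}:{version}-{spark_compat}` pattern, and the `pip` package encodes the same information with a dot instead of the dash (`2.11.0.3.5`). A small hypothetical helper (not part of the repository) illustrating that mapping:

```python
def parse_coordinate(coord: str) -> dict:
    """Split a Maven coordinate like the ones bumped in this commit
    into group, artifact, Scala compat, library version, and Spark compat."""
    group, artifact, version = coord.split(':')
    # 'spark-extension_2.12' -> ('spark-extension', '2.12')
    name, scala_compat = artifact.rsplit('_', 1)
    # '2.11.0-3.5' -> ('2.11.0', '3.5')
    lib_version, spark_compat = version.rsplit('-', 1)
    return {
        'group': group,
        'artifact': name,
        'scala_compat': scala_compat,
        'version': lib_version,
        'spark_compat': spark_compat,
        # the PyPI release joins version and Spark compat with a dot
        'pip_version': f'{lib_version}.{spark_compat}',
    }

info = parse_coordinate('uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.5')
print(info['pip_version'])  # → 2.11.0.3.5
```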
2 changes: 1 addition & 1 deletion pom.xml

```diff
@@ -2,7 +2,7 @@
   <modelVersion>4.0.0</modelVersion>
   <groupId>uk.co.gresearch.spark</groupId>
   <artifactId>spark-extension_2.13</artifactId>
-  <version>2.11.0-3.5-SNAPSHOT</version>
+  <version>2.11.0-3.5</version>
   <name>Spark Extension</name>
   <description>A library that provides useful extensions to Apache Spark.</description>
   <inceptionYear>2020</inceptionYear>
```
10 changes: 5 additions & 5 deletions python/README.md

````diff
@@ -72,7 +72,7 @@ Running your Python application on a Spark cluster will still require one of the
 to add the Scala package to the Spark environment.
 
 ```shell script
-pip install pyspark-extension==2.10.0.3.4
+pip install pyspark-extension==2.11.0.3.4
 ```
 
 Note: Pick the right Spark version (here 3.4) depending on your PySpark version.
@@ -86,7 +86,7 @@ from pyspark.sql import SparkSession
 
 spark = SparkSession \
     .builder \
-    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.4") \
+    .config("spark.jars.packages", "uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.4") \
     .getOrCreate()
 ```
 
@@ -97,7 +97,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
 Launch the Python Spark REPL with the Spark Extension dependency (version ≥1.1.0) as follows:
 
 ```shell script
-pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.4
+pyspark --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.4
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your PySpark version.
@@ -107,7 +107,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
 Run your Python scripts that use PySpark via `spark-submit`:
 
 ```shell script
-spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.4 [script.py]
+spark-submit --packages uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.4 [script.py]
 ```
 
 Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depending on your Spark version.
@@ -117,7 +117,7 @@ Note: Pick the right Scala version (here 2.12) and Spark version (here 3.4) depe
 There are plenty of [Data Science notebooks](https://datasciencenotebook.org/) around. To use this library,
 add **a jar dependency** to your notebook using these **Maven coordinates**:
 
-    uk.co.gresearch.spark:spark-extension_2.12:2.10.0-3.4
+    uk.co.gresearch.spark:spark-extension_2.12:2.11.0-3.4
 
 Or [download the jar](https://mvnrepository.com/artifact/uk.co.gresearch.spark/spark-extension) and place it
 on a filesystem where it is accessible by the notebook, and reference that jar file directly.
````
2 changes: 1 addition & 1 deletion python/setup.py

```diff
@@ -17,7 +17,7 @@
 from pathlib import Path
 from setuptools import setup
 
-jar_version = '2.11.0-3.5-SNAPSHOT'
+jar_version = '2.11.0-3.5'
 scala_version = '2.13.8'
 scala_compat_version = '.'.join(scala_version.split('.')[:2])
 spark_compat_version = jar_version.split('-')[1]
```
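The version arithmetic in `python/setup.py` is plain string splitting. Re-running those two expressions standalone with the values from this release commit (a sketch, not part of the build itself) shows what they resolve to:

```python
# Re-derive the compat versions exactly as python/setup.py does,
# using the values from this release commit.
jar_version = '2.11.0-3.5'   # jar version: {VERSION}-{SPARK_COMPAT_VERSION}
scala_version = '2.13.8'

# keep only major.minor of the Scala version
scala_compat_version = '.'.join(scala_version.split('.')[:2])

# everything after the '-' is the Spark compat version
spark_compat_version = jar_version.split('-')[1]

print(scala_compat_version)  # → 2.13
print(spark_compat_version)  # → 3.5
```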
