Delta Lake 1.2.1
We are excited to announce the release of Delta Lake 1.2.1 on Apache Spark 3.2. Similar to Apache Spark™, we have released Maven artifacts for both Scala 2.12 and Scala 2.13.
- Documentation: https://docs.delta.io/1.2.1/index.html
- Maven artifacts: delta-core_2.12, delta-core_2.13, delta-contribs_2.12, delta-contribs_2.13, delta-storage, delta-storage-s3-dynamodb
- Python artifacts: https://pypi.org/project/delta-spark/1.2.1/
Key features in this release
- Fix an issue with loading error messages in `--packages` mode. The previous release had a bug that resulted in users getting a `NullPointerException` instead of the proper error message when using Delta Lake in `--packages` mode with either `pyspark` or `spark-shell`. (Fix, Test)
- Fix incorrect exception type thrown in some Python APIs. A bug caused `pyspark` to throw the wrong type of exception instead of the expected `AnalysisException`. This issue is fixed. See issue #1086 for more details.
- Fix for S3 multi-cluster mode configuration. A bug in S3 multi-cluster mode caused `--conf` to not work for certain configuration parameters. This issue is fixed by having these configuration parameters begin with `spark`. See the updated documentation.
- Make the GCS LogStore configuration simpler by automatically deriving the `LogStore` implementation class config `spark.delta.logStore.gs.impl` from the scheme in the table path. See the updated documentation.
- Make `SetAccumulator` thread safe. The `SetAccumulator` used by Merge was not thread safe and could cause executor heartbeat failures in rare cases. This was fixed by using a synchronized set.
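The synchronized-set idea behind the last fix can be sketched as follows. This is a hypothetical Python illustration only; Delta's actual `SetAccumulator` is a Scala Spark accumulator, and the class name and methods here are invented for the sketch:

```python
# Hypothetical sketch: guard a shared set with a lock so that concurrent
# add() calls from many threads cannot corrupt it or lose updates
# (the bug class this release fixes in Merge's SetAccumulator).
import threading

class SynchronizedSetAccumulator:
    """Accumulates values into a set; add() is safe across threads."""

    def __init__(self):
        self._values = set()
        self._lock = threading.Lock()

    def add(self, value):
        # All mutation happens under the lock.
        with self._lock:
            self._values.add(value)

    def value(self):
        # Return a snapshot, also under the lock.
        with self._lock:
            return set(self._values)

acc = SynchronizedSetAccumulator()
threads = [
    threading.Thread(
        target=lambda start=i * 100: [acc.add(n) for n in range(start, start + 100)]
    )
    for i in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(acc.value()))  # prints 800: all distinct values survive concurrent adds
```

The fix in Delta takes the same approach, wrapping the accumulator's underlying set in a synchronized collection rather than relying on callers to coordinate.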
Credits
Allison Portis, Chang Yong Lik, Kam Cheung Ting, Rahul Mahadev, Scott Sandre, Venki Korukanti