Releases: dbt-labs/spark-utils
spark-utils v0.3.0
This release supports any minor or patch version of dbt-core v1, which means far less need for compatibility releases in the future.
Features
- Add macros for common maintenance operations (#18)
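For a sense of what a macro like this might look like, here is a minimal sketch of a maintenance macro that compacts a Delta table; the macro name, arguments, and the wrapped `OPTIMIZE` statement are illustrative assumptions, not the package's actual API (see #18 for the real macros).

```sql
{# Minimal sketch only: the macro name and arguments are hypothetical,
   not spark-utils' actual API. It wraps a Delta Lake OPTIMIZE statement
   and would typically be invoked with `dbt run-operation`. #}
{% macro optimize_delta_table(schema_name, table_name) %}
    {% do run_query('optimize ' ~ schema_name ~ '.' ~ table_name) %}
{% endmacro %}
```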
spark-utils v0.2.4
🚨 This is a compatibility (patch) release in preparation for dbt-core v1.0.0 (🎉)
spark-utils v0.2.3
spark-utils v0.2.2
This is a patch release, tracking updates in dbt v0.20.0 and dbt-utils v0.7.0. Nothing in this release represents a breaking change.
Under-the-hood changes from #15:
- More explicit type casting in `dateadd` and `datediff` to get along with Spark 3
- Add `spark__concat`, to account for changes in dbt-utils v0.7.0 (backwards compatible with older versions, too); see the sketch after this list
- Use the latest `adapter.dispatch` syntax in integration tests
- Replace references to `fishtown-analytics` with `dbt-labs`
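To illustrate the pattern these changes follow, here is a minimal sketch of a `dbt_utils`-style dispatcher using the newer `adapter.dispatch` syntax, together with a simplified Spark shim it can resolve to; both bodies are assumptions for illustration, not the exact package source.

```sql
{# Simplified dispatcher: the newer adapter.dispatch syntax (dbt >= 0.20)
   takes the macro name and the namespace as two string arguments. #}
{% macro concat(fields) %}
    {{ return(adapter.dispatch('concat', 'dbt_utils')(fields)) }}
{% endmacro %}

{# Simplified Spark shim: resolved when the project's dispatch search order
   puts spark_utils ahead of dbt_utils. #}
{% macro spark__concat(fields) %}
    concat({{ fields | join(', ') }})
{% endmacro %}
```

Routing is controlled by the `dispatch` config in `dbt_project.yml`, which lists `spark_utils` ahead of `dbt_utils` in the search order.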
spark-utils 0.2.1
Fix `spark_utils.assert_not_null` (used by `spark__dateadd` and `spark__datediff`) to raise an error only if the date is implicitly null (a string representing a non-real date), rather than explicitly `null`.
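As an illustration of the behavior change (the model and column names below are hypothetical), a shimmed `dbt_utils.dateadd` call now errors only when the input is a string that cannot be parsed as a real date, while an explicitly `null` input passes through as `null`:

```sql
{# Illustrative only: `orders`, `order_id`, and `order_date` are hypothetical names. #}
select
    order_id,
    {{ dbt_utils.dateadd('day', 7, 'order_date') }} as expected_delivery_date
from {{ ref('orders') }}
```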
Thanks @foundinblank for the fix!
spark-utils 0.2.0
More Apache Spark "shims" for `dbt_utils`.
In particular, there is now full compatibility for all user-facing `dbt_utils` macros. Notably, this release adds `get_relations_by_pattern` + `get_relations_by_prefix`. Note that the Spark implementations do not support schema name patterns, only relation name patterns with supported wildcards.
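For example (the schema name, relation pattern, and wildcard syntax below are illustrative), on Spark you pass a literal schema name and put any wildcards only in the relation-name pattern:

```sql
{# Illustrative only: 'analytics' and the 'snowplow_%' pattern are hypothetical;
   use whichever wildcard syntax the Spark implementation supports. The schema
   argument must be a literal name, not a pattern. #}
{% set relations = dbt_utils.get_relations_by_pattern('analytics', 'snowplow_%') %}

{{ dbt_utils.union_relations(relations=relations) }}
```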
I don't believe there are any breaking changes relative to 0.1.0, but I'm bumping the minor version because:
- I significantly reworked `spark__split_part` to support special characters
- This package is still in early development + testing
spark-utils 0.1.0
Initial release to provide Spark "shims" for other popular dbt packages. In particular, this release:
- implements a subset of `dbt_utils` macros
- enables the `snowplow` sessionization package to run on Delta Lake (Databricks)
In the future, this package may also include some "nice-to-have" macros of particular interest to dbt-spark users.