[Epic] Applied State (part 2) #9425
@graciegoheen / @MichelleArk sorry for the direct question on this, but I'm struggling to understand from the docs / issue / PRs. Has support for batch metadata collection been added for BigQuery specifically? I saw the general work in dbt-adapters and the Snowflake- and Redshift-specific work, but I don't think BigQuery has it? Thank you
@adamcunnington-mlg I believe so, yes:
@jtcohen6 I think that is just the initial support and not the batch route, which is the critical bit. Please advise, many thanks
Hey @adamcunnington-mlg -- I've summarized some spiking done to evaluate the cost/benefit of implementing a batch route for metadata freshness in BigQuery here: dbt-labs/dbt-bigquery#938. There are more details in the spike report, but my overall conclusion is that there isn't currently a way to implement a batch strategy that achieves performance improvements for metadata-based source freshness, given limitations of BigQuery's Python SDK.
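For context, the batch route being debated can be sketched as issuing one SQL query per dataset against BigQuery's legacy `__TABLES__` metadata view, instead of one per-table metadata call through the Python SDK. This is a minimal illustration of the idea discussed in the linked spike and FR, not dbt's actual implementation; the function name and identifiers are assumptions:

```python
def build_batch_freshness_sql(project: str, dataset: str) -> str:
    """Build a single query that returns last-modified timestamps for
    every table in a dataset via BigQuery's legacy ``__TABLES__``
    metadata view. ``last_modified_time`` there is reported in
    milliseconds since the Unix epoch, hence timestamp_millis().
    (Hypothetical helper; identifiers are illustrative.)"""
    return (
        "select table_id, "
        "timestamp_millis(last_modified_time) as last_modified "
        f"from `{project}.{dataset}.__TABLES__`"
    )

# One round trip covers the whole dataset. Per the spike linked above,
# the SDK's list_tables() results don't carry the modified timestamp,
# so each table would otherwise need its own get_table() call.
sql = build_batch_freshness_sql("my-project", "my_dataset")
```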
@MichelleArk I believe the conclusion here is mistaken. I've left some details against your more comprehensive response: dbt-labs/dbt-bigquery#938 (comment). These details are also in the original FR: #7012 (comment)
This epic comprises the remaining work on the Applied State initiative to support better visibility into the current state of the database in a more performant way.
dbt-core 1.8
- loaded_at_field set to null at table level to overwrite default value set at the source level #9320

next
- loaded_at_field #9979