Make schema field in source-snowflake mean a subset of the specified o… #20465
Conversation
…f schema when during discover(). update UI
Affected Connector Report

| Connector | Version | Changelog | Publish |
|---|---|---|---|
| source-alloydb | 1.0.34 | ✅ | ✅ |
| source-alloydb-strict-encrypt | 1.0.34 | 🔵 (ignored) | 🔵 (ignored) |
| source-bigquery | 0.2.3 | ✅ | ✅ |
| source-clickhouse | 0.1.14 | ✅ | ✅ |
| source-clickhouse-strict-encrypt | 0.1.14 | 🔵 (ignored) | 🔵 (ignored) |
| source-cockroachdb | 0.1.18 | ✅ | ✅ |
| source-cockroachdb-strict-encrypt | 0.1.18 | 🔵 (ignored) | 🔵 (ignored) |
| source-db2 | 0.1.16 | ✅ | ✅ |
| source-db2-strict-encrypt | 0.1.16 | 🔵 (ignored) | 🔵 (ignored) |
| source-dynamodb | 0.1.0 | ✅ | ✅ |
| source-e2e-test | 2.1.3 | ✅ | ✅ |
| source-e2e-test-cloud | 2.1.1 | 🔵 (ignored) | 🔵 (ignored) |
| source-elasticsearch | 0.1.1 | ✅ | ✅ |
| source-jdbc | 0.3.5 | 🔵 (ignored) | 🔵 (ignored) |
| source-kafka | 0.2.3 | ✅ | ✅ |
| source-mongodb-strict-encrypt | 0.1.19 | 🔵 (ignored) | 🔵 (ignored) |
| source-mongodb-v2 | 0.1.19 | ✅ | ✅ |
| source-mssql | 0.4.26 | ✅ | ✅ |
| source-mssql-strict-encrypt | 0.4.26 | 🔵 (ignored) | 🔵 (ignored) |
| source-mysql | 1.0.18 | ✅ | ✅ |
| source-mysql-strict-encrypt | 1.0.18 | 🔵 (ignored) | 🔵 (ignored) |
| source-oracle | 0.3.21 | ✅ | ✅ |
| source-oracle-strict-encrypt | 0.3.21 | 🔵 (ignored) | 🔵 (ignored) |
| source-postgres | 1.0.35 | ✅ | ✅ |
| source-postgres-strict-encrypt | 1.0.35 | 🔵 (ignored) | 🔵 (ignored) |
| source-redshift | 0.3.15 | ✅ | ✅ |
| source-scaffold-java-jdbc | 0.1.0 | 🔵 (ignored) | 🔵 (ignored) |
| source-sftp | 0.1.2 | ✅ | ✅ |
| source-snowflake | 0.1.28 | ✅ | ❌ (diff seed version) |
| source-tidb | 0.2.1 | ✅ | ✅ |
- See "Actionable Items" below for how to resolve warnings and errors.
✅ Destinations (0)

| Connector | Version | Changelog | Publish |
|---|---|---|---|

- See "Actionable Items" below for how to resolve warnings and errors.

✅ Other Modules (0)
Actionable Items
| Category | Status | Actionable Item |
|---|---|---|
| Version | ❌ mismatch | The version of the connector is different from its normal variant. Please bump the version of the connector. |
| | ⚠ doc not found | The connector does not seem to have a documentation file. This can be normal (e.g. a basic connector like source-jdbc is not published or documented). Please double-check to make sure that it is not a bug. |
| Changelog | ⚠ doc not found | The connector does not seem to have a documentation file. This can be normal (e.g. a basic connector like source-jdbc is not published or documented). Please double-check to make sure that it is not a bug. |
| | ❌ changelog missing | There is no changelog for the current version of the connector. If you are the author of the current version, please add a changelog. |
| Publish | ⚠ not in seed | The connector is not in the seed file (e.g. source_definitions.yaml), so its publication status cannot be checked. This can be normal (e.g. some connectors are cloud-specific and only listed in the cloud seed file). Please double-check to make sure that it is not a bug. |
| | ❌ diff seed version | The connector exists in the seed file, but its latest version is not listed there. This usually means that the latest version is not published. Please use the /publish command to publish the latest version. |
/test connector=connectors/source-snowflake
Build Failed. Test summary info:
.collect(Collectors.toMap(AirbyteStream::getName, s -> s));
.collect(Collectors.toMap(
    s ->
        "%s.%s".formatted(s.getNamespace(), s.getName()),
This change makes the test more robust in case previous runs left junk schemas behind, or when multiple instances of the acceptance test are running at the same time.
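For illustration, a minimal, self-contained sketch of the keying change discussed above; the class and method names are hypothetical, and only the Collectors.toMap key change mirrors the actual diff.

```java
// Sketch: index discovered streams by "<namespace>.<name>" instead of name only,
// so equally named tables in leftover or parallel test schemas do not clash.
import io.airbyte.protocol.models.AirbyteStream;

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamIndexSketch {

  // Old behavior: keyed by stream name only, so a table with the same name in a
  // leftover schema from a previous run (or a parallel run) clashes in the map.
  static Map<String, AirbyteStream> byName(final List<AirbyteStream> streams) {
    return streams.stream()
        .collect(Collectors.toMap(AirbyteStream::getName, s -> s));
  }

  // New behavior: the namespace (schema) is part of the key, so equally named
  // tables in different schemas map to distinct entries.
  static Map<String, AirbyteStream> byNamespaceAndName(final List<AirbyteStream> streams) {
    return streams.stream()
        .collect(Collectors.toMap(
            s -> "%s.%s".formatted(s.getNamespace(), s.getName()),
            s -> s));
  }
}
```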
/test connector=connectors/source-snowflake
Build Failed. Test summary info:
/test connector=connectors/source-snowflake
/test connector=connectors/source-snowflake
Build Failed. Test summary info:
/test connector=connectors/source-snowflake
Build Passed. Test summary info:
@@ -12691,7 +12691,7 @@
"$schema": "http://json-schema.org/draft-07/schema#",
@grishick, @evantahler Do we need to make changes to connector_catalog as well?
…-schema-not-used' into 20018-snowflake-source-connector-schema-not-used
Going over Metabase, I'm seeing the following active cloud source-snowflake connections:
/test connector=connectors/source-snowflake
Build Failed. Test summary info:
/test connector=connectors/source-snowflake
Build Passed. Test summary info:
/publish connector=connectors/source-snowflake
If you have connectors that successfully published but failed definition generation, follow step 4 here.
…o… (#20465)
* Make schema field in source-postgres mean a subset of the specified of schema when during discover(). update UI
* Add missing file
* Fix failing acceptance test
* sanity
* update doc
* typo
* version bump and release note
* Fix failing test
* fix format
Trying the snowflake 1.28 source version, am I missing anything?
What
The schema field in source-snowflake is not very useful today: regardless of what the user inputs in schema, all tables from all schemas are discovered and included in the catalog.
This PR changes its behavior to be more in line with other source connectors.
How
If schema is specified, catalog discovery will be limited to only the tables in that schema. If the field is left empty, all tables from all schemas will be included in the catalog, as in today's discovery.
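A minimal sketch of the intended behavior is below; the TableInfo record and filterBySchema method are illustrative names only, not the actual SnowflakeDataSourceUtils implementation.

```java
// Sketch: scope discovery to the optional "schema" config field when it is set,
// otherwise keep today's discover-everything behavior.
import com.fasterxml.jackson.databind.JsonNode;

import java.util.List;
import java.util.stream.Collectors;

public class SchemaScopedDiscoverySketch {

  // Stand-in for the metadata the connector collects for each table.
  record TableInfo(String schemaName, String tableName) {}

  static List<TableInfo> filterBySchema(final JsonNode config, final List<TableInfo> allTables) {
    final String schema = config.hasNonNull("schema") ? config.get("schema").asText() : "";
    if (schema.isBlank()) {
      // Field left open: include every schema, matching today's discovery.
      return allTables;
    }
    // Field specified: only tables belonging to that schema end up in the catalog.
    return allTables.stream()
        .filter(t -> t.schemaName().equalsIgnoreCase(schema))
        .collect(Collectors.toList());
  }
}
```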
Recommended reading order:
1. spec.json
2. SnowflakeDataSourceUtils.java
🚨 User Impact 🚨
This is going to have an impact: source-snowflake connectors today have a mandatory schema field, but replication might include tables from outside the specified schema.
These customers need to be instructed to leave the schema field empty upon upgrade (the field is mandatory today).
Existing connections that only sync tables from the specified schema should not have a problem.
>>>>>>>> We need to be careful in publishing this in order not to break existing connections.
Do not merge until we have a go-ahead from TCS!!
cc: @erica-airbyte