DATs should verify non-default namespace behavior #17105

Closed
edgao opened this issue Sep 23, 2022 · 1 comment · Fixed by #20775

edgao commented Sep 23, 2022

from https://docs.google.com/document/d/15xLW97ZqPnttLluP80ZCqXxCHHzB-vn8bhBBpUmjGMI/edit#

Apparently it's possible for a destination to write data only into its default namespace and still pass our DATs. We should add a test case verifying that a destination can write to a non-default namespace. I.e., if the catalog looks like:

{
  "streams": [
    {
      "stream": {
        "name": "whatever",
        "namespace": "some_non_default_namespace",
        "json_schema": {...}
      }
    }
  ]
}

Then the DATs should verify that after a sync, a table called some_non_default_namespace.whatever was created.
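
For illustration, here's a minimal, self-contained sketch of the missing check (plain Java with an in-memory stand-in for the warehouse; none of these names are Airbyte's actual DestinationAcceptanceTest API): write through a catalog that names a non-default namespace, then read back from exactly that namespace rather than the destination's default.

// Sketch only: an in-memory stand-in, not Airbyte code. A buggy destination
// ignores the requested namespace and always writes to its default; the
// assertion below catches exactly that failure mode.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NonDefaultNamespaceCheck {

  // Fake warehouse: fully qualified "namespace.stream" -> rows.
  private static final Map<String, List<String>> warehouse = new HashMap<>();

  // A correct destination qualifies the table with the namespace from the
  // catalog; a buggy one would hard-code its default namespace here instead.
  private static void write(String namespace, String stream, String record) {
    warehouse.computeIfAbsent(namespace + "." + stream, k -> new ArrayList<>()).add(record);
  }

  public static void main(String[] args) {
    write("some_non_default_namespace", "whatever", "{\"id\": 1}");

    // The missing DAT assertion: the table must exist under the namespace
    // the catalog asked for, not under the destination's default.
    List<String> rows = warehouse.get("some_non_default_namespace.whatever");
    if (rows == null || rows.isEmpty()) {
      throw new AssertionError("sync did not create some_non_default_namespace.whatever");
    }
    System.out.println("found " + rows.size() + " record(s) in some_non_default_namespace.whatever");
  }
}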

Details

It looks like this test case already exists -

Let's look into why it wasn't failing prior to #17054, and fix that. (A likely avenue of investigation: should this source namespace be randomized per test run?)
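
As a rough sketch of that randomization idea (hypothetical names, not the code that eventually landed in #20775): appending a fresh suffix per run means a table left over from an earlier run can no longer make the assertion pass spuriously.

// Illustrative only: randomize the test namespace per run so stale tables
// from earlier runs cannot satisfy the non-default-namespace assertion.
import java.util.Locale;
import java.util.UUID;

public class NamespaceRandomizer {

  static String randomizedNamespace(String base) {
    // Short random suffix; lowercased to stay within common identifier
    // restrictions across destinations.
    String suffix = UUID.randomUUID().toString().replace("-", "").substring(0, 8);
    return (base + "_" + suffix).toLowerCase(Locale.ROOT);
  }

  public static void main(String[] args) {
    // e.g. some_non_default_namespace_3f9a1c2b
    System.out.println(randomizedNamespace("some_non_default_namespace"));
  }
}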

To run BigQueryGcsDestinationAcceptanceTest locally, you'll need to manually create airbyte-integrations/connectors/destination-bigquery/secrets/credentials.json, using the contents here.

And you can explore our integration test BigQuery instance here.

grishick added the team/destinations Destinations team's backlog label Sep 27, 2022
jbfbell self-assigned this Dec 9, 2022

jbfbell commented Dec 20, 2022

Continued investigating this with @edgao today and discovered that the integration tests for the BigQuery destination do not cover the standard inserts mode, only the staging (GCS) mode. The bug from the original on-call issue only affected standard inserts, so the test passes because we never execute the code that contained the bug (which has since been resolved).

Disregard the comment above. I identified an issue with my Airbyte setup: the Docker containers were not being rebuilt despite code changes. With rebuilt containers this behaves as @edgao suspected: randomizing the dataset ID makes the test fail. This is implemented in #20775.
