
airbyte-ci: refactor gradle containers, use amazoncorretto #30384

Merged
merged 15 commits into master from postamar/airbyte-ci-uses-amazoncorretto
Sep 13, 2023

Conversation

postamar
Contributor

This PR collects a bunch of airbyte-ci improvements that @alafanechere and I discovered while on a bug hunt. These improvements are centered on the GradleTask.

@postamar
Contributor Author

postamar commented Sep 13, 2023

Testing this change on:

Contributor

@alafanechere left a comment

Just a partial review, as I have to switch contexts. I'll come back to it soon!

airbyte-ci/connectors/pipelines/pipelines/consts.py (outdated, resolved)
airbyte-ci/connectors/pipelines/pipelines/contexts.py (outdated, resolved)
Comment on lines 483 to 484
# Mount /var/lib/docker so docker writes to the host file system, i.e. the dagger engine.
.with_mounted_cache("/var/lib/docker", dagger_client.cache_volume("shared-var-lib-docker"))
Contributor

The thing I noticed in the past with this kind of caching is that when Dagger kills the service, dockerd does not always exit gracefully, which can corrupt the /var/lib/docker contents and prevent the service from restarting cleanly. But it's worth a new try.

Contributor Author

In this case all we want is for this to be a volume, to avoid storage engine issues in dind. We don't actually care about the caching functionality. Perhaps we can add an `rm -rf /var/lib/docker` in the `with_exec` below to nuke the cache?
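The cleanup idea above could be a tiny shell helper run before dockerd starts (a sketch with hypothetical names, not code from this PR):

```shell
# Hypothetical helper: wipe stale dockerd state from the mounted volume
# before starting the daemon, so state corrupted by a non-graceful
# shutdown cannot prevent the next boot.
clean_docker_state() {
  # delete the directory's contents but keep the mount point itself
  rm -rf "${1:?}"/*
}
```

The dind entrypoint would then do something like `clean_docker_state /var/lib/docker && exec dockerd`.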

Contributor

@alafanechere Sep 13, 2023

avoid storage engine issues in dind

These were the cause of the other error/warning logs from dockerd, right?

Contributor Author

Yes. I think I witnessed instances of this corruption. I'm going to try using a different storage engine, fuse-overlayfs. I have no idea what I'm doing.
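For reference, switching dockerd to fuse-overlayfs is standard Docker daemon configuration via /etc/docker/daemon.json (shown here as a sketch; it assumes the fuse-overlayfs binary is installed in the dind image):

```json
{
  "storage-driver": "fuse-overlayfs"
}
```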

Contributor

@alafanechere left a comment

Super cool readability boost. Minor suggestions, remarks, and an improvement question. A lot of these suggestions are basically about changing things I did in the past that you did not introduce :D
And please: bump the airbyte-ci version in pyproject.toml (we should definitely add a git hook for this).

@alafanechere
Contributor

Testing this change on:

source-postgres: https://github.com/airbytehq/airbyte/actions/runs/6174412610/job/16758953944
destination-bigquery: https://github.com/airbytehq/airbyte/actions/runs/6175169779/job/16761436561
destination-snowflake: https://github.com/airbytehq/airbyte/actions/runs/6175187070/job/16761839399

All the tests passed 👍. Just a reminder: a successful workflow execution does not imply test success: you can have a correct pipeline execution with failing tests. I checked the test reports in the workflow logs. Otherwise, a trick is to push a harmless change to a connector file (which you revert afterwards): it will trigger airbyte-ci on the branch and report test statuses.

@postamar
Contributor Author

Thanks for the review!

And please: bump the airbyte-ci version in pyproject.toml (we should definitely add a git hook for this).

I'm sorry, I keep forgetting this 😬
Instead of a git hook, perhaps a required check in GitHub? A non-empty diff in airbyte-ci/ should require a version different from the one on master. It can be done.
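Such a required check could be sketched as a couple of shell helpers (hypothetical; this assumes the version is declared as version = "x.y.z" on its own line in the airbyte-ci pyproject.toml):

```shell
# Hypothetical CI check: when files under airbyte-ci/ have changed relative
# to master, require that the version in pyproject.toml changed too.
extract_version() {
  # pull x.y.z out of a line like: version = "x.y.z"
  grep -m1 '^version' "$1" | sed 's/.*"\([^"]*\)".*/\1/'
}

check_version_bumped() {
  # usage: check_version_bumped <master_pyproject> <branch_pyproject>
  # succeeds only when the two declared versions differ
  [ "$(extract_version "$1")" != "$(extract_version "$2")" ]
}
```

In a workflow, the two files could come from `git show origin/master:<path>` and the checkout, and the check would only fire when `git diff` reports a non-empty diff under airbyte-ci/ (all hypothetical).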

you can have correct pipeline execution with failing tests.

I noticed! I think I'll do just that.

@github-actions
Contributor

destination-snowflake test report (commit 0efd66e578) - ❌

⏲️ Total pipeline duration: 10mn47s

Steps:
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

🔗 View the logs here

☁️ View runs for commit in Dagger Cloud

Please note that tests are only run on PRs that are ready for review. Please set your PR to draft mode to avoid flooding the CI engine and upstream services on subsequent commits.
You can run the same pipeline locally on this branch with the airbyte-ci tool using the following command:

airbyte-ci connectors --name=destination-snowflake test

@github-actions
Contributor

destination-bigquery test report (commit 0efd66e578) - ❌

⏲️ Total pipeline duration: 06mn43s

Steps:
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

You can run the same pipeline locally on this branch with the airbyte-ci tool with the following command

airbyte-ci connectors --name=destination-bigquery test

@github-actions
Contributor

github-actions bot commented Sep 13, 2023

Coverage report for source-postgres

File Coverage [32.18%]
PostgresSourceOperations.java 32.18%
Total Project Coverage 70.6% 🍏

@github-actions
Contributor

source-postgres test report (commit 0efd66e578) - ❌

⏲️ Total pipeline duration: 23mn49s

Steps:
Build connector tar
Build source-postgres docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Acceptance tests
Validate airbyte-integrations/connectors/source-postgres/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

You can run the same pipeline locally on this branch with the airbyte-ci tool with the following command

airbyte-ci connectors --name=source-postgres test

@postamar
Contributor Author

/approve-and-merge reason="not actually a connector change"

@postamar
Contributor Author

Thanks for the review! All is well. Merging.

@octavia-approvington
Contributor

This is really good
simply the best

@octavia-approvington merged commit 25edee4 into master Sep 13, 2023
13 of 18 checks passed
@octavia-approvington deleted the postamar/airbyte-ci-uses-amazoncorretto branch September 13, 2023 21:44
@github-actions
Contributor

destination-snowflake test report (commit c32efc5d33) - ❌

⏲️ Total pipeline duration: 04mn05s

Steps:
Build connector tar
Build destination-snowflake docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-snowflake/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

You can run the same pipeline locally on this branch with the airbyte-ci tool with the following command

airbyte-ci connectors --name=destination-snowflake test

@github-actions
Contributor

destination-bigquery test report (commit c32efc5d33) - ❌

⏲️ Total pipeline duration: 02mn50s

Steps:
Build connector tar
Build destination-bigquery docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Validate airbyte-integrations/connectors/destination-bigquery/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

You can run the same pipeline locally on this branch with the airbyte-ci tool with the following command

airbyte-ci connectors --name=destination-bigquery test

@github-actions
Contributor

source-postgres test report (commit c32efc5d33) - ❌

⏲️ Total pipeline duration: 25mn48s

Steps:
Build connector tar
Build source-postgres docker image for platform linux/x86_64
Java Connector Unit Tests
Java Connector Integration Tests
Acceptance tests
Validate airbyte-integrations/connectors/source-postgres/metadata.yaml
Connector version semver check
Connector version increment check
QA checks

You can run the same pipeline locally on this branch with the airbyte-ci tool with the following command

airbyte-ci connectors --name=source-postgres test

4 participants