🌱 Add test for fetching bundles #951
Conversation
✅ Deploy Preview for olmv1 ready!
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@ Coverage Diff @@
##             main     #951   +/-   ##
=======================================
  Coverage   80.14%   80.14%
=======================================
  Files          16       16
  Lines        1103     1103
=======================================
  Hits          884      884
  Misses        152      152
  Partials       67       67
```
Cover what happens when we fail to fetch bundles from the catalog. Signed-off-by: Mikalai Radchuk <mradchuk@redhat.com>
Force-pushed from 76d282b to cd4084b
Adding more test coverage is good, but it seems pretty bad that other e2e tests are non-deterministic in this way. Did you dive into this enough to understand why the flakiness is happening?
@joelanford Here is what I think happens:
I think the error happens because in tests we create catalogs and do not wait for them to become ready before running assertions. We could look into adding waits after creating them.
+100000 to not artificially wait. Expecting eventual consistency in our e2e tests is the correct approach. This makes sense. Thanks for the explanation.
/lgtm
Description
Cover what happens when we fail to fetch bundles from the catalog.
This should fix flaky test coverage. It looks like in e2e runs we sometimes hit this condition and sometimes don't, which results in coverage fluctuation.
Reviewer Checklist