
Improve Codecov config for unexpected coverage changes #621

Merged · 2 commits · Apr 21, 2023
10 changes: 10 additions & 0 deletions codecov.yaml
@@ -0,0 +1,10 @@
coverage:
  status:
    patch:
      default:
        target: 100%
        threshold: 0.1%
@vhargrave (Contributor) commented on Apr 11, 2023:
Are we adding this threshold because the coverage might be irregular? IMO we shouldn't be affected by coverage irregularities, and allowing a threshold of 0.1% is not the way to fix it.
We're always uploading our coverage reports, and our tests should not be time-sensitive. If we are having irregularities, I'd like to investigate what the root cause of them is first before allowing for thresholds.

@hparra (Contributor, Author) replied:

> IMO we shouldn't be affected by coverage irregularities

We are. Look at the report for the PR herein. I added a single non-code file and look how it changed.

[Screenshot: Codecov report, 2023-04-11 10:46]

> allowing a threshold of 0.1% is not the way to fix it.

It is. I made the change with a 100% patch target, which prevents the threshold from decreasing total coverage over time.

> We're always uploading our coverage reports

We are not. Example: https://app.codecov.io/gh/adobecom/milo/commit/8fa7b0aa06758cf0c65d7e43a078de7ee57bd347

[Screenshot: Codecov commit page, 2023-04-11 11:05]

> If we are having irregularities I'd like to investigate what the root cause of them is first before allowing for thresholds.

I already have. That is the point of this PR, after reading Unexpected Coverage Changes and conducting my own investigation, with my findings outlined here.

@vhargrave (Contributor) replied:

@hparra I've taken a deeper dive, and it seems to me that this difference in code coverage is being caused by some of our tests not being idempotent. In my opinion we need to investigate which tests do not always give the same result and fix them, instead of allowing for a threshold of 0.1%. If your test coverage had gone down by 0.13%, by the way, instead of going up, then Codecov would have failed. This threshold therefore seems like a bandage solution to a bigger problem that we have.

On the Codecov link that you attached, they describe how to investigate this issue further, but do not describe setting thresholds as an acceptable solution. Reading this other GitHub issue of a team having similar problems, they also had to investigate the root cause of their tests giving different results.

I'm still against setting these arbitrary thresholds.
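The non-idempotence failure mode described in this comment can be sketched as follows. This is a hypothetical illustration, not actual milo code: `greeting` is an invented function whose covered lines depend on when the suite runs, plus the deterministic fix of injecting a fixed clock.

```javascript
// Hypothetical example of a non-idempotent test: a time-dependent branch
// means different lines are covered depending on when CI runs the suite.
function greeting(now = new Date()) {
  if (now.getHours() < 12) {
    return 'good morning';   // covered only by morning CI runs
  }
  return 'good afternoon';   // covered only by afternoon CI runs
}

// Non-deterministic check: which branch gets covered varies run to run,
// so reported coverage jitters between commits.
console.log(greeting());

// Deterministic fix: inject fixed timestamps so both branches are
// exercised on every run and coverage is stable.
console.log(greeting(new Date(2023, 3, 11, 9)));   // morning branch
console.log(greeting(new Date(2023, 3, 11, 15)));  // afternoon branch
```

With the injected timestamps, both branches are hit on every run, so the coverage report no longer depends on the wall clock.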

    project:
      default:
        target: auto
        threshold: 0.1%
@vhargrave (Contributor) commented:

I'd remove the threshold here too.
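For reference, the two hunks in this PR assemble into the following complete codecov.yaml. The inline comments are my gloss on Codecov's status-check semantics, not part of the committed file:

```yaml
coverage:
  status:
    patch:
      default:
        target: 100%      # every line touched by a PR must be covered
        threshold: 0.1%   # tolerate up to 0.1% of measurement noise
    project:
      default:
        target: auto      # compare against the base commit's total coverage
        threshold: 0.1%   # allow total coverage to drop by at most 0.1%
```

The patch status gates new code at full coverage, while the project status compares total coverage to the base commit; the 0.1% thresholds are what the review discussion above disputes.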