
Updating contribution to aid debugging #961

Merged: 2 commits merged into develop on Oct 20, 2020

Conversation

Neeratyoy (Contributor):

What does this PR implement/fix? Explain your changes.

Adds a NOTE to the contribution guidelines to help debug failing examples. Currently, if the first failing example switches the server from live to test (or vice versa) and a subsequent example expects the other server, the following examples fail to build as well.

@Neeratyoy requested a review from mfeurer on October 20, 2020 13:48
CONTRIBUTING.md Outdated
@@ -239,6 +239,9 @@ You may then run a specific module, test case, or unit test respectively:
$ pytest tests/test_datasets/test_dataset.py::OpenMLDatasetTest::test_get_data
```

*NOTE*: In case the example build fails during the online Continuous Integration run, please check whether
the first failing example file changed the server that the examples fetch from.
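To illustrate the failure mode the note warns about, here is a minimal, self-contained sketch (all names are hypothetical; in openml-python the analogous state is the module-level server configuration) of how one crashed example can leak a server switch into every example built after it in the same process:

```python
class Config:
    """Stands in for a library-wide, module-level configuration object."""
    PRODUCTION = "https://www.example.org/api"  # hypothetical URLs
    TEST = "https://test.example.org/api"

config = Config()
config.server = Config.PRODUCTION  # default: examples fetch from production

def failing_example():
    # The example switches to the test server, then crashes before
    # it ever reaches the code that would restore the production server.
    config.server = Config.TEST
    raise RuntimeError("example crashed mid-run")

try:
    failing_example()
except RuntimeError:
    pass  # the docs build reports this example as failed and moves on

# Every subsequent example now inherits the leaked setting: any example
# that implicitly expects the production server will also fail.
assert config.server == Config.TEST
print("server after crash:", config.server)
```

This is why, when several examples fail in a row, the note suggests inspecting the *first* failing example for a server switch rather than debugging each later failure on its own.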
Collaborator:
You mean if the test server is used (or production, depending on the example)? I think it would help to state it explicitly and why this may lead to failure. Why would it only fail on CI, and not locally?

@Neeratyoy (Contributor, Author) — Oct 20, 2020:

Updated it. I'm not sure whether a person contributing to the repo will run make html to test the example docs locally. I thought it more likely that such errors would be seen during the Travis run, hence I mentioned the online failure. Also, I haven't tested myself whether such errors appear in local builds.

Collaborator:

Makes sense. The clarification now helps me understand what kind of problem I am hoping to identify :) Looks good to me!

@PGijsbers (Collaborator) left a review:

See note

@codecov-io

Codecov Report

Merging #961 into develop will decrease coverage by 0.06%.
The diff coverage is n/a.

Impacted file tree graph

@@             Coverage Diff             @@
##           develop     #961      +/-   ##
===========================================
- Coverage    87.67%   87.61%   -0.07%     
===========================================
  Files           37       37              
  Lines         4504     4504              
===========================================
- Hits          3949     3946       -3     
- Misses         555      558       +3     
Impacted Files Coverage Δ
openml/_api_calls.py 87.93% <0.00%> (-2.59%) ⬇️

Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 0def226...5e2c36c.

@mfeurer mfeurer merged commit dde5662 into develop Oct 20, 2020
@mfeurer mfeurer deleted the contrib_edit branch October 20, 2020 16:22