Tests/reorganise tests #375 (#393)
Conversation
Tests have been organised into those that work immediately with no ICAT stack and those that don't.
Codecov Report: Base: 96.90% // Head: 96.89% // Decreases project coverage by -0.02%

```
@@           Coverage Diff            @@
##             main     #393    +/-  ##
=======================================
- Coverage   96.90%   96.89%   -0.02%
=======================================
  Files          39       39
  Lines        3331     3319      -12
  Branches      313      310       -3
=======================================
- Hits         3228     3216      -12
  Misses         76       76
  Partials       27       27
```

☔ View full report at Codecov.
@Reillyhewitson Thank you for doing this, it's nice to see the unit and integration tests separated. I just saw this while browsing through the repository and thought I would comment on what I had in mind when I was thinking about splitting the tests a while back.

The main reason I wanted to split them was that I only wanted to run the unit tests locally to check whether any of my changes break anything, without having to wait for the integration tests, which also require an ICAT stack with test data.

With the unit and integration tests being separate, I also thought about running them as separate jobs in the CI. Is there a command that can be used to run only the unit tests, and another that will run only the integration tests? If so, it would be nice if we could have two different jobs in the CI as opposed to one. That way we would not have to set up an ICAT instance and populate it with dummy data if the unit test job fails.
Force-pushed from 9c873ba to 4ebeb77
Yes, that would be possible. With pytest you can point it at a directory and it will run all the tests in that directory and in any directories below it.
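As a sketch of how this could work (the directory names and test file below are illustrative assumptions, not the repo's actual layout):

```shell
# Hypothetical layout: unit tests under demo/test/unit,
# integration tests under demo/test/integration.
mkdir -p demo/test/unit
cat > demo/test/unit/test_example.py <<'EOF'
def test_example():
    assert 1 + 1 == 2
EOF

# Pointing pytest at a directory collects every test in it and below it:
python -m pytest demo/test/unit -q

# An integration-only run would simply target the other directory:
# python -m pytest demo/test/integration -q
```

This is what makes two separate CI jobs straightforward: each job just targets a different directory.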
I don't think we need to set up a new workflow. We can just rename the existing …
All the changes you've made look good. Would you be able to separate the tests as you and Viktor have discussed, please? Just put them into two separate jobs (in the existing workflow file) but have them run in parallel, like all the workflows currently do. I'm not sure I can see a reason why the integration tests need to wait until the unit tests have passed (happy to be persuaded otherwise, though).
In my opinion, the job that runs the integration tests should only start if the job that runs the unit tests passes, so not in parallel. This is because if any of the unit tests fail, the chances of some integration tests failing are high (if not 100%). If we run them in parallel, we will just end up waiting (+/- 15 minutes) for the integration test job to run while knowing that it will likely fail.
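The dependency being described maps onto GitHub Actions' `needs` keyword. A minimal sketch, assuming hypothetical job and Nox session names (not the repo's actual configuration):

```yaml
jobs:
  unit-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: nox -s unit_tests          # session name assumed for illustration
  integration-tests:
    needs: unit-tests                   # only starts if unit-tests succeeds
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: nox -s integration_tests   # session name assumed for illustration
```

With `needs`, the integration job is skipped whenever the unit job fails, avoiding the ~15 minute wait on a run that would likely fail anyway.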
Looks good, the split is great. I've just realised some of the README needs updating to reflect the changes you've made (i.e. the names of the Nox sessions have changed). Could you update Automated Testing & Other Development Helpers (Nox) and Running Tests please?
I was just looking at the GitHub Action logs to see how long the unit tests take to run and wondered why they take ~1m45s. After looking at it more closely, I realised it is because the …
@VKTB good spot, I didn't realise that. @Reillyhewitson please could you remove these steps that Viktor has linked? They are in the …
As per comment
Force-pushed from 5a28f97 to f9eb154

Force-pushed from 8318485 to 5f11ba2
@MRichards99 Nox runs in its own virtual environment, which means a main `poetry install` is still needed for the dummy data. I've moved the `poetry install` to just before the generator is run.
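A sketch of the step ordering being described, with step names and the generator command assumed for illustration (not taken from the repo's actual workflow file):

```yaml
steps:
  # poetry install happens outside Nox's own virtual environments,
  # so it sits immediately before the dummy data generator runs
  - name: Install dependencies
    run: poetry install
  - name: Generate dummy data
    run: poetry run python -m icat_db_generator   # module name assumed
```

Placing the install here means the unit test job, which runs only Nox sessions, never pays for it.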
This PR will close #375
Description
This PR reorganises the tests into integration tests that require an ICAT stack to be running and unit tests that don't.
Testing Instructions
Add a set of instructions describing how the reviewer should test the code
- If the icatdb Generator Script Consistency Test CI job fails, is this because of a deliberate change made to the script to change generated data (which isn't actually a problem), or is there an underlying issue with the changes made?
- Is the PR title prefixed with `fix:`, `feat:` or `BREAKING CHANGE:` so a release is automatically made via GitHub Actions upon merge?

Agile Board Tracking
Connect to #375