Don't fail empty results if failOnError is False #220

Open
wants to merge 1 commit into main

Conversation

@Wilsontomass commented Dec 19, 2022

This also changes how the check is created, failing the check on failing tests. This resolves #217 and also resolves #161.

If people would actually still like the check itself to pass on failing tests, I would recommend adding that as an additional parameter to the action (failCheckOnFailure?), but I don't trust my own TS skills enough to do that. However, my guess is that #214 was a misunderstanding of what #161 asked for, and not many people actually want what #214 adds.

@flobernd

Is it even a failure if no test results have been found? In my opinion it should be a warning instead.

This behavior has already caused issues for me in the past, because I'm using a reusable workflow for CI. One of the projects did not have any tests (it was just an autogenerated wrapper for something else), and the test report step failed because of this. On the other hand, I definitely want "failOnError" enabled in general for my projects.

My proposal is to change this error to a warning regardless of the "failOnError" value (cc @dorny). I'm not even sure it's worth increasing complexity with a new "failIfNoTestsFound" input, but that might be something to consider if one really needs to fail in this case. I don't see a valid use case for it at the moment, but maybe I'm missing a valid point.
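Just to illustrate, a rough sketch of how such an input could look in a workflow. To be clear, this is purely hypothetical: neither the input name nor the behavior exists in the action today, and the path/reporter values are placeholders.

```yaml
- name: Test report
  uses: dorny/test-reporter@v1
  with:
    name: Tests                      # placeholder check name
    path: 'reports/*.xml'            # placeholder for your result files
    reporter: java-junit             # placeholder reporter
    fail-on-error: true              # keep real errors fatal
    fail-if-no-tests-found: false    # hypothetical input: warn instead of fail
```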

@Wilsontomass
Author

When you say "this behavior", which do you mean?

We have two kinds of failures we can report on here.

  1. Failure of the action
  2. Failure of a test in the test report.

And we have two places we can report them

  1. In the action step
  2. On the check

I believe it's best to keep these clearly separate: 1 with 1 and 2 with 2. For example, if no test results are found, perhaps because the file name was misspelled, I think it is good if the action marks itself as failed. Then, if you're unhappy with that behavior, it's best to use the continue-on-error flag available to any GitHub workflow step.

For case 2-2, that is where you may need an additional flag to control when you want the check to fail.

I actually just tried continue-on-error and it works for my use case, so with "failOnError" set to true I don't need this PR any more. So this PR can be closed, but the question may still be worth thinking about.
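For reference, this is roughly what I'm running now; the name, path, and reporter values are specific to my setup, so treat them as placeholders:

```yaml
- name: Test report
  uses: dorny/test-reporter@v1
  # continue-on-error lets the job keep going even if this step fails,
  # e.g. when no result files were produced, while the step itself is
  # still marked as failed.
  continue-on-error: true
  with:
    name: Tests
    path: 'reports/*.xml'
    reporter: java-junit
    fail-on-error: true
```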

@flobernd

> For example, if no test results are found, perhaps because the file name was misspelled, I think it is good if the action marks itself as failed. Then, if you're unhappy with that behavior, it's best to use the continue-on-error flag available to any GitHub workflow step.

This is the behavior I slightly disagree with 😓 If the file name is misspelled, this might indicate a bug in the CI workflow, but it might just as well be perfectly fine (like in the scenario I described in my previous comment). Therefore I think we should change this to a warning.

I can live with using the continue-on-error option, but it will also suppress actual errors (e.g. sometimes API calls do fail).

@j-catania added the bug label on Sep 22, 2023
@j-catania
Collaborator

Hi guys,
thanks to you both for this discussion.

Have either of you changed your mind?

@j-catania
Collaborator

Guys,

I merged PR #243. It comes with a choice, so everyone should be happy :)
