Avoid nan loss when there are labels with no samples in the training data. #12
Hello there.
I ran into problems today when doing a test run with training data that lacked samples for one of the labels. This causes the class-balanced focal loss to come out as nan: a minimal run with a zero-sample label currently yields nan for the loss.
Adding a safe switch to the Loss class fixes this issue without changing the weights of the non-zero-sample labels relative to one another, i.e. they keep the same relative weights as if the zero-sample labels had been left out. The loss, however, will come out larger than it would with the alternative solution of removing the offending label altogether.
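To illustrate where the nan comes from and what the safe switch does, here is a minimal sketch of class-balanced weighting via the effective number of samples, `(1 - beta**n_c)`. The function name, the `safe` flag, and the normalisation are hypothetical and only stand in for whatever the Loss class actually does; with `n_c == 0` the effective number is zero and the unguarded division blows up, which then propagates into the loss.

```python
def class_balanced_weights(samples_per_class, beta=0.9999, safe=False):
    """Sketch of class-balanced weights: w_c = (1 - beta) / (1 - beta**n_c).

    With safe=True, zero-sample classes get weight 0; the relative
    weights of the remaining classes are unchanged.
    """
    eff = [1.0 - beta ** n for n in samples_per_class]
    if safe:
        # Guard the zero-sample case: weight 0 instead of dividing by 0.
        raw = [(1.0 - beta) / e if e > 0.0 else 0.0 for e in eff]
    else:
        # A zero-sample class gives e == 0 here, so this division blows
        # up (inf/nan in tensor code, ZeroDivisionError in plain Python).
        raw = [(1.0 - beta) / e for e in eff]
    # Normalise so the weights sum to the number of classes.
    total = sum(raw)
    return [w / total * len(raw) for w in raw]
```

Because the normalisation still counts the zero-sample class among the classes, the surviving weights (and hence the loss) come out larger than if the offending label were dropped before weighting, matching the behaviour described above.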
I can see that this is an edge case, but it will be helpful for me and I imagine it might be for others as well. One could also consider raising a ValueError when zero-sample labels are supplied, with a hint to use the safe switch.