evaluate_performance confusion matrix #167

Closed
SherlockMones opened this issue Jul 27, 2022 · 0 comments
Labels
bug Something isn't working

Problem

During prediction, the confusion matrix appears to be broken if a class receives no correct classifications. The example below contained 2 samples, neither of which was classified correctly.

[Screenshot: confusion matrix output for the example]

Favored solution

The confusion matrix should show 0.0 for the falsely predicted class.
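
For illustration, a minimal sketch (not the project's actual `evaluate_performance` code) of how row-normalising a confusion matrix yields NaN for a class that only shows up as a wrong prediction, and how dividing only where the row sum is non-zero reports 0.0 instead, as proposed above. The labels and sample data are hypothetical.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Hypothetical data: 2 samples, neither classified correctly.
y_true = ["greet", "greet"]
y_pred = ["goodbye", "goodbye"]
labels = ["greet", "goodbye"]

cm = confusion_matrix(y_true, y_pred, labels=labels)
# cm == [[0, 2],
#        [0, 0]]   -> the "goodbye" row sums to 0

row_sums = cm.sum(axis=1, keepdims=True)

# Naive row normalisation: the zero row becomes NaN (the "broken" display).
with np.errstate(divide="ignore", invalid="ignore"):
    naive = cm / row_sums

# Safe normalisation: divide only where the row sum is non-zero and
# leave 0.0 everywhere else, so the falsely predicted class shows 0.0.
safe = np.divide(cm.astype(float), row_sums,
                 out=np.zeros_like(cm, dtype=float),
                 where=row_sums != 0)

print(naive)  # [[0., 1.], [nan, nan]]
print(safe)   # [[0., 1.], [0., 0.]]
```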
