
Custom binary_crossentropy that penalizes one mispredicted class #3068

Closed
vabatista opened this issue Jun 25, 2016 · 4 comments

Comments

@vabatista

Hi, I have a dataset where 95% of the data belongs to one class and 5% to the other. My models reach 94% accuracy, but only because they predict the prevalent class 100% of the time. My best roc_auc_score is about 0.71 on the validation set.

Can anyone suggest a good custom objective function that solves this problem?

@ChristianThomae

You may use class_weight with inverse class frequencies when calling fit; see the docs.
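As a sketch of the suggestion above: the weights can be computed as inverse class frequencies and passed as a dictionary to fit. The helper name `inverse_frequency_weights` and the normalization used here are illustrative, not from the thread or the Keras API.

```python
import numpy as np

def inverse_frequency_weights(y):
    """Class weights inversely proportional to class frequency.

    Normalized so that the average weight stays near 1:
    weight_c = total / (n_classes * count_c)
    """
    classes, counts = np.unique(y, return_counts=True)
    total = len(y)
    return {int(c): total / (len(classes) * n) for c, n in zip(classes, counts)}

# Example matching the thread: 95% of one class, 5% of the other
y_train = np.array([0] * 95 + [1] * 5)
weights = inverse_frequency_weights(y_train)
# weights is roughly {0: 0.526, 1: 10.0} -- the rare class counts ~19x more

# The dictionary is then passed to fit, which applies it per sample:
# model.fit(X_train, y_train, class_weight=weights)
```

With these weights, each mistake on the rare class contributes about 19 times as much to the loss as a mistake on the prevalent class, counteracting the imbalance.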

@Yingyingzhang15

@ChristianThomae Can you give an example of how to use class_weight? Do I need to pass an array or something else to the fit function?

@tboquet

tboquet commented Jul 14, 2016

You could also take a look at #2115 if you want an example of a loss function where you can weight the different cases of a confusion matrix.
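To illustrate the idea of weighting the cases of a confusion matrix, here is a minimal NumPy sketch of a binary crossentropy where false negatives and false positives carry different weights. The function name, parameter names, and default weights are assumptions for illustration, not the code from #2115.

```python
import numpy as np

def weighted_binary_crossentropy(y_true, y_pred,
                                 fn_weight=10.0, fp_weight=1.0, eps=1e-7):
    """Binary crossentropy with asymmetric error penalties.

    fn_weight scales the penalty for missing the positive (rare) class;
    fp_weight scales the penalty for false alarms on the negative class.
    """
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    loss = -(fn_weight * y_true * np.log(y_pred)
             + fp_weight * (1 - y_true) * np.log(1 - y_pred))
    return loss.mean()

# Two equally confident mistakes: missing the rare class (y_true=1,
# y_pred=0.1) is penalized ten times harder than a false alarm
# (y_true=0, y_pred=0.9).
y_true = np.array([1.0, 0.0])
y_pred = np.array([0.1, 0.9])
```

A Keras version of the same loss would use backend ops instead of NumPy and be passed via `model.compile(loss=...)`; the weighting logic is identical.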

@Yingyingzhang15

@tboquet Thank you! It seems I have solved the problem. The class weight should be a dictionary, as explained in the docs.

@stale stale bot added the stale label May 23, 2017
@stale stale bot closed this as completed Jun 22, 2017