
Monitor training process with batch hard #28

Open
FSet89 opened this issue Nov 9, 2018 · 1 comment
FSet89 commented Nov 9, 2018

When using batch hard, is it correct to say that the hardest negative distance should increase and the hardest positive distance should decrease during training? Is it a good method to monitor the training process?


paweller commented Jan 28, 2021

Yes, that is correct: during training the inter-class (negative) distances should increase while the intra-class (positive) distances should decrease. That is exactly what the triplet loss is designed to do.
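To make the mechanism concrete, here is a minimal NumPy sketch of a batch-hard triplet loss (function name and margin value are illustrative, not from this repository): for each anchor, take the farthest same-label sample (hardest positive) and the closest different-label sample (hardest negative), then apply the margin hinge. Minimizing this pushes hardest positives closer and hardest negatives farther apart.

```python
import numpy as np

def batch_hard_triplet_loss(embeddings, labels, margin=0.2):
    """Mean batch-hard triplet loss over all anchors in the batch."""
    # Pairwise Euclidean distance matrix, shape (B, B).
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1) + 1e-12)

    same = labels[:, None] == labels[None, :]  # same-label mask (incl. self)

    # Hardest positive: largest distance among same-label pairs.
    hardest_pos = np.where(same, dist, -np.inf).max(axis=1)
    # Hardest negative: smallest distance among different-label pairs.
    hardest_neg = np.where(~same, dist, np.inf).min(axis=1)

    # Hinge: penalize anchors whose hardest positive is not at least
    # `margin` closer than their hardest negative.
    return np.maximum(hardest_pos - hardest_neg + margin, 0.0).mean()
```

With well-separated classes the hinge is inactive and the loss is zero; with overlapping classes it is positive, which is why the hardest-positive/hardest-negative gap is a sensible quantity to watch.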

For monitoring, you can track the mean inter-class and intra-class distances. However, since their absolute values can differ a lot from project to project, other metrics may be more comparable: a triplet error rate (the fraction of triplets for which d(A,P) - d(A,N) > 0), the AUC-ROC over pairwise distances, or simply the L2 distance itself.
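The triplet error rate mentioned above can be sketched as follows (a small NumPy example, assuming a batch of embeddings with integer labels; the function name is illustrative). It enumerates all valid (anchor, positive, negative) combinations in the batch and counts the fraction where the positive is farther from the anchor than the negative:

```python
import numpy as np
from itertools import product

def triplet_error_rate(embeddings, labels):
    """Fraction of (A, P, N) triplets in the batch with d(A,P) - d(A,N) > 0."""
    # Pairwise Euclidean distance matrix, shape (B, B).
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    same = labels[:, None] == labels[None, :]

    n = len(labels)
    errors = total = 0
    for a, p, neg in product(range(n), repeat=3):
        # Valid triplet: distinct anchor/positive with the same label,
        # negative with a different label.
        if a != p and same[a, p] and not same[a, neg]:
            total += 1
            if dist[a, p] - dist[a, neg] > 0:
                errors += 1
    return errors / total if total else 0.0
```

A well-trained embedding drives this toward 0; tracking it per epoch gives a scale-free view of progress that does not depend on the absolute distance values.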
