
Loss goes to 0 but embeddings are bad #27

Open
FSet89 opened this issue Nov 6, 2018 · 2 comments
FSet89 commented Nov 6, 2018

I trained a network with the triplet loss on a small dataset with 6 different identities and about 1000 total images. I noticed that the loss goes to 0 after 100 steps. However, the performance is not good: the embeddings seem to be just noise. I used a small learning rate (1e-4) and a simple network with 6 convolutions (conv+BN+ReLU) and 3 pooling layers. This happens with both batch hard and batch all. What could be the problem?
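For context, here is a minimal sketch of what a batch-hard triplet loss computes (a generic TensorFlow reimplementation for illustration, not necessarily the exact code in this repository; the 0.2 margin is a placeholder):

```python
import tensorflow as tf

def batch_hard_triplet_loss(embeddings, labels, margin=0.2):
    """Sketch of batch-hard triplet loss: for each anchor, take its
    hardest (farthest) positive and hardest (closest) negative in the batch."""
    # Pairwise squared Euclidean distances, shape (B, B).
    dot = tf.matmul(embeddings, embeddings, transpose_b=True)
    sq_norms = tf.linalg.diag_part(dot)
    dists = tf.maximum(sq_norms[:, None] - 2.0 * dot + sq_norms[None, :], 0.0)

    same = tf.cast(tf.equal(labels[:, None], labels[None, :]), tf.float32)
    pos_mask = same - tf.eye(tf.shape(labels)[0])   # same identity, excluding self
    neg_mask = 1.0 - same                           # different identity

    # Hardest positive: farthest example with the same label.
    hardest_pos = tf.reduce_max(dists * pos_mask, axis=1)
    # Hardest negative: push same-identity entries past the max so the
    # min picks a true negative.
    hardest_neg = tf.reduce_min(
        dists + tf.reduce_max(dists) * (1.0 - neg_mask), axis=1)

    return tf.reduce_mean(tf.maximum(hardest_pos - hardest_neg + margin, 0.0))
```

A loss of exactly 0 only means that, on the training batches, every anchor's hardest positive ends up at least `margin` closer than its hardest negative. With only 6 identities, a network can satisfy this by memorizing the training set without learning an embedding that generalizes.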

@omoindrot
Owner

What are you measuring when you say "performance is not good"?

If you are looking at embeddings on a separate test set, it is possible that the model is overfitting with only 6 identities. Maybe you could add regularization or increase the number of identities / images?
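For example, a light L2 weight penalty on the conv layers is one simple form of regularization (a Keras-style sketch; the 1e-4 strength is a hypothetical value to tune on validation data):

```python
import tensorflow as tf

l2 = tf.keras.regularizers.l2(1e-4)  # hypothetical strength, tune on validation data

def conv_block(x, filters):
    # conv + BN + ReLU block matching the architecture described above,
    # with weight decay added on the conv kernel.
    x = tf.keras.layers.Conv2D(filters, 3, padding="same",
                               kernel_regularizer=l2)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    return tf.keras.layers.ReLU()(x)
```

Dropout before the embedding layer or simple data augmentation are alternatives that serve the same purpose.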


FSet89 commented Nov 8, 2018

It seems that the embeddings are very similar no matter what image is fed in. I will try to increase the size of the dataset.
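One quick way to quantify that is to look at the spread of pairwise embedding distances, split into same-identity and different-identity pairs (a NumPy/SciPy sketch; the random `embeddings` and `labels` stand in for the trained network's outputs on held-out images):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

# Stand-ins: replace with the trained network's embeddings (N, D)
# and the corresponding identity labels (N,).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 64)).astype(np.float32)
labels = rng.integers(0, 6, size=200)

mat = squareform(pdist(embeddings))          # (N, N) Euclidean distances
iu = np.triu_indices_from(mat, k=1)          # each pair counted once
same = (labels[:, None] == labels[None, :])[iu]

print(f"all pairs:  mean={mat[iu].mean():.4f}  std={mat[iu].std():.4f}")
print(f"same id:    mean={mat[iu][same].mean():.4f}")
print(f"diff id:    mean={mat[iu][~same].mean():.4f}")
```

If the overall std is near zero and there is no gap between the same-identity and different-identity means, the network has collapsed to a nearly constant embedding rather than learned a discriminative one.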
