
When testing, why normalize using the whole test dataset's features? #42

Closed
huyangc opened this issue Feb 11, 2018 · 4 comments
huyangc commented Feb 11, 2018

I think the code here shows that you normalize each feature using the whole test dataset's features.

Is this a trick, or have I misunderstood the code?

nttstar (Collaborator) commented Feb 11, 2018

This is just for calculating the average L2 norm of all test feature embeddings.
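
For illustration, a minimal NumPy sketch of such a statistic (the array name, shape, and values are assumptions, not the repository's actual variables):

```python
import numpy as np

# Hypothetical (N, 512) matrix of test feature embeddings (shape is an assumption).
embeddings = np.random.randn(1000, 512).astype(np.float32)

# Per-embedding L2 norms, then their mean: a single diagnostic statistic
# computed over the whole test set, not a per-feature normalization step.
avg_l2_norm = np.linalg.norm(embeddings, axis=1).mean()
print('average embedding L2 norm: %.4f' % avg_l2_norm)
```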

huyangc (Author) commented Feb 11, 2018

Thank you! I have another question about the input image. As the code here shows, the only thing to do is to feed in the image cropped by the detection bounding box, am I right? Is there no need to enlarge the bounding box? Some tight boxes contain only the face.

And thank you very much for implementing so many loss functions in MXNet; I am a big fan of MXNet.

nttstar (Collaborator) commented Feb 11, 2018

Some margin is preferred.
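
For illustration, a minimal sketch of adding margin around a tight detection box before cropping (the function name, signature, and default margin value are assumptions, not the repository's API; the project's own preprocessing may instead use landmark-based alignment):

```python
import numpy as np

def crop_with_margin(img, bbox, margin=16):
    """Expand a tight detection box by `margin` pixels on each side
    before cropping, clamped to the image bounds.

    `img` is an HxWxC array, `bbox` is (x1, y1, x2, y2); the names and
    the default margin are illustrative assumptions.
    """
    h, w = img.shape[:2]
    x1 = max(int(bbox[0]) - margin, 0)
    y1 = max(int(bbox[1]) - margin, 0)
    x2 = min(int(bbox[2]) + margin, w)
    y2 = min(int(bbox[3]) + margin, h)
    return img[y1:y2, x1:x2]
```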

huyangc (Author) commented Feb 11, 2018

Thanks!
