
I got very low mAP... #20

Open
SpiceGL opened this issue Jun 30, 2020 · 1 comment

SpiceGL commented Jun 30, 2020

Hello, I used your code to train for 500 epochs with "python train_new.py --gpu_ids 0 --name ft_ResNet50 --train_all --batchsize 16 --data_dir ./Market". I didn't change any of the parameters, but the results are not good. The final result is: Rank@1: 0.803147, Rank@5: 0.917755, Rank@10: 0.944181, mAP: 0.614890.
Is this because I did not use "--erasing_p 0.5"?
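That is, should I have run "python train_new.py --gpu_ids 0 --name ft_ResNet50 --train_all --batchsize 16 --erasing_p 0.5 --data_dir ./Market" instead, with all the other options left at their defaults?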

layumi (Owner) commented Jul 21, 2021

@OrchidBrando

That may be the reason. Training for more epochs can sometimes also help performance.

You may also consider adding a cross-entropy loss to help the model learn.
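
As a rough illustration only (not this repository's exact training code; `model`, `images`, `labels`, and `optimizer` are placeholders), the two ingredients mentioned above look like this in plain PyTorch:

```python
import torch.nn as nn
from torchvision import transforms

# Random erasing (what --erasing_p 0.5 enables) is placed after
# ToTensor/Normalize, since torchvision's RandomErasing operates on tensors.
train_transforms = transforms.Compose([
    transforms.Resize((256, 128)),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    transforms.RandomErasing(p=0.5),
])

# Cross-entropy loss over identity labels.
criterion = nn.CrossEntropyLoss()

def train_step(model, images, labels, optimizer):
    # model, images, labels, optimizer are placeholders for your own objects.
    optimizer.zero_grad()
    logits = model(images)            # shape: (batch, num_identities)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```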
