
all FxBx(s/4) #16

Merged
merged 1 commit into from
Aug 27, 2019
Conversation

zzx-zhangxian
Collaborator

All the cells' channels are now FxBx(s/4), I think.
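For reference, the per-cell channel rule FxBx(s/4) can be sanity-checked with a tiny sketch. This is my own helper, not code from this repo; F = 20 (filter multiplier) and B = 5 (blocks per cell) are the values discussed below / used in the Auto-DeepLab paper's retraining setup, so adjust them to the actual config:

```python
# Minimal sanity check of the per-cell channel rule F x B x (s/4).
# F = 20 (filter multiplier) and B = 5 (blocks per cell) are assumptions
# taken from the paper's retraining setup, not from this repo's config.
def cell_channels(s, F=20, B=5):
    """Output channels of a cell at downsample ratio s (4, 8, 16, or 32)."""
    return F * B * (s // 4)

for s in (4, 8, 16, 32):
    print(f"s={s}: {cell_channels(s)} channels")
```

With these defaults the four spatial levels get 100, 200, 400, and 800 channels, doubling as the resolution halves.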

@HankKung
Collaborator

Seems great! Have you tested it to check if it works fine?

@zzx-zhangxian
Collaborator Author

> Seems great! Have you tested it to check if it works fine?

I have been testing it for 19 epochs now, and the mIoU is shown in the figure. The mIoU and pixel accuracy look unstable; do you have any advice?
[figure: mIoU and pixel-accuracy curves over 19 epochs]

@zzx-zhangxian
Collaborator Author

By the way, when I retrain Auto-DeepLab on Cityscapes with the architecture searched in the paper, the result is also unstable.

@HankKung
Collaborator

Did you set the image size and crop size to the same numbers as in the paper, both when searching and when retraining?

@zzx-zhangxian
Collaborator Author

> Did you set the image size and crop size to the same numbers as in the paper, both when searching and when retraining?

I set 384 and 320. I don't know how to use a crop size of 321.
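For what it's worth, the 321 likely comes from the DeepLab convention of choosing crop sizes of the form output_stride × k + 1, so the feature-map grid stays aligned with the image corners under strided convolutions. A quick check (my own helper, not code from this repo):

```python
# DeepLab-style crop sizes are typically (output_stride * k) + 1,
# e.g. 321 or 513, so that strided convs keep corner alignment.
def is_aligned_crop(size, output_stride=32):
    return (size - 1) % output_stride == 0

print(is_aligned_crop(321))  # 321 = 32 * 10 + 1
print(is_aligned_crop(320))  # 320 - 1 is not a multiple of 32
```

So 321 and 513 fit the pattern while 320 does not, which may be why the dataloader rejects the odd sizes the paper quotes when the crop logic assumes even dimensions.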

@HankKung
Collaborator

Does new_model.py also adopt the new filter config, with F set to 20?
I also have the problem that the crop size can't be set to an odd number.
I'll test it again with properly resized images and your change.

@zzx-zhangxian
Collaborator Author

> Does new_model.py also adopt the new filter config, with F set to 20?
> I also have the problem that the crop size can't be set to an odd number.
> I'll test it again with properly resized images and your change.

No, the filter config in new_model.py is still wrong. I haven't reproduced the params and FLOPs from the paper (695B and 44.42M); if you have any ideas, please tell me.
Also, I can't reproduce the 1551.05B and 43.48M for DeepLab v3 either; my version gives 1478B and 45M.
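When chasing a FLOPs/params mismatch like this, a per-layer back-of-envelope count often shows which layer configs drive the gap. A minimal sketch with the standard conv formulas (my own helpers, not the counter behind the paper's numbers, which may also count BN/bias or multiply-adds differently):

```python
# Back-of-envelope parameter and multiply-add counts for one conv layer.
# Useful for spotting which layer configs dominate the model totals.
def conv2d_params(c_in, c_out, k, bias=True):
    """Weights (c_in * c_out * k * k) plus an optional bias per output channel."""
    return c_in * c_out * k * k + (c_out if bias else 0)

def conv2d_madds(c_in, c_out, k, h_out, w_out):
    """One multiply-add per kernel element per output position."""
    return c_in * c_out * k * k * h_out * w_out

# Illustrative example: a bias-free 3x3 conv, 100 -> 200 channels,
# applied on a 96x96 feature map (hypothetical sizes, not from the repo).
p = conv2d_params(100, 200, 3, bias=False)
m = conv2d_madds(100, 200, 3, 96, 96)
print(f"params: {p}, multiply-adds: {m}")
```

Summing these over every layer and diffing against the paper's table narrows down whether the discrepancy is in the stem, the cells, or the ASPP head, and whether the reference counter reports MAdds or 2x-FLOPs.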

@HankKung merged commit aca71fa into NoamRosenberg:master on Aug 27, 2019
@NoamRosenberg
Owner

@zhizhangxian @HankKung at this point our version could be more optimized than the authors'. I've made a lot of effort to share weights where possible, so let's not worry about reproducing the FLOPs; let's aim for the same or better results with the same or less memory usage and the same or better speed on the same GPU.

@zzx-zhangxian deleted the test2 branch on August 31, 2019