
dct_h and dct_w #37

Open
myasser63 opened this issue Aug 27, 2022 · 5 comments
@myasser63

How can I set dct_h and dct_w if I want to add an FCA layer to another model? The feature maps at the layers where I want to insert the FCA layer are 160x160, 80x80, 40x40, and 20x20.

Please advise.

@cfzd
Owner

cfzd commented Aug 29, 2022

@myasser63
You can add the FCA layer directly, without any modification. The feature map size is handled automatically here:

FcaNet/model/layer.py

Lines 54 to 55 in aa5fb63

if h != self.dct_h or w != self.dct_w:
    x_pooled = torch.nn.functional.adaptive_avg_pool2d(x, (self.dct_h, self.dct_w))
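To illustrate what that snippet does (a minimal standalone sketch, not the FcaNet code itself), adaptive_avg_pool2d resizes any input spatial size down to (dct_h, dct_w), so 160x160 through 20x20 inputs all end up at the same pooled size:

```python
import torch
import torch.nn.functional as F

dct_h, dct_w = 7, 7  # example DCT pooling size

for size in (160, 80, 40, 20):
    x = torch.randn(1, 64, size, size)
    # Pools any HxW down to (dct_h, dct_w), regardless of input size
    x_pooled = F.adaptive_avg_pool2d(x, (dct_h, dct_w))
    print(x_pooled.shape)  # torch.Size([1, 64, 7, 7]) for every input size
```
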

@myasser63
Author

myasser63 commented Aug 30, 2022

So should I leave dct_h and dct_w as they are, or set them to the feature map sizes?

self.FCA = MultiSpectralAttentionLayer(in_channels, self.dct_h, self.dct_w)

@cfzd
Owner

cfzd commented Aug 30, 2022

Either works. You can set them according to your preferences, or use our settings:

c2wh = dict([(64, 56), (128, 28), (256, 14), (512, 7)])

self.att = MultiSpectralAttentionLayer(planes * 4, c2wh[planes], c2wh[planes], reduction=reduction, freq_sel_method='top16')
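Adapted to the feature map sizes mentioned above, a mapping could look like the sketch below. Note the channel counts (64/128/256/512) are placeholders assumed for illustration; substitute the actual channel widths your model has at each insertion point:

```python
# Hypothetical mapping from channel count to DCT pooling size, for a
# model whose FCA insertion points see 160x160, 80x80, 40x40, and
# 20x20 feature maps. The keys (64/128/256/512) are placeholder
# channel counts -- replace them with your model's actual widths.
c2wh = dict([(64, 160), (128, 80), (256, 40), (512, 20)])

# Then, as in the original:
# self.FCA = MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1])
```
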

@myasser63
Author

myasser63 commented Aug 30, 2022

I am trying it this way and getting an error:

  self.FCA =  MultiSpectralAttentionLayer(c1, c2wh[c1], c2wh[c1])

Error: RuntimeError: adaptive_avg_pool2d_backward_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True)'. You can turn off determinism just for this operation, or you can use the 'warn_only=True' option, if that's acceptable for your application. You can also file an issue at https://github.com/pytorch/pytorch/issues

@cfzd
Owner

cfzd commented Aug 30, 2022

@myasser63
As the error says, it's a problem with adaptive_avg_pool2d, whose backward pass has no deterministic implementation. You can either relax the error to a warning with:

torch.use_deterministic_algorithms(True, warn_only=True)

or you can turn off determinism by:

torch.use_deterministic_algorithms(False)
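Put together (a minimal sketch; the two calls are alternatives, so pick one):

```python
import torch

# Option 1: keep deterministic mode, but emit a warning instead of
# raising for ops (like adaptive_avg_pool2d's backward) that have no
# deterministic implementation.
torch.use_deterministic_algorithms(True, warn_only=True)

# Option 2: disable deterministic mode entirely.
torch.use_deterministic_algorithms(False)
```
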
