After running `python3 bin/train.py -C mnist:rc2f2:ptripletN`, I find that model.config.maxepoch is 8. This differs from the experiment in the paper, where the C2F2 model is trained with the Adam optimizer for 16 epochs. Then I noticed the line `self.maxepoch //= 2` in the `rc2f2` function in configs_rank.py and tried to change it as follows, but it doesn't work: model.config.maxepoch is still 8. How can I change the number of epochs to 16?
Hope you can help me.
Thanks and Regards.
Firstly, you can see that the dataloader traverses the whole dataset through `__getitem__`, and the sample at `index` is used as the anchor. A positive sample is then drawn independently, which means every epoch samples twice as much data as a plain classification pass would. In that sense, one epoch over SPC-2 data is equivalent to two classification epochs. This is a fixable minor quirk, but not one that needs fixing.
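The sampling described above can be sketched as follows. This is a minimal illustration with hypothetical names (`PairDataset` and its fields are not the repository's actual dataloader): the sample at `index` serves as the anchor, and one positive with the same label is drawn independently, so a single traversal yields twice the data of a classification epoch.

```python
import random
from typing import List, Tuple

class PairDataset:
    """Hypothetical SPC-2 style dataset: each item is (anchor, positive, label)."""

    def __init__(self, data: List[Tuple[int, int]]):
        # data: list of (feature, label) pairs; integer "features" keep
        # the sketch self-contained.
        self.data = data
        self.by_label = {}
        for feat, label in data:
            self.by_label.setdefault(label, []).append(feat)

    def __len__(self) -> int:
        return len(self.data)

    def __getitem__(self, index: int):
        anchor, label = self.data[index]
        # Independent draw of a positive with the same label.
        positive = random.choice(self.by_label[label])
        return anchor, positive, label

ds = PairDataset([(10, 0), (11, 0), (20, 1), (21, 1)])
batch = [ds[i] for i in range(len(ds))]
# One traversal returns an anchor AND a positive per index,
# i.e. 2 * len(ds) samples in total.
assert sum(2 for _ in batch) == 2 * len(ds)
```

Since each traversal already consumes the dataset twice, counting it as one epoch undercounts the data actually seen, which is the discrepancy the halving compensates for.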
Just as I wrote in the comment: maxepoch counts epochs with classification batches, which equals twice the number of epochs with SPC-2 batches. If you really want to change maxepoch, the variable in your screenshot is indeed the one to change. Perhaps you ran `python3 setup.py install` before changing the configuration, so your subsequent experiments imported the old installed config instead of the modified one in your current working directory. `python3 setup.py install` is not recommended if you want to change the code and do further development work; otherwise you will have to reinstall after every code change because of Python's import order.
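To make the epoch accounting concrete, here is a minimal sketch (hypothetical class names, mirroring only the `self.maxepoch //= 2` line quoted in the issue): the base budget of 16 classification epochs is halved for ranking, because each SPC-2 epoch sees the data twice.

```python
class BaseConfig:
    """Hypothetical stand-in for the base configuration: a 16-epoch
    classification training budget."""
    def __init__(self):
        self.maxepoch = 16

class RankConfig(BaseConfig):
    """Mirrors the `self.maxepoch //= 2` line from configs_rank.py:
    since every SPC-2 epoch consumes the dataset twice, halving the
    epoch counter keeps the total data seen equal to 16 classification
    epochs."""
    def __init__(self):
        super().__init__()
        self.maxepoch //= 2  # 16 classification epochs == 8 SPC-2 epochs

cfg = RankConfig()
print(cfg.maxepoch)  # prints 8; removing the halving would give 16
```

If you edit this line but still observe the old value, check which copy of the module Python is importing (e.g. print the module's `__file__`); an earlier `python3 setup.py install` can shadow your working copy.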