Not able to load encoder #10
Could you please share the steps before this line? It seems like you are using a different model in some manner; this is indicated by the difference in dimensions:

```
RuntimeError: While copying the parameter named encoder.weight, whose dimensions in the model are torch.Size([67979, 300]) and whose dimensions in the checkpoint are torch.Size([25704, 300]).
```

i.e. the vocab size the model was built with does not match the vocab size in the checkpoint.
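As a quick way to see the mismatch directly, here is a minimal sketch (the paths, the `.h5` extension, and the `itos.pkl` filename are assumptions based on fastai 0.7 conventions) that compares the checkpoint's embedding size against the vocab the current model is being built with:

```python
import pickle
import torch

# Assumed file locations, following fastai 0.7 conventions -- adjust to your setup.
enc_path = f'{PATH}models/adam1_enc.h5'   # encoder weights saved with save_encoder
itos_path = f'{PATH}models/itos.pkl'      # int-to-string vocab used when the LM was trained

# The saved encoder is a plain state dict; the first dimension of encoder.weight
# is the vocab size it was trained with (25704 in the traceback above).
state = torch.load(enc_path, map_location='cpu')
print(state['encoder.weight'].shape)

# The classifier here was built with a different vocab (67979 tokens), which is
# why load_encoder fails when copying encoder.weight.
itos = pickle.load(open(itos_path, 'rb'))
print(len(itos))
```

If the two numbers differ, the model and the checkpoint were built from different vocab files.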
I am following the lesson 4 notebook, from the sentiment analysis part, which starts with loading the vocab file.

```python
t = splits[0].examples[111]
t.label, ' '.join(t.text[:19])
```

Output:

```
('hotel', 'हम्पी में 3 दिसंबर से 3 दिन 2 रात के लिए एक कमरा बुक करें')
```

This works the same as shown in lesson 4.

```python
md2 = TextData.from_splits(PATH, splits, bs)
m3 = md2.get_model(opt_fn, 1500, bptt, emb_sz=em_sz, n_hid=nh, n_layers=nl,
                   dropout=0.1, dropouti=0.4, wdrop=0.5, dropoute=0.05, dropouth=0.3)
m3.reg_fn = partial(seq2seq_reg, alpha=2, beta=1)
m3.load_encoder(f'adam1_enc')
```

Initially I used the vocab and encoder provided by you, but because of the above error I trained my own language model by following your notebook, and I got stuck again at the same error.
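For context, in the lesson 4 / fastai 0.7 workflow the classifier's vocab comes from the torchtext `TEXT` field used to build `splits`, so it has to be the exact field that was saved when the encoder was trained. A minimal sketch of that check, with `TEXT.pkl` as a hypothetical filename:

```python
import pickle

# The TEXT field (and hence TEXT.vocab) used to build `splits` must be the same
# one that was pickled when the language model behind 'adam1_enc' was trained.
# 'TEXT.pkl' is a hypothetical filename -- use whatever you saved after LM training.
TEXT = pickle.load(open(f'{PATH}models/TEXT.pkl', 'rb'))

# Should equal the checkpoint's embedding rows (25704 above); if it prints 67979,
# the splits were numericalized with a different vocab.
print(len(TEXT.vocab.itos))
```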
Anything yet?
Hey @008karan, as a solo dev I don't have the bandwidth to support this older version while building out the newer version. If you'd be so kind as to have some patience, the newer version will have an end-to-end demo.
If possible, can you just check whether you are able to load the encoder or not?
Did you find the solution to it? @NirantK
I am getting an error while loading the encoder:

```python
m3.load_encoder(f'adam1_enc')
```

I was able to reproduce the notebook you provided, but I am again getting the same error. I do not face this issue when I load the encoder from the lesson 4 notebook, which is for the IMDB data. Can you look into it? Otherwise people will not be able to use your pretrained LM.
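As a debugging aid, here is a sketch that mirrors what fastai 0.7's `load_encoder` does but reports shape mismatches instead of raising; it assumes the RNN_Learner layout where `m3.model[0]` is the encoder, and the `.h5` extension is the fastai 0.7 convention:

```python
import torch

# Load the saved encoder state dict directly and compare it against the
# classifier's encoder, printing which parameters do not line up.
enc_state = torch.load(f'{PATH}models/adam1_enc.h5', map_location='cpu')

encoder = m3.model[0]                 # the RNN encoder inside the classifier
own_state = encoder.state_dict()

with torch.no_grad():
    for name, param in enc_state.items():
        if name not in own_state:
            print(f'missing in model: {name}')
        elif own_state[name].shape != param.shape:
            print(f'shape mismatch for {name}: checkpoint {tuple(param.shape)} '
                  f'vs model {tuple(own_state[name].shape)}')
        else:
            own_state[name].copy_(param)
```

Typically only the embedding (and any tied decoder weights) mismatch when the vocab has changed, which points back to rebuilding the classifier with the same vocab the encoder was trained on.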