pretrained model #164
Does the downloaded pretrained model include the word embeddings? I do not see any embeddings in your code. Please help.

Comments
All the code related to word embeddings is located here: https://github.com/huggingface/pytorch-pretrained-BERT/blob/8da280ebbeca5ebd7561fd05af78c65df9161f92/pytorch_pretrained_bert/modeling.py#L172-L200. If you want to access the pretrained embeddings, the easiest thing to do is to load a pretrained model and extract its embedding matrices.
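A minimal sketch of that approach (an editor's illustration, assuming pytorch-pretrained-BERT is installed and the 'bert-base-uncased' weights can be downloaded):

import torch
from pytorch_pretrained_bert import BertModel

model = BertModel.from_pretrained('bert-base-uncased')

# The three embedding matrices defined in BertEmbeddings (modeling.py):
word_emb = model.embeddings.word_embeddings.weight        # (vocab_size, hidden_size) = (30522, 768)
pos_emb = model.embeddings.position_embeddings.weight     # (max_position_embeddings, hidden_size)
type_emb = model.embeddings.token_type_embeddings.weight  # (type_vocab_size, hidden_size)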
Oh, I have seen this code recently, and from it I think it does not use the pretrained embedding parameters. What do you mean by loading a pretrained model and extracting its embeddings? Are they from the original release?
In [1]: from pytorch_pretrained_bert import BertModel

In [2]: model = BertModel.from_pretrained('bert-base-uncased')

In [3]: model.embeddings.word_embeddings
Out[3]: Embedding(30522, 768)

This field of the model holds the pretrained word embedding matrix (30522 vocabulary entries, 768 dimensions each).
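As a usage sketch building on the snippet above (the tokenizer lookup is an editor's assumption, not part of the original comment):

import torch
from pytorch_pretrained_bert import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

# Map a token to its row in the embedding matrix and fetch its vector.
token_id = tokenizer.convert_tokens_to_ids(['hello'])[0]
with torch.no_grad():
    vector = model.embeddings.word_embeddings(torch.tensor([token_id]))
print(vector.shape)  # torch.Size([1, 768])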
Thanks Gregory, that's the way to go indeed!