
Pretrained model #18

Open
BenOnoja opened this issue May 26, 2022 · 3 comments
@BenOnoja

Actually, I was studying the implementation but could not get the pretrained model. I need the Seq2seq_summarization.cpkt file. I downloaded the zip file from the repository, but the pretrained model was not in it. Please, I need to have it.

@JRC1995

JRC1995 commented May 27, 2022

I don't think I released the pre-trained model (I don't think I even fully trained it). At this point, I don't think I have the checkpoint anymore.

@BenOnoja

OK sir, but do you have an idea of where to get a pretrained model? I came across one, but I wasn't sure about it. It is called BERT Summarizer, and its file size is 1.56 GB. Can I use it?

@JRC1995

JRC1995 commented May 28, 2022

This repo contains old code, originally written before the Transformer pretraining era. You can use a BERT summarizer, but that would be unrelated to anything in this repo.
