
Unable to load the trained model for inference #5

Open
sreeramjoopudi opened this issue May 10, 2021 · 0 comments

Comments

@sreeramjoopudi

I have been using allennlp for the last year, and I have successfully trained and run inference on models through config files. Recently, I wanted to train and load models without using config files. I was able to train a model successfully by using allennlp as a library. However, when I tried to load this model in a separate process/Python script (for inference), I ran into an issue of a missing config.json file: the load_archive method of allennlp.models.archival throws a missing config.json error when I point it at the trained model's output directory. Could you tell us:

  • whether this is expected, and whether the way to overcome it is to create a config.json on our own (I believe it should be possible to create my own config.json if needed for running inference), or

  • whether there is another way in which I should load the trained model.
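As a possible workaround along the lines of the first option, here is a minimal sketch of writing a config.json by hand into the serialization directory before loading. This assumes load_archive expects a standard allennlp training config (keys like "dataset_reader", "model", "trainer"); the concrete "type" values and paths below are placeholders for whichever components were actually used at training time, not real names.

```python
import json
import os
import tempfile

# Placeholder for the directory produced by programmatic training.
serialization_dir = tempfile.mkdtemp()

# Hand-written minimal config mirroring the shape of a standard allennlp
# training config.  "my_reader", "my_model", and the data path are
# hypothetical placeholders -- substitute the registered names of the
# components you actually trained with.
config = {
    "dataset_reader": {"type": "my_reader"},
    "train_data_path": "path/to/train.jsonl",
    "model": {"type": "my_model"},
    "trainer": {"num_epochs": 1},
}

config_path = os.path.join(serialization_dir, "config.json")
with open(config_path, "w") as f:
    json.dump(config, f, indent=2)

# With config.json in place, loading might then look something like
# (not runnable here without allennlp and real trained weights):
#   from allennlp.models.archival import load_archive
#   archive = load_archive(os.path.join(serialization_dir, "model.tar.gz"))
```

Whether this is sufficient depends on how closely the hand-written config has to match the original model's constructor arguments, which is part of what the question above is asking.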
