I run the inference model like this:

python3.7 -m nmt --out_dir=../BestModel/sign2text/ --inference_input_file=../Data/phoenix2014T.test.sign --inference_output_file=../Data/predictions.de --inference_ref_file=../Data/phoenix2014T.test.de --base_gpu=0

Then I get the following error:

nslt/nslt/nmt.py", line 244, in extend_hparams
    raise ValueError("hparams.vocab_prefix must be provided.")
ValueError: hparams.vocab_prefix must be provided.

In the argument definitions I found this line:

parser.add_argument("--vocab_prefix", type=str, default=None,
                    help="""\
                    Vocab prefix, expect files with src/tgt suffixes. If None, extract from train files.\
                    """)

Where do I find this vocab_prefix file (or folder)?
Under the nslt/nslt/ folder, I ran the command below and it worked for me:

python -m nmt --out_dir=<trained_model_dir> --inference_input_file=../Data/phoenix2014T.test.sign --inference_output_file=../Data/predictions_test.de --vocab_prefix=../Data/phoenix2014T.vocab --inference_ref_file=../Data/phoenix2014T.test.de --base_gpu=0
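For context, the help text says the prefix "expect[s] files with src/tgt suffixes", i.e. --vocab_prefix is not a file itself but a path stem that the code expands into one vocab file per language side. Here is a minimal sketch of that convention; the suffixes "sign" and "de" are assumptions for this sign2text setup, not verified against the nslt code:

```python
def resolve_vocab_files(vocab_prefix, src="sign", tgt="de"):
    """Expand a vocab prefix into per-side vocab file paths,
    following the '<prefix>.<suffix>' convention from the --vocab_prefix help text.
    The 'sign'/'de' suffixes are assumed for the PHOENIX sign2text data."""
    src_vocab_file = f"{vocab_prefix}.{src}"
    tgt_vocab_file = f"{vocab_prefix}.{tgt}"
    return src_vocab_file, tgt_vocab_file

# With the prefix from the working command above:
src_vocab, tgt_vocab = resolve_vocab_files("../Data/phoenix2014T.vocab")
print(src_vocab)  # ../Data/phoenix2014T.vocab.sign
print(tgt_vocab)  # ../Data/phoenix2014T.vocab.de
```

So as long as vocab files with those names ship alongside the other phoenix2014T.* files in the Data folder, passing --vocab_prefix=../Data/phoenix2014T.vocab should be enough.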