How to export these models to ONNX? #7
Comments
It does not support LSTM with projections.
@csukuangfj How can I export these LSTM models? I don't care whether it is supported or not; I just need a workable version.
Please see the doc I posted before in another thread.
In this version, models from pruned_transducer_stateless3 are used. In this script, the models are exported separately, but the projection parts of the joiner should also be exported separately. You can use a function like the one below to export the joiner.
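A minimal sketch of such an export, assuming the joiner exposes `encoder_proj` and `decoder_proj` linear layers (as in icefall's pruned_transducer_stateless3) and that its forward accepts already-projected encoder and decoder outputs; all names, file paths, and dimensions here are illustrative assumptions, not the repository's actual export code:

```python
import torch


def export_joiner(joiner: torch.nn.Module, filename: str, opset_version: int = 11) -> None:
    """Sketch: export the joiner core and its two projection layers as
    separate ONNX models. Assumes `joiner.encoder_proj` and
    `joiner.decoder_proj` are nn.Linear layers and that the joiner's
    forward accepts already-projected inputs (adjust to your model)."""
    joiner_dim = joiner.encoder_proj.out_features

    # 1) Joiner core: (projected_encoder_out, projected_decoder_out) -> logit
    projected_encoder_out = torch.rand(1, joiner_dim, dtype=torch.float32)
    projected_decoder_out = torch.rand(1, joiner_dim, dtype=torch.float32)
    torch.onnx.export(
        joiner,
        (projected_encoder_out, projected_decoder_out),
        filename,
        opset_version=opset_version,
        input_names=["projected_encoder_out", "projected_decoder_out"],
        output_names=["logit"],
        dynamic_axes={
            "projected_encoder_out": {0: "N"},
            "projected_decoder_out": {0: "N"},
            "logit": {0: "N"},
        },
    )

    # 2) Encoder projection, exported on its own.
    encoder_out = torch.rand(1, joiner.encoder_proj.in_features, dtype=torch.float32)
    torch.onnx.export(
        joiner.encoder_proj,
        encoder_out,
        filename.replace(".onnx", "_encoder_proj.onnx"),
        opset_version=opset_version,
        input_names=["encoder_out"],
        output_names=["projected_encoder_out"],
        dynamic_axes={
            "encoder_out": {0: "N"},
            "projected_encoder_out": {0: "N"},
        },
    )

    # 3) Decoder projection, exported on its own.
    decoder_out = torch.rand(1, joiner.decoder_proj.in_features, dtype=torch.float32)
    torch.onnx.export(
        joiner.decoder_proj,
        decoder_out,
        filename.replace(".onnx", "_decoder_proj.onnx"),
        opset_version=opset_version,
        input_names=["decoder_out"],
        output_names=["projected_decoder_out"],
        dynamic_axes={
            "decoder_out": {0: "N"},
            "projected_decoder_out": {0: "N"},
        },
    )
```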
@csukuangfj, should I update the ONNX exporting script?
@EmreOzkose So the model is a Conformer, not an LSTM? Please update the export script; I want to give ONNX a try.
Yes, please.
@EmreOzkose Hi, I want to use the WeNet Chinese model. How should I download the pretrained model and convert it to ONNX? Is it necessary to change the code?
Yes, it is not an LSTM. I made a PR.
Actually, I only did experiments with the English pre-trained model; I have never worked on a Chinese model. If the Chinese model doesn't require extra changes to greedy search, I think it will work.
You can export English models as shown below.
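A hypothetical sketch of exporting the encoder with dynamic batch/time axes, assuming an icefall-style Conformer encoder whose forward takes `(x, x_lens)` and 80-dim fbank features (all names, paths, and dimensions are assumptions):

```python
import torch


def export_encoder(encoder: torch.nn.Module, filename: str = "encoder.onnx",
                   opset_version: int = 11) -> None:
    """Sketch: export a Conformer encoder to ONNX with dynamic batch and
    time axes. Assumes forward(x, x_lens) -> (encoder_out, encoder_out_lens);
    adjust names and dimensions to your model."""
    encoder.eval()
    N, T, C = 1, 100, 80  # illustrative batch size, frame count, feature dim
    x = torch.rand(N, T, C, dtype=torch.float32)
    x_lens = torch.tensor([T], dtype=torch.int64)
    torch.onnx.export(
        encoder,
        (x, x_lens),
        filename,
        opset_version=opset_version,
        input_names=["x", "x_lens"],
        output_names=["encoder_out", "encoder_out_lens"],
        dynamic_axes={
            "x": {0: "N", 1: "T"},
            "x_lens": {0: "N"},
            "encoder_out": {0: "N", 1: "T"},
            "encoder_out_lens": {0: "N"},
        },
    )
```

The decoder and joiner can be exported in the same way (see the joiner sketch earlier in the thread).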
This one is for non-streaming Conformer models. It should work for English as well as Chinese models.
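A quick way to check that an export loads and runs, sketched with onnxruntime (the file name, input names, and feature dimension follow the assumptions in the encoder sketch above):

```python
import numpy as np
import onnxruntime as ort

# Sanity-check a hypothetical exported encoder: load it and run a dummy input.
sess = ort.InferenceSession("encoder.onnx", providers=["CPUExecutionProvider"])
x = np.random.rand(1, 100, 80).astype(np.float32)  # (N, T, feature_dim), illustrative
x_lens = np.array([100], dtype=np.int64)
encoder_out, encoder_out_lens = sess.run(None, {"x": x, "x_lens": x_lens})
print(encoder_out.shape, encoder_out_lens)
```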
@csukuangfj Hello, I got an error when loading the WeNet model for inference. Do you know why?
I computed the vocab size from the tokens file as max token ID + 1 and got 5539, but the model's output dimension is 5537. Why?
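For reference, a minimal sketch of computing the vocabulary size from a tokens file, assuming each line has the form `<symbol> <id>` and that the vocab size is taken as max id + 1 (the k2/icefall convention); the path below is hypothetical:

```python
def vocab_size_from_tokens(tokens_file: str) -> int:
    """Return max token id + 1, assuming lines of the form "<symbol> <id>"."""
    max_id = -1
    with open(tokens_file, "r", encoding="utf-8") as f:
        for line in f:
            fields = line.split()
            if not fields:
                continue  # skip blank lines
            max_id = max(max_id, int(fields[-1]))  # the id is the last field
    return max_id + 1


# Hypothetical path; if this prints 5539 while the model's output layer has
# 5537 units, the tokens file and the checkpoint likely come from different runs.
print(vocab_size_from_tokens("data/lang_char/tokens.txt"))
```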
Please show the complete code. It is hard to figure out what goes wrong without seeing the code. |
I tried to export the LSTM transducer, but it fails.