Received error from server (status 9) Error message: No decoder available, try again later #85

Open
h770347 opened this issue Jan 5, 2022 · 1 comment


h770347 commented Jan 5, 2022

C:\Users\SA>docker run --memory="8g" -it -p 8080:80 -v /media/kaldi_models:/opt/models ca9ada4fab7f /bin/bash
root@b4eabf42047c:/opt# /opt/start.sh -y /opt/models/nnet2.yaml

python main.py -u ws://localhost:8080/client/ws/speech -r 8192 .\test_data_bill_gates-TED.mp3
Received error from server (status 9)
Error message: No decoder available, try again later
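
For reference, kaldi-gstreamer-server sends status 9 with "No decoder available, try again later" when the master has no free decoder worker, which usually means the worker launched by start.sh never registered or crashed on startup (for example because the model files are missing). Below is a minimal sketch to see how many workers the master currently knows about, assuming the upstream status websocket at /client/ws/status and a JSON payload containing a num_workers_available field (both are assumptions and may differ in your build); it needs the websocket-client package.

import json
from websocket import create_connection  # pip install websocket-client

# Connect to the master's status endpoint (assumed path: /client/ws/status).
ws = create_connection("ws://localhost:8080/client/ws/status")
try:
    status = json.loads(ws.recv())  # the master pushes a JSON status message
    print(status)
    # Assumed field name; 0 means no worker is connected, hence "No decoder available".
    if status.get("num_workers_available", 0) == 0:
        print("No decoder workers are connected to the master.")
finally:
    ws.close()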

h770347 (Author) commented Jan 5, 2022

From /opt/models/nnet2.yaml: you have to download the TEDLIUM "online nnet2" models in order to use this sample. Run download-tedlium-nnet2.sh in '/opt/models' to download them. The full config:

use-nnet2: True
decoder:
    # All the properties nested here correspond to the kaldinnet2onlinedecoder GStreamer plugin properties.
    # Use gst-inspect-1.0 ./libgstkaldionline2.so kaldinnet2onlinedecoder to discover the available properties
    use-threaded-decoder: true
    model : /opt/models/english/tedlium_nnet_ms_sp_online/final.mdl
    word-syms : /opt/models/english/tedlium_nnet_ms_sp_online/words.txt
    fst : /opt/models/english/tedlium_nnet_ms_sp_online/HCLG.fst
    mfcc-config : /opt/models/english/tedlium_nnet_ms_sp_online/conf/mfcc.conf
    ivector-extraction-config : /opt/models/english/tedlium_nnet_ms_sp_online/conf/ivector_extractor.conf
    max-active: 10000
    beam: 10.0
    lattice-beam: 6.0
    acoustic-scale: 0.083
    do-endpointing : true
    endpoint-silence-phones : "1:2:3:4:5:6:7:8:9:10"
    traceback-period-in-secs: 0.25
    chunk-length-in-secs: 0.25
    num-nbest: 10
    #Additional functionality that you can play with:
    #lm-fst: /opt/models/english/tedlium_nnet_ms_sp_online/G.fst
    #big-lm-const-arpa: /opt/models/english/tedlium_nnet_ms_sp_online/G.carpa
    #phone-syms: /opt/models/english/tedlium_nnet_ms_sp_online/phones.txt
    #word-boundary-file: /opt/models/english/tedlium_nnet_ms_sp_online/word_boundary.int
    #do-phone-alignment: true
out-dir: tmp

use-vad: False
silence-timeout: 10

# Just a sample post-processor that appends "." to the hypothesis
post-processor: perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} s/(.*)/\1./;'

# A sample full post processor that add a confidence score to 1-best hyp and deletes other n-best hyps
#full-post-processor: ./sample_full_post_processor.py

logging:
    version : 1
    disable_existing_loggers: False
    formatters:
        simpleFormater:
            format: '%(asctime)s - %(levelname)7s: %(name)10s: %(message)s'
            datefmt: '%Y-%m-%d %H:%M:%S'
    handlers:
        console:
            class: logging.StreamHandler
            formatter: simpleFormater
            level: DEBUG
    root:
        level: DEBUG
        handlers: [console]
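
Since a worker that cannot load its models exits and leaves the master with nothing to hand requests to, a quick sanity check is to verify that every file referenced in the decoder section actually exists inside the container. A sketch, assuming PyYAML is installed and the config lives at /opt/models/nnet2.yaml:

import pathlib
import yaml  # pip install pyyaml

cfg = yaml.safe_load(pathlib.Path("/opt/models/nnet2.yaml").read_text())

# Any string value under decoder: that looks like an absolute path should exist.
missing = [
    (key, value)
    for key, value in cfg.get("decoder", {}).items()
    if isinstance(value, str) and value.startswith("/") and not pathlib.Path(value).exists()
]

for key, path in missing:
    print(f"missing: {key} -> {path}")
if not missing:
    print("all decoder model files are present")

If anything is reported missing, running download-tedlium-nnet2.sh in /opt/models as noted above and then restarting start.sh should let the worker register with the master.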
