Replies: 3 comments
-
>>> lissyx
-
>>> p.holetzky
-
>>> LearnedVector
-
>>> p.holetzky
[January 14, 2018, 1:13pm]
Hello there,
in the Wiki it says I can use the model with TensorFlow Serving.
> you can also use the model exported by export directly with TensorFlow
> Serving.
Is this information correct? In the GitHub issues I found comments saying that serving is no longer supported.
I created a simple websocket/bottlepy-based server, modelled on the
deepspeech-server project, for real-time STT. While it works nicely
with a single client, I am wondering how to allow inference for
multiple users at the same time using one model.
If I understand correctly, TensorFlow Serving would be the answer?
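For reference, the simplest single-process approach is to share one model object and serialize inference behind a lock, since the native client is not documented as thread-safe. A minimal sketch, assuming the deepspeech Python package (the single-argument `Model` constructor matches recent releases, older releases took more parameters; the model path and the `transcribe` helper are placeholders):

```python
import threading

import numpy as np
from deepspeech import Model

# One shared model instance for the whole process. Inference is
# serialized with a lock so concurrent websocket handlers never
# call into the native client at the same time.
MODEL_PATH = "output_graph.pbmm"  # placeholder path
model = Model(MODEL_PATH)
model_lock = threading.Lock()

def transcribe(pcm_bytes: bytes) -> str:
    """Transcribe one utterance of 16 kHz, 16-bit mono PCM audio."""
    audio = np.frombuffer(pcm_bytes, dtype=np.int16)
    with model_lock:  # one inference at a time
        return model.stt(audio)
```

This keeps a single client's latency unchanged but makes concurrent users queue; for real parallelism you would need multiple model instances or an external serving layer.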
> Servables are the central abstraction in TensorFlow Serving. Servables
> are the underlying objects that clients use to perform computation
> (for example, a lookup or inference).
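If the exported graph does load into TensorFlow Serving, clients talk to it over its gRPC or (in newer releases) REST predict API instead of linking the model in-process, and the server handles batching and concurrent clients itself. A hedged sketch of the REST side, assuming the model was exported under the placeholder name `deepspeech`; the layout of `features` depends entirely on the input signature the export step wrote into the graph:

```python
import requests

# TensorFlow Serving's REST predict endpoint (default port 8501).
# "deepspeech" is a placeholder model name; the request payload must
# match the exported input signature, which this sketch cannot know.
URL = "http://localhost:8501/v1/models/deepspeech:predict"

def predict(features):
    """Send one inference request to TensorFlow Serving."""
    response = requests.post(URL, json={"instances": [features]})
    response.raise_for_status()
    return response.json()["predictions"]
```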
Thanks for the help :slight_smile:
[This is an archived discussion thread from discourse.mozilla.org/t/serving-a-model-trained-with-deepspeech]