How to use a stream to return the message as it is generated, instead of waiting until everything has been processed #115
hey @pslxx, there are dedicated methods in the ChatInterface for this, like `generateStreamOfText`.
@MaximeThoonsen the `generateStreamOfText` function uses a Guzzle request (not an async request), so the method waits for the whole response from the Ollama API. Is there any way to read the streamed response from Ollama directly?
+1. I'm looking to iterate through each stream chunk, but the stream methods return a `StreamInterface` that doesn't allow this (#78 (comment)).
If anyone finds this helpful:
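A minimal sketch of the chunked-read approach, assuming LLPhant's `OpenAIChat` and its `generateStreamOfText()` method, which returns a PSR-7 `StreamInterface`; the config and prompt here are illustrative:

```php
<?php

use LLPhant\Chat\OpenAIChat;
use LLPhant\OpenAIConfig;

$config = new OpenAIConfig();
$config->apiKey = getenv('OPENAI_API_KEY');
$chat = new OpenAIChat($config);

// generateStreamOfText() returns a PSR-7 StreamInterface.
$stream = $chat->generateStreamOfText('Tell me a short story.');

// Read fixed-size chunks as they arrive instead of waiting
// for the full completion.
while (!$stream->eof()) {
    echo $stream->read(32);
}
```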
hello @ezimuel, how are you? It seems there are a lot of questions around streaming. Can we still do streaming with `StreamInterface` and LLPhant? What is the "clean/simple" working example? @iztok is the code you provided working for you to get a stream?
Yes, this returns an iterable stream that I can use the same way I used the stream from the OpenAI library. One caveat is that this stream's chunks are not tokens but 32-byte strings. I'm then broadcasting these chunks over the WebSocket to my chat clients.
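A hedged sketch of what that broadcasting loop might look like in Laravel; the `ChatChunkReceived` event class, the channel behind it, and the surrounding variables are hypothetical, not from this thread:

```php
<?php

// Hypothetical Laravel broadcast event; the class name is illustrative only.
use App\Events\ChatChunkReceived;

// $chat and $prompt are assumed to be set up as in the earlier snippet.
$stream = $chat->generateStreamOfText($prompt);

while (!$stream->eof()) {
    // Chunks arrive as 32-byte strings, not model tokens.
    $chunk = $stream->read(32);

    // Push each chunk to the subscribed chat clients over the WebSocket.
    broadcast(new ChatChunkReceived($conversationId, $chunk));
}
```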
@iztok I see how that is a caveat. Does that make any difference in your use case, or does it seriously impact the end-user experience? I am trying to understand the pitfalls I might run into while trying to implement something similar.
With Ollama, `$stream->read(32)` waits until the whole response has completed before returning the "chunk".
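One possible workaround, sketched under assumptions: bypass LLPhant and call Ollama's `/api/generate` endpoint directly with Guzzle's `stream` request option, which exposes the body as it arrives instead of buffering it. The model name, prompt, and chunk size are illustrative:

```php
<?php

use GuzzleHttp\Client;

$client = new Client(['base_uri' => 'http://localhost:11434']);

// 'stream' => true tells Guzzle not to buffer the whole response body.
$response = $client->post('/api/generate', [
    'stream' => true,
    'json' => [
        'model'  => 'llama2',
        'prompt' => 'Tell me a short story.',
        'stream' => true, // Ollama streams newline-delimited JSON objects
    ],
]);

$body = $response->getBody();
$buffer = '';

while (!$body->eof()) {
    $buffer .= $body->read(64);

    // Ollama emits one JSON object per line; split on newlines as they arrive.
    while (($pos = strpos($buffer, "\n")) !== false) {
        $line = substr($buffer, 0, $pos);
        $buffer = substr($buffer, $pos + 1);

        $data = json_decode($line, true);
        if (isset($data['response'])) {
            echo $data['response'];
        }
    }
}
```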
Thanks for the hints. However, I must be doing something wrong: this does not work for me, and I get the whole answer at once. Using OpenAI, partial reads work perfectly, though. I'm not sure what else I can test to make `answerQuestionStream` work.
Solved. I needed `ob_flush()` and `flush()` after each chunk to force Laravel to send the output.
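For anyone hitting the same buffering, a minimal sketch of that fix, assuming a Laravel route closure wrapping Symfony's `StreamedResponse`; the route, config, and prompt are illustrative:

```php
<?php

use Illuminate\Support\Facades\Route;
use LLPhant\Chat\OpenAIChat;
use LLPhant\OpenAIConfig;
use Symfony\Component\HttpFoundation\StreamedResponse;

Route::get('/chat/stream', function () {
    return new StreamedResponse(function () {
        $config = new OpenAIConfig();
        $config->apiKey = getenv('OPENAI_API_KEY');
        $chat = new OpenAIChat($config);

        $stream = $chat->generateStreamOfText('Tell me a short story.');

        while (!$stream->eof()) {
            echo $stream->read(32);

            // ob_flush() + flush() force PHP/Laravel to send each chunk
            // to the client immediately instead of buffering the response.
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }
    }, 200, [
        'Content-Type'      => 'text/plain',
        'X-Accel-Buffering' => 'no', // also disable proxy buffering where relevant
    ]);
});
```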