Implementing Streaming Output with Autogen and Chainlit Integration #589
Does autogen support streaming in the first place? If so, can you link the docs?

Ref here: microsoft/autogen#217 (comment)

Interesting! I see a code snippet to enable streaming, but I don't see an example of how to consume the stream.

Based on my understanding, Autogen currently supports streaming the response from the LLM, but not the completed answer (the message one agent sends to another).
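For context, the flag discussed in microsoft/autogen#217 is set in the `llm_config` dict. A minimal sketch (model name and key are placeholders, not from this thread):

```python
# Hypothetical sketch: enabling token streaming in an AutoGen llm_config.
# The "stream" flag asks the client to stream chunks from the LLM; it does
# not stream the finished agent-to-agent message.
llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": "sk-..."}],  # placeholders
    "stream": True,  # stream tokens from the LLM as they arrive
}
```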
I recently experimented with monkey patching to implement streaming functionality. This approach allows new agents to be added dynamically and flexibly without altering the original autogen source code. Note: the monkey-patching code can be isolated into its own separate library. Here's an example of the working code I developed. If this approach seems viable, I'm willing to open a pull request.
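The commenter's actual code was not captured here, but the general technique is straightforward. A generic monkey-patching sketch (hypothetical class and method names, not the commenter's code or autogen's API):

```python
# Monkey patching: replace a method on an existing class at runtime so
# extra behavior (e.g. forwarding tokens to a UI) runs without editing
# the original source. All names below are illustrative stand-ins.
class Agent:
    def send(self, message: str) -> str:
        return f"sent: {message}"

# Keep a reference to the original method so the patch can delegate to it.
_original_send = Agent.send

def patched_send(self, message: str) -> str:
    # Extra hook injected before delegating to the original method;
    # a real patch might stream `message` to a frontend here.
    print(f"streaming hook saw: {message}")
    return _original_send(self, message)

# Install the patch: every Agent instance now routes through patched_send.
Agent.send = patched_send
```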
Coming back to this: since release 0.2.21 supports custom `IOStream` implementations, you can do:

```python
import chainlit as cl
from typing import Any

from autogen.io.base import IOStream


class ChainlitIOStream(IOStream):
    def __init__(self, author):
        self.author = author
        self.message = cl.Message(author=author)

    def print(self, *args: Any, **kwargs) -> None:
        # stream_token expects a string, so join the positional args
        return self.message.stream_token(" ".join(str(a) for a in args))

    def input(self, prompt: str = "", *, password: bool = False) -> str:
        return cl.AskUserMessage(content=prompt).send()


default_io_stream = ChainlitIOStream(author="MyBot")


def set_custom_IO_overrides():
    IOStream.set_global_default(default_io_stream)
```

Then, when the client streams a completion, autogen routes each chunk through the registered `IOStream`:

```python
# If content is present, print it to the terminal and update response variables
if content is not None:
    iostream.print(content, end="", flush=True)
    response_contents[choice.index] += content
    completion_tokens += 1
else:
    # iostream.print()
    pass
```

Ps. @willydouhard yes, Chainlit supports streaming.
I have a question regarding an example at https://github.com/Chainlit/cookbook/blob/main/pyautogen/app.py (or the async version: https://github.com/Chainlit/cookbook/blob/main/pyautogen/async_app.py) that combines autogen and chainlit. I've been trying to modify the code to enable streaming output, but so far nothing has appeared on the screen. My code is as follows:

Is there a way to implement streaming with autogen in a Chainlit environment?