
Make it easy to get separate "prints" for individual runs/users when using Transformers Agent #23354

Open
MarcSkovMadsen opened this issue May 14, 2023 · 3 comments
Labels: Feature request (Request for a new feature)

Comments


MarcSkovMadsen commented May 14, 2023

Feature request

I have started exploring the new Transformers Agent, and I would like to build a UI to help me speed up the process.

I might be running multiple runs in parallel or have multiple users using my application. I would like to stream the information from each run as it arrives, and to store it in a database containing all the runs I've done.

Currently, all the valuable information about the run is printed, i.e. you are using `print` to inform me, like below:

==Explanation from the agent==
I will use the following  tool: `image_generator` to generate an image.


==Code generated by the agent==
image = image_generator(prompt="rivers and lakes")


==Result==
<PIL.PngImagePlugin.PngImageFile image mode=RGB size=512x512 at 0x7F8DDC11C4C0>

This is done, for example, in agents.py:

[screenshot of the `print` calls in agents.py]

Using `print` makes it hard for me to distinguish between multiple runs/users, especially if they run in parallel.

Please provide a simple-to-use way to stream each run individually. It could be as simple as adding a `print` (or `write`) argument to the `Agent.run`, `HFAgent.run` and `OpenAI.run` methods.

Alternatively, some `run_id` argument could be provided and printed as well; then I could split the incoming stream by `run_id`. This is less preferred, though, as it also adds some complexity.
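
For illustration, here is a rough sketch of what such an API could look like. The `print` and `run_id` keyword arguments below are hypothetical and do not exist in transformers today:

```python
from transformers import HfAgent

# Hypothetical sketch only: neither `print` nor `run_id` is a real argument of
# Agent.run today; they illustrate the two options described above.
agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

run_logs: list[str] = []

# Option 1: a per-run callable receives everything the agent would otherwise
# send to the built-in print, so each run/user gets its own stream.
agent.run("Draw me a picture of rivers and lakes", print=run_logs.append)

# Option 2: a run_id is prefixed to every printed line, so one shared stream
# can be split per run afterwards.
agent.run("Draw me a picture of rivers and lakes", run_id="user-42-run-7")
```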

Motivation

This will make it much, much easier to create interesting AI apps.

Your contribution

I might do it 😄, but I hope someone with knowledge of the code base will do it.

Additional Context

An async `.run_async` function would also be much appreciated, as my UI is built on top of Tornado. This would help me keep the app responsive.
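
Until something like that exists, a workaround sketch (assuming a blocking `agent.run` and Python 3.9+) is to push the call onto a worker thread from the event loop that Tornado shares with asyncio:

```python
import asyncio

from transformers import HfAgent

agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

async def run_agent(task: str):
    # agent.run blocks while the model works; running it in a worker thread
    # keeps the asyncio/Tornado event loop (and therefore the UI) responsive.
    return await asyncio.to_thread(agent.run, task)
```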

MarcSkovMadsen (Author) commented

The solution for me is probably to inspect the run function and then compose the pieces in a way that works better for my app.

[screenshot of the inspected `run` function]
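
For example, one way to get a separate transcript per run today without changing transformers (a sketch, assuming the agent only writes to stdout) is to redirect stdout around each call:

```python
import contextlib
import io

from transformers import HfAgent

agent = HfAgent("https://api-inference.huggingface.co/models/bigcode/starcoder")

def run_with_capture(task: str):
    # Capture everything the agent prints during this single run so it can be
    # streamed to one user or stored in a database afterwards.
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        result = agent.run(task)
    return result, buffer.getvalue()
```

Note that `redirect_stdout` is process-wide, so this only separates runs cleanly when they are executed sequentially, not in parallel.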

amyeroberts added the Feature request label on May 15, 2023
amyeroberts (Collaborator) commented

cc @sgugger @LysandreJik

sgugger (Collaborator) commented May 18, 2023

Would the PR mentioned above fix your problem?
