
How to use the code interpreter to save GPT-generated documents such as CSV files to local disk #879

Closed
scortLi opened this issue Dec 5, 2023 · 6 comments

Comments

@scortLi

scortLi commented Dec 5, 2023

I have a requirement to generate a table from some known data and save it to my local disk, but I cannot save it locally: it always ends up in a directory like /mnt/data/XXX.csv, which I assume belongs to the virtual environment. One idea is to execute the code generated by the code interpreter locally, but I don't know how to run it or save the file that way. Is there a simpler solution?

@qingyun-wu
Contributor

A simple solution I can see is to not use Docker (set use_docker to False in the code_execution_config argument). In that case, the program saves generated files to a directory on your local machine. You can either use the default work_dir or specify a particular local directory (via the work_dir field of the code_execution_config argument) to save the generated files to the local disk.

Check code_execution_config in this SDK doc: https://microsoft.github.io/autogen/docs/reference/agentchat/conversable_agent

"work_dir (Optional, str): The working directory for the code execution. If None, a default working directory will be used. The default working directory is the "extensions" directory under "path_to_autogen"."

@scortLi
Author

scortLi commented Dec 6, 2023

If I call the code interpreter in OpenAI's assistant, how do I ensure the Python code runs locally and that files such as CSVs are saved to my local disk?

@gagb
Collaborator

gagb commented Dec 8, 2023

@lihui123456 can you share a snippet of your code? Which agent class are you using?

@scortLi
Author

scortLi commented Dec 8, 2023

@gagb here is my code:
llm_config = {
    "config_list": config_list,
    "assistant_id": assistant_id,
    "tools": [
        {
            "type": "code_interpreter"
        }
    ],
}
gpt_assistant = GPTAssistantAgent(
    name="assistant",
    instructions="You are a code interpreter assistant; if Python code is needed, please solve the problem with code.",
    llm_config=llm_config,
)

user_proxy = UserProxyAgent(
    name="user_proxy",
    is_termination_msg=lambda msg: "TERMINATE" in msg["content"],
    code_execution_config={
        "work_dir": "coding",
        "use_docker": False,  # set to True or an image name like "python:3" to use Docker
    },
    human_input_mode="NEVER",
)
user_proxy.initiate_chat(
    gpt_assistant,
    message="Randomly generate a CSV file and save it locally.",
)

If I execute the above code, the generated CSV file ends up at /mnt/data/XXX.csv, as if it were in a virtual environment. How can I use the file generated by GPT's assistant locally, or download the file generated in that virtual environment? Thank you.

@gagb
Collaborator

gagb commented Dec 8, 2023

That's what I thought. This is about GPTAssistantAgent -- this was on the roadmap (#602) but never addressed. I just created an issue for it: #916
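
Until #916 is addressed, one possible workaround (outside of AutoGen, assuming the OpenAI Python SDK v1 Files API) is to download the sandbox-generated file by its file id and write it to local disk yourself:

from openai import OpenAI

client = OpenAI()

def save_generated_file(file_id: str, local_path: str) -> None:
    # The file id comes from the assistant message's file annotations; the content
    # is fetched from OpenAI's file storage and written to the local disk.
    content = client.files.content(file_id)
    content.write_to_file(local_path)

# Example usage (the file id below is illustrative):
# save_generated_file("file-abc123", "generated.csv")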

@gagb
Collaborator

gagb commented Jan 29, 2024

Closing in favor of #916

gagb closed this as completed Jan 29, 2024