How to support function calls for non-GPT models? #331
Comments
My problem was solved. I made code modifications in litellm and adapted different LLMs to support GPT-style function calls.
@peterz3g how'd you get functions to work with other LLMs? What did you change in litellm?
Hello @peterz3g, hope you are well. Could you kindly share the trick that got litellm working with other models to support function calling?
I want to know how to solve this problem too, @peterz3g!
Paging... Mr. @peterz3g :-) - please share your wisdom with us.
I tried using a pure prompt to support claude-2 function calls; you can find the details in my repo.
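For readers wondering what a "pure prompt" approach looks like: the general idea is to describe the OpenAI-style function schema inside the prompt itself and ask the model to reply with structured JSON. A minimal sketch below; the prompt wording and helper name are illustrative, not taken from the repo mentioned above:

```python
import json

def build_function_prompt(functions):
    """Embed OpenAI-style function schemas into a plain-text prompt so a
    model without native function calling (e.g. claude-2) can still emit
    a structured call. Illustrative helper, not part of any library."""
    return (
        "You may call one of the following functions. Reply ONLY with "
        'JSON of the form {"name": "<function name>", "arguments": {...}}.\n\n'
        + json.dumps(functions, indent=2)
    )

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]

system_prompt = build_function_prompt(functions)
```

The resulting `system_prompt` is prepended to the conversation; the model's JSON reply then has to be parsed back out on the client side.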
For example, claude-2 does not support functions, but I want to use Claude to solve tasks via function calls.
I have tried some methods without much success. Is there any other approach?
===========【my method steps】===========
###1. Use litellm to set up a proxy so claude-2 acts as GPT (supported by litellm).
###2. Use litellm to adapt the function config to act as GPT (already supported by litellm via the parameter add_function_to_prompt).
###3. Use autogen to execute the function call task.
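A rough sketch of how the steps above fit together. The proxy port, model alias, and flag spelling are assumptions based on litellm's CLI at the time; check `litellm --help` for your version:

```python
# Step 1 (shell, assumed litellm CLI invocation):
#   litellm --model claude-2 --add_function_to_prompt
# This starts an OpenAI-compatible proxy and (step 2) inlines the
# function schema into the prompt, since claude-2 has no native
# `functions` parameter.

# Step 3: point autogen at that proxy. The port and api_key values
# below are placeholders for a local proxy, not real credentials.
llm_config = {
    "config_list": [{
        "model": "claude-2",
        "base_url": "http://localhost:8000",  # litellm proxy from step 1
        "api_key": "not-needed-for-local-proxy",
    }],
    "functions": [{
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    }],
}
```

This `llm_config` would then be passed to autogen's `AssistantAgent` in the usual way.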
================ 【exec result by claude-2 】===================
user_proxy (to assistant):
What is the weather like in Boston now
assistant (to user_proxy):
To get the current weather in Boston, I would call the provided function get_current_weather like this:
This would return the current weather in Boston in degrees Fahrenheit. Since I don't have access to the actual function implementation, I can't provide the exact weather conditions. But calling the function in this way would retrieve the current weather in Boston using the provided API.
Provide feedback to assistant. Press enter to skip and use auto-reply, or type 'exit' to end the conversation:
================ 【exec result by gpt just for compare】 ===================
user_proxy (to assistant):
What is the weather like in Boston now
assistant (to user_proxy):
*** Suggested function Call: get_current_weather *****
Arguments:
{
"location": "Boston, MA"
}
===========================【vs result】==============================
From the results we can see that the GPT model directly emits
*** Suggested function Call: get_current_weather *****
but claude-2 just writes Python code, which autogen cannot call directly.
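Since claude-2 replies with prose or code instead of a structured `function_call`, one workaround is to post-process its reply: extract a JSON call from the text and convert it into the OpenAI-style `function_call` dict that autogen expects. A hedged sketch; the regex and message shape are illustrative, not from litellm:

```python
import json
import re

def extract_function_call(text):
    """Pull a {"name": ..., "arguments": {...}} JSON object out of a
    free-form model reply and return an OpenAI-style function_call
    dict, or None if nothing parseable is found."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    if "name" not in call:
        return None
    # OpenAI's function_call carries arguments as a JSON *string*.
    return {
        "name": call["name"],
        "arguments": json.dumps(call.get("arguments", {})),
    }

reply = ('I would call the function like this: '
         '{"name": "get_current_weather", '
         '"arguments": {"location": "Boston, MA"}}')
call = extract_function_call(reply)
```

With such a parser in a reply hook, the text-only answer from claude-2 can be rewritten into the same shape GPT produces, so autogen's function-execution path fires.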