[Feature Request] Function calling #507
Function calling is for developers, not end users. Currently the only promising form of extending model functionality is GPTs. Technically, it is possible to implement a command-line version of GPTs through function calling, but at this stage there are neither users nor suppliers, so there is no chance. |
I am not sure what you mean by "only for developers". As a user, I can certainly use both features I mentioned, because the model directly decides which function to call during a standard conversation. |
It seems to me that the best use of function calling is to make the call via cli and pipe in the results to aichat. One use case that I hope evolves is the ability to use aichat as a remote control over an ollama-type system with enhanced tools capabilities. Is this in line with your comment @gilcu3 ? |
@cboecking That is one possibility, although the usage I was suggesting is much simpler: while interacting with the bot, it might realize it needs to search something online or solve some math formula, and in such cases it automatically uses function calling to produce its reply, with the tool invocation happening in the background. |
Tools are something you pass to the base model in a structured form (e.g. for OpenAI: https://platform.openai.com/docs/api-reference/chat/create#chat-create-tools), and the API decides whether to return a chat message or request a tool invocation. It would make the most sense to allow the user to define tools and their interfaces in config.yaml, to specify whether the current model supports tool usage (which most OpenAI-compatible API wrappers support, e.g. Huggingface TGI: https://huggingface.co/docs/text-generation-inference/en/basic_tutorials/using_guidance#chat-completion-with-tools), and, if the completion requests a tool call, to invoke the tool after prompting the user. Example config.yaml, in pseudocode:
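Something along these lines, perhaps (the keys below are purely illustrative pseudocode, not aichat's actual config schema):

```yaml
# Hypothetical sketch only -- not aichat's real config format.
model: gpt-4o
functions:
  - name: web_search            # identifier the model sees
    description: Search the web and return the top results
    parameters:
      type: object
      properties:
        query: { type: string }
      required: [query]
    command: ddgr --json        # local command to run for the call
    confirm: true               # prompt the user before invoking
```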
in a chat:
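An interaction might then look something like this (hypothetical transcript; the tool name and output are only illustrative):

```
> factor 2305843009213693951
[model requests tool call: factor {"n": "2305843009213693951"}]
Run tool 'factor'? [y/N] y
2305843009213693951 is prime: it is the Mersenne prime 2^61 - 1.
```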
And regarding users vs. developers: in many senses, users of aichat are sort of developers too! |
#514 has implemented this feature; you can try it. Feedback and suggestions are welcome. |
The feature is working perfectly, thank you! Will you accept PRs adding more example tools to llm-functions? I tested it specifically with Wolfram Alpha, factoring a big number... |
I welcome you to submit a PR @gilcu3 |
There are two function types:
Now the question is: how do we distinguish them?
What are your suggestions? |
I would go for either of the first two options, leaning towards the second. For the moment, I cannot imagine any other type of function. For retrieve functions, it would then make sense to keep their inputs and outputs as part of the session conversation. |
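The "retrieve" flow described above, where a tool's inputs and outputs stay in the session history so the model can use them in its reply, can be sketched with a minimal, library-free loop. The message shapes loosely mimic the OpenAI tools API; `factor`, `handle_tool_call`, and the session layout are illustrative, not part of aichat:

```python
import json

# Local "retrieve"-style tools the model may call; their results are
# fed back into the conversation rather than shown only to the user.
TOOLS = {
    # Return the sorted prime factors of n (naive trial division).
    "factor": lambda n: sorted(
        p for p in range(2, int(n) + 1)
        if int(n) % p == 0
        and all(p % q for q in range(2, int(p ** 0.5) + 1))
    ),
}

def handle_tool_call(messages, call):
    """Run the requested tool and append both the call and its output
    to the session, keeping the pair as part of the conversation."""
    args = json.loads(call["arguments"])
    result = TOOLS[call["name"]](**args)
    messages.append({"role": "assistant", "tool_call": call})
    messages.append({"role": "tool", "name": call["name"],
                     "content": json.dumps(result)})
    return messages

# A model deciding to factor a number mid-conversation:
session = [{"role": "user", "content": "Factor 91 for me."}]
call = {"name": "factor", "arguments": '{"n": 91}'}
handle_tool_call(session, call)
print(session[-1]["content"])  # prints: [7, 13]
```

With this shape, the next completion request simply sends the whole `session` back to the model, so the tool result is available as context for its final answer.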
AIChat now fully supports the function calling feature.
|
Do you plan to support the function calling feature? Would you be open to accepting PRs in that regard?
Functions such as DuckDuckGo search or Wolfram Alpha, for example, could greatly extend the model's capabilities.