Chat Thread API [New Feature] #608
Comments
@scarrick68 It doesn't support Ollama just yet, but have you looked at the Langchain::Assistant?
@andreibondarev Yeah, this looks like what I was talking about. It seems like it would support Ollama except maybe not tools (idk). Is there another reason it only supports OpenAI at this point? If it can support Ollama, I'd be happy to make the PR for it.
@scarrick68 It doesn't support Ollama because Ollama doesn't officially support tools. There are hacky ways around it but I thought we'd wait till Ollama builds that capability themselves first.
@andreibondarev imo the conversation thread support is valuable enough. I would be happy to implement an assistant for Ollama without tools to get the thread support if you'll accept the PR. Tools can be added once they are supported.
@scarrick68 There's a way to make the Tool Calling work by putting instructions and tool declarations in the prompt itself, in the "XML-like" format. Have you looked into it?
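The idea above can be sketched roughly like this: tool declarations go into the prompt as XML-like markup, the model is instructed to answer with a `<tool_call>` element, and the caller parses that element out of the raw completion. The tag names, the `get_weather` tool, and the parser are all illustrative assumptions, not a fixed spec or the Langchain.rb API.

```ruby
# Hypothetical system prompt that declares tools inline, since the model
# has no native tool-calling support. Tag names are made up for the sketch.
TOOL_PROMPT = <<~PROMPT
  You have access to these tools:
  <tools>
    <tool name="get_weather">
      <param name="city" type="string"/>
    </tool>
  </tools>
  To use a tool, reply only with:
  <tool_call name="TOOL_NAME"><param name="PARAM">VALUE</param></tool_call>
PROMPT

# Naive parser for the XML-like tool-call format declared above.
# Returns { name:, params: } or nil when the reply contains no tool call.
def parse_tool_call(text)
  name = text[/<tool_call name="([^"]+)"/, 1]
  return nil unless name

  params = text.scan(%r{<param name="([^"]+)">([^<]*)</param>}).to_h
  { name: name, params: params }
end
```

A real implementation would also need retry/repair logic for malformed model output, which is part of why this approach is considered hacky.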
@scarrick68 Check this out: https://x.com/ollama/status/1793392887612260370?s=46&t=ZgKczGvaONuo_4dgGcUd_Q We should definitely implement it! |
At the moment I am using the chat API with Ollama, which requires me to maintain an array of inputs and responses in my application code and pass the messages array on each request. I think it would be helpful to implement a wrapper that maintains the messages array for you. That way you can focus on the next input to the conversation, which is a natural way to chat.
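A minimal sketch of the wrapper being requested: it keeps the running messages array internally so the caller only supplies the next input. Here `client` is assumed to be any object responding to `chat(messages:)` and returning a hash like `{ "message" => { "role" => "assistant", "content" => "..." } }` (the shape of Ollama's chat response); the class and method names are hypothetical, not the eventual Langchain.rb API.

```ruby
# Hypothetical conversation-thread wrapper: accumulates the message
# history so each call to #say only needs the new user input.
class ChatThread
  attr_reader :messages

  def initialize(client:, system: nil)
    @client = client
    @messages = []
    @messages << { "role" => "system", "content" => system } if system
  end

  # Append the user's input, send the full history to the model,
  # record the assistant's reply, and return its content.
  def say(content)
    @messages << { "role" => "user", "content" => content }
    reply = @client.chat(messages: @messages)["message"]
    @messages << reply
    reply["content"]
  end
end
```

With something like this, application code goes from manually threading a messages array through every request to `thread.say("next question")`, which matches the conversational flow described above.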