Use this with local model #129
Comments
Not without making some changes; we're using structured output: https://python.langchain.com/docs/modules/model_io/chat/structured_output. Until that works with local models, your best bet is a parsing approach, so you'd need to rewrite some of the code in the service to parse the model's raw output instead.
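A minimal sketch of the parsing approach mentioned above: instead of relying on the provider's structured-output support, ask the model to emit JSON and parse and validate the raw text yourself. The `extract_json` helper below is illustrative, not part of this repo, and local models often wrap JSON in prose or markdown fences, which is why it searches for the braces rather than parsing the whole reply.

```python
import json
import re


def extract_json(raw: str) -> dict:
    """Pull the first JSON object out of a model's raw text reply.

    Local models frequently wrap JSON in explanations or markdown
    fences, so we search for the outermost braces instead of calling
    json.loads on the whole string.
    """
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))


# Example: a local model's reply with chatter and fencing around the JSON.
reply = "Sure! Here is the result:\n```json\n{\"city\": \"Paris\", \"score\": 7}\n```"
parsed = extract_json(reply)
```

This keeps the service independent of any one provider's structured-output API, at the cost of having to handle malformed replies yourself.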
Thanks. I asked a question about this function; I could probably copy it from the partner folder.
I could also create a PR if this is something you want.
@amztc34283 Were you able to set it up with a local model? I want to test the Mistral model through Ollama; any ideas on this implementation?
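For anyone trying Mistral through Ollama, a hedged sketch of what a request to Ollama's local HTTP API looks like. The endpoint and fields follow Ollama's documented `/api/chat` format; `build_chat_payload` is an illustrative helper, not part of this project.

```python
import json

# Ollama's default local endpoint when `ollama serve` is running.
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_chat_payload(model: str, user_message: str) -> dict:
    """Assemble a non-streaming chat request for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,  # ask for one complete reply instead of chunks
    }


payload = build_chat_payload("mistral", "Summarize this issue in one sentence.")
body = json.dumps(payload)

# To actually send it (requires Ollama running locally with `ollama pull mistral`):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body.encode(), headers={"Content-Type": "application/json"}
# )
# reply = json.loads(urllib.request.urlopen(req).read())["message"]["content"]
```

Note this raw API does not give you structured output by itself, so you would still pair it with a parsing approach as discussed above.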
How can I use this with a local model?