Support for Ollama #616
Could you please capture the request through F12 → Network in the Ollama WebUI, and provide the relevant request headers and body?
It would be nice to use Ollama! Maybe it works through Ollama's OpenAI-compatible API? Any example settings?
After consulting Ollama's documentation, I found that Ollama provides an OpenAI-compatible API. As for the 403 error, it is actually caused by cross-origin requests. According to https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama, you can set the `OLLAMA_ORIGINS` environment variable.
Thanks for the tip. Filling in the model name, e.g. "gemma:2b", is a requirement for Ollama to work, along with:
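As a sketch of what the OpenAI-compatible request looks like, the following assumes Ollama's default port (11434) and a locally pulled model named "gemma:2b" — adjust both to your setup:

```shell
# Chat completion via Ollama's OpenAI-compatible endpoint.
# The "model" field is required and must match a pulled model.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma:2b",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

If this works from the terminal but the extension still gets a 403, the remaining difference is the browser's `Origin` header, which is what `OLLAMA_ORIGINS` controls.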
Something to note. |
For reference, the issue was discussed on the Ollama issue tracker. For better security, it was suggested there to set the variable to the following:
Note that on Windows you need to log out and back in, or restart the computer, for the environment variable change to take effect. If you want to try it right away, you can instead kill Ollama, then open a command-line terminal and run:
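A minimal sketch of the Windows commands, using the permissive wildcard value as an assumption (the comment above suggests a narrower origin list for better security):

```shell
:: Temporary: applies only to this terminal session.
:: "*" allows all origins -- replace with a specific origin to lock it down.
set OLLAMA_ORIGINS=*
ollama serve

:: Persistent: stored in the user environment, takes effect in new sessions.
setx OLLAMA_ORIGINS "*"
```

`set` avoids the logout/restart requirement because the variable is inherited by the `ollama serve` process started from the same terminal.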
Is your feature request related to a problem? Please describe.
When I try to point the extension at my local Mistral model served by Ollama, it fails (HTTP 403).
The settings:
I haven't dug into the code yet, but maybe it's because it uses a different API.
I can see that Ollama WebUI works as expected, and that Ollama is running.
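As a quick sanity check (assuming the default port 11434), Ollama's root endpoint answers with a plain-text status:

```shell
# Should respond with "Ollama is running" if the server is up.
curl http://localhost:11434
```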
Describe the solution you'd like
ChatGPTBox to work with Ollama.
Additional context
Yes, see above :).