[Bug] cannot link Ollama local serve #4219
Comments
Ollama log shows 403 for NextChat requests:
[GIN] 2024/03/05 - 21:34:14 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
This is why Ollama is still not stable or fully compatible with this repository, particularly for desktop use. The owner released it without comprehensive testing across various operating systems.
Referrer Policy: maybe it's caused by this policy, but I already created my user variables OLLAMA_ORIGINS=*://localhost and OLLAMA_HOST=0.0.0.0 following the instructions below:
`Setting environment variables on Windows: First quit Ollama by clicking on it in the task bar. Edit system environment variables from the control panel. Edit or create new variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc. Click OK/Apply to save. Run ollama from a new terminal window.`
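For reference, a minimal sketch (assuming Python with the `requests` package and Ollama on its default port 11434) to confirm that the variables are visible to a freshly started process and that the server is reachable after restarting Ollama:

```python
import os
import requests

# Check that the user variables are visible to a newly started process.
print("OLLAMA_HOST    =", os.environ.get("OLLAMA_HOST"))
print("OLLAMA_ORIGINS =", os.environ.get("OLLAMA_ORIGINS"))

# Ollama normally answers a plain GET on / with "Ollama is running".
resp = requests.get("http://localhost:11434/")
print(resp.status_code, resp.text)
```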
It still doesn't work now?
Also I don't think so, because of the
If you think it's because of
Still 403 Forbidden. I also copied and pasted the POST contents and headers from ChatGPT-Next-Web into Python, and it works. The only difference I can see is "Referrer Policy: strict-origin-when-cross-origin" in the ChatGPT-Next-Web POST request. I switched to llama.cpp to run a server, and deleted Ollama.
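For context, a direct call of the kind described above looks roughly like this (a minimal sketch assuming the `requests` package, Ollama's default port, and a placeholder model name; use one that `ollama list` actually shows):

```python
import requests

# Plain POST to Ollama's OpenAI-compatible endpoint. No Origin header is
# sent, so the CORS check never applies and the request goes through.
resp = requests.post(
    "http://localhost:11434/v1/chat/completions",
    json={
        "model": "llama2",  # placeholder; pick a model from `ollama list`
        "messages": [{"role": "user", "content": "hello"}],
    },
)
print(resp.status_code)
print(resp.json()["choices"][0]["message"]["content"])
```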
@lucksufe According to your ollama logs, it seems to be NextChat's requests being blocked by the CORS policies. It looks like the env you've set hasn't taken effect in your ollama instance.
I have created the env OLLAMA_ORIGINS and set it to *://localhost.
The ollama API may have been modified (my ollama version is 0.1.28).
Referring to another webui (ollama-webui-lite), it uses the following API for communication:
@Jackxwb Please ensure your Ollama version is greater than v0.1.24 (https://docs.nextchat.dev/models/ollama) and that the endpoint you configured is
Thank you for the reminder. After the modification, I copied the requests from the Chrome browser and now they work in third-party debugging tools.
-------- 2024-03-08 16:32 (UTC+8) -------- -------- 2024-03-08 21:47 (UTC+8) --------
First, set
Ollama still can't be used. I tried other chatbox projects and they work, so it shouldn't be a configuration problem with ollama.
I ran into it too. With LobeChat and the same address settings it works fine.
Problem solved for NextChat.
None of your methods worked |
Clearing the NextWeb access password works. The model name is the one output by the
I tried monitoring with Postman and compared POST with OPTIONS. The Ollama server only supports POST responses and rejected the OPTIONS requests:
[GIN] 2024/05/10 - 10:16:25 | 200 | 8.5950196s | 127.0.0.1 | POST "/v1/chat/completion"
[GIN] 2024/05/10 - 10:16:02 | 404 | 0s | 127.0.0.1 | OPTIONS "/v1/chat/completion"
Can ChatGPTNextWeb be configured to change the default access mode to POST?
I captured the traffic and it's still a CORS cross-origin problem. Sending an OPTIONS request with curl doesn't return 403, because curl doesn't set an Origin header by default.
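To reproduce that difference outside the browser, a minimal sketch (assuming the `requests` package; `http://localhost:3000` stands in for whatever origin NextChat is actually served from):

```python
import requests

url = "http://localhost:11434/v1/chat/completions"

# Like curl: no Origin header, so Ollama never applies the CORS check.
plain = requests.options(url)
print("without Origin:", plain.status_code)

# Like the browser preflight: an Origin header is attached. If that origin
# is not covered by OLLAMA_ORIGINS, Ollama answers 403 as in the logs above.
browser_like = requests.options(url, headers={"Origin": "http://localhost:3000"})
print("with Origin:   ", browser_like.status_code,
      browser_like.headers.get("Access-Control-Allow-Origin"))
```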
Bug Description
Cannot link to the local Ollama server. Ollama and NextChat are both on the latest version. I can get an Ollama response from a Python script, so the server is OK.
Steps to Reproduce
Expected Behavior
Screenshots
No response
Deployment Method
Desktop OS
win10
Desktop Browser
No response
Desktop Browser Version
No response
Smartphone Device
No response
Smartphone OS
No response
Smartphone Browser
No response
Smartphone Browser Version
No response
Additional Logs
No response