[Bug] cannot link Ollama local serve #4219

Open

lucksufe opened this issue Mar 5, 2024 · 27 comments
Labels
bug Something isn't working

Comments

@lucksufe

lucksufe commented Mar 5, 2024

Bug Description

Cannot connect to the local Ollama server. Ollama and NextChat are both on the latest version. I can get a response from Ollama with a Python script, so the server itself is OK.

Steps to Reproduce

(screenshot attached)

Expected Behavior

(screenshot attached)

Screenshots

No response

Deployment Method

  • Docker
  • Vercel
  • Server

Desktop OS

win10

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

@lucksufe lucksufe added the bug Something isn't working label Mar 5, 2024
@lucksufe
Author

lucksufe commented Mar 5, 2024

[GIN] 2024/03/05 - 21:34:14 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:34:18 | 403 | 0s | 192.168.31.22 | OPTIONS "/v1/chat/completions"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/usage?start_date=2024-03-01&end_date=2024-03-06"
[GIN] 2024/03/05 - 21:36:58 | 403 | 0s | 192.168.31.22 | OPTIONS "/dashboard/billing/subscription"

The Ollama log shows 403 for NextChat's requests.
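Those OPTIONS requests are CORS preflights sent by the browser/WebView before the real POST. A minimal way to replay one by hand (a sketch, assuming Ollama listens on the default port 11434; the Origin value is just an example, and curl sends no Origin header on its own):

curl -i -X OPTIONS http://localhost:11434/v1/chat/completions \
  -H 'Origin: http://localhost:3000' \
  -H 'Access-Control-Request-Method: POST'

If that Origin is not covered by OLLAMA_ORIGINS, the preflight is rejected with 403 just like in the log above.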

@Alias4D

Alias4D commented Mar 5, 2024

Error connecting to the local Ollama server.

(screenshots attached)

Any fix, please?

@H0llyW00dzZ
Contributor

This is the reason why Ollama is still not stable or fully compatible with this repository, particularly for desktop use. The owner released it without comprehensive testing across various operating systems.

@lucksufe
Author

lucksufe commented Mar 6, 2024

Referrer Policy: strict-origin-when-cross-origin

Maybe it is caused by this policy, but I have already created my user variables OLLAMA_ORIGINS=*://localhost and OLLAMA_HOST=0.0.0.0 following the instructions below.

Setting environment variables on Windows
On Windows, Ollama inherits your user and system environment variables.

First quit Ollama by clicking on it in the task bar

Edit system environment variables from the Control Panel

Edit or create new variable(s) for your user account for OLLAMA_HOST, OLLAMA_MODELS, etc.

Click OK/Apply to save

Run ollama from a new terminal window
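For reference, the same variables can also be set from a terminal instead of the Control Panel. A rough PowerShell sketch with the values above (the 'User' scope persists across reboots; Ollama still has to be fully quit and restarted so the new process inherits them):

[Environment]::SetEnvironmentVariable('OLLAMA_ORIGINS', '*://localhost', 'User')
[Environment]::SetEnvironmentVariable('OLLAMA_HOST', '0.0.0.0', 'User')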

@H0llyW00dzZ
Contributor

Referrer Policy: strict-origin-when-cross-origin

Maybe it is caused by this policy, but I have already created my user variables OLLAMA_ORIGINS=*://localhost and OLLAMA_HOST=0.0.0.0 following the instructions below.

Does it still not work now?

@H0llyW00dzZ
Contributor

Also, I don't think it is because of strict-origin-when-cross-origin.

The strict-origin-when-cross-origin policy strikes a balance between security/privacy and functionality. Here's how it works:

  • Same-origin requests: When a request is made to the same origin, the full URL of the document making the request is sent in the Referer header. This means that for same-origin requests, the behavior is as if the policy were set to no-referrer-when-downgrade (which is the default if no policy is specified).
  • Cross-origin requests: For requests to a different origin, this policy sends only the origin (scheme, host, and port) in the Referer header, omitting the path and query string. This reduces the amount of potentially sensitive information shared across origins.
  • Downgrade navigation: If a website is served over HTTPS and makes a request to an HTTP resource, this policy will still send the origin in the Referer header for cross-origin requests. This is safer than the behavior of no-referrer-when-downgrade, which would send no Referer header at all in this case, as it at least allows the destination site to know which site the request came from.

If strict-origin-when-cross-origin really were the cause, that would be a serious flaw in Ollama.

@lucksufe
Author

lucksufe commented Mar 6, 2024

Does it still not work now?

Still 403 Forbidden. I also copied and pasted the POST contents and headers from ChatGPT-Next-Web into Python, and there it works. The only difference I can see is "Referrer Policy: strict-origin-when-cross-origin" in the ChatGPT-Next-Web request.

I have switched to llama.cpp to run a server and deleted Ollama.

@fred-bf
Contributor

fred-bf commented Mar 7, 2024

@lucksufe According to your Ollama logs, NextChat's requests seem to be blocked by the CORS policy. It looks like the environment variables you set haven't taken effect in your Ollama instance.
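One quick way to check whether the variables were actually stored, a PowerShell sketch (this only shows what is saved at user scope; the Ollama tray app still has to be quit and restarted to pick the values up):

[Environment]::GetEnvironmentVariable('OLLAMA_ORIGINS', 'User')
[Environment]::GetEnvironmentVariable('OLLAMA_HOST', 'User')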

@kaikanertan

I have set the env OLLAMA_ORIGINS to *://localhost,
but I still get 403. My system is Windows 10.

@Jackxwb

Jackxwb commented Mar 7, 2024

The Ollama API may have been modified (my Ollama version is 0.1.28).
I copied the request from the Chrome browser into third-party software in curl format, and Ollama returned 404:

curl 'http://localhost:11434/api/v1/chat/completions' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'DNT: 1' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -H 'Referer;' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"messages":[{"role":"user","content":"你好呀"}],"stream":true,"model":"llava:latest","temperature":0.5,"presence_penalty":0,"frequency_penalty":0,"top_p":1}'

(screenshot attached)

Referring to another web UI (ollama-webui-lite), it uses the following APIs for communication:

http://localhost:11434/api/tags
http://localhost:11434/api/version

http://localhost:11434/api/chat
http://localhost:11434/api/generate
curl 'http://localhost:11434/api/chat' \
  -H 'Accept: */*' \
  -H 'Accept-Language: zh-CN,zh;q=0.9,en;q=0.8,en-GB;q=0.7,en-US;q=0.6' \
  -H 'Connection: keep-alive' \
  -H 'Content-Type: text/event-stream' \
  -H 'DNT: 1' \
  -H 'Origin: http://localhost:3001' \
  -H 'Referer: http://localhost:3001/' \
  -H 'Sec-Fetch-Dest: empty' \
  -H 'Sec-Fetch-Mode: cors' \
  -H 'Sec-Fetch-Site: same-site' \
  -H 'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 Edg/122.0.0.0' \
  -H 'sec-ch-ua: "Chromium";v="122", "Not(A:Brand";v="24", "Microsoft Edge";v="122"' \
  -H 'sec-ch-ua-mobile: ?0' \
  -H 'sec-ch-ua-platform: "Windows"' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"你好呀"},{"role":"assistant","content":""}],"options":{}}'

@fred-bf
Contributor

fred-bf commented Mar 7, 2024

@Jackxwb Please ensure your Ollama version is greater than v0.1.24 (https://docs.nextchat.dev/models/ollama) and that the endpoint you configured is http://localhost:11434/; it seems you added an extra /api/ path.
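With the base endpoint set that way, the client calls Ollama's OpenAI-compatible route. A quick sanity check from the command line (a sketch; llava:latest stands in for whatever model ollama list shows):

curl http://localhost:11434/v1/chat/completions \
  -H 'Content-Type: application/json' \
  --data-raw '{"model":"llava:latest","messages":[{"role":"user","content":"hello"}],"stream":false}'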

@Jackxwb

Jackxwb commented Mar 7, 2024

Please ensure your Ollama version is greater than v0.1.24 and that the endpoint you configured is http://localhost:11434/; it seems you added an extra /api/ path.

Thank you for the reminder. After that change, the requests I copied from the Chrome browser now work in third-party debugging tools.
But there is still an error in the browser. I applied the same configuration to the Windows program, and it still cannot be used.
(screenshots attached)
In the Windows program, I cannot see the error log.

-------- 2024-03-08 16:32 (UTC+8) --------
I am using the Edge browser; after adding --disable-web-security to the shortcut, it can be accessed in the browser, but the exe program still reports an error.
Additionally, I found that images can be sent in the exe program, but there is no button for sending images on the web side.

-------- 2024-03-08 21:47 (UTC+8) --------
After adding OLLAMA_ORIGINS=* to the system environment and restarting the Ollama service, I can now access Ollama in both Edge and the exe on my computer.
On my Android phone, some browsers can access it, while others still cannot.
(screenshots attached)

@z2n

z2n commented Mar 14, 2024

Set OLLAMA_ORIGINS first. If it still does not work after setting it, the problem may be with the request headers.
When you use a custom endpoint, if you only set the endpoint address and do not set an API KEY, the access code is put into the Authorization header on every request (this may itself be a security issue?).
You can temporarily work around this by clearing the access code when using Ollama, or wait for this PR: ollama/ollama#2506
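A sketch of why that header matters: with an access code set and no API key, the request carries an Authorization header, so the browser's preflight also has to get that header allowed by Ollama. Replaying such a preflight by hand (the origin and header names here are only examples) shows whether your Ollama build accepts it:

curl -i -X OPTIONS http://localhost:11434/v1/chat/completions \
  -H 'Origin: http://localhost:3000' \
  -H 'Access-Control-Request-Method: POST' \
  -H 'Access-Control-Request-Headers: authorization,content-type'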


@aaa930811

Ollama still cannot be used. I tried other chat-box projects and they work, so it should not be an Ollama configuration problem.


@aaa930811

(screenshots attached)

@mintisan

I ran into this too. With LobeChat the same address settings work fine.


@Alias4D

Alias4D commented Mar 25, 2024

Problem solved for NextChat. Just set these:

  • OLLAMA_HOST to 0.0.0.0
  • OLLAMA_ORIGINS to *
  • OpenAI endpoint to 127.0.0.1:11434
  • Model name to the same name shown by ollama list
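A rough PowerShell equivalent of that setup (a sketch; $Env: variables are session-scoped, so ollama serve must be started from the same shell):

$Env:OLLAMA_HOST = '0.0.0.0'
$Env:OLLAMA_ORIGINS = '*'
ollama serve
# In NextChat, set the OpenAI endpoint to 127.0.0.1:11434 and the model name
# to one of the names printed by ollama list.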

@mcthesw

mcthesw commented Apr 4, 2024

OLLAMA_ORIGINS=* works for me.

@1101728133

None of your methods worked

@daiaji

daiaji commented May 4, 2024

You can temporarily work around this by clearing the access code when using Ollama, or wait for this PR: ollama/ollama#2506

Clearing NextWeb's access password works; the model name is the one output by the ollama list command.
It does not work unless NextWeb's access password is cleared. Is this a bug?


@playertk

playertk commented May 10, 2024

I tried monitoring with Postman and compared POST with OPTIONS. The Ollama server only responds to POST and rejected the OPTIONS requests:

[GIN] 2024/05/10-10:16:25 | 200 | 8.5950196s | 127.0.0.1 | POST "/v1/chat/completion"
[GIN] 2024/05/10-10:16:02 | 404 | 0s | 127.0.0.1 | Options "/v1/chat/completion"

Can ChatGPTNextWeb be configured to change the default access method to POST?

@coolcoolcloud

I captured the traffic, and it is still a CORS cross-origin problem. Sending an OPTIONS request with curl does not return 403, because curl does not set an Origin header by default.
Ollama returns 403 for the request because ChatGPTNextWeb (the request is probably issued by a browser, so it carries an Origin header) sends the OPTIONS request with the Origin header http://tauri.localhost, which is not included in the OLLAMA_ORIGINS environment variable (which defines the cross-origin response header Access-Control-Allow-Origin).
Setting it manually makes it work normally.
The PowerShell command I used is:
$Env:OLLAMA_ORIGINS = 'http://tauri.localhost'
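Note that $Env: only sets the variable for that PowerShell session and processes started from it. To keep it across restarts (a sketch; run it once, then restart Ollama), the variable can also be stored persistently with setx:

setx OLLAMA_ORIGINS "http://tauri.localhost"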

Labels
bug Something isn't working