
Cannot connect to Ollama #192

Open
dandandujie opened this issue Aug 15, 2024 · 3 comments

Comments

@dandandujie

I have already started the Ollama service, and on my Windows machine I set the environment variable OLLAMA_HOST=0.0.0.0:11434. But as soon as I run python -m openui and open the web page on that port, the following errors appear:
WARNING (openui): Couldn't connect to Ollama at https://api.groq.com/openai/v1
WARNING (openui): Couldn't connect to Ollama at 0.0.0.0:11434
In config.py the OLLAMA_HOST setting is still OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434"),
so why does the error show 0.0.0.0:11434 rather than 127.0.0.1:11434?
For various reasons my machine cannot run Docker or similar virtualization tools. Could someone help me sort this out?
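
A quick way to tell whether the Ollama server itself is reachable, independent of OpenUI, is a minimal check like the sketch below (it assumes the default port 11434; the root endpoint of a running Ollama server answers "Ollama is running"). Note that 0.0.0.0 is a listen-on-all-interfaces bind address, so connecting to it as a client can fail on Windows even while 127.0.0.1 works.

# ollama_check.py -- minimal reachability sketch, assuming the default Ollama port
import urllib.request

for host in ("http://127.0.0.1:11434", "http://0.0.0.0:11434"):
    try:
        with urllib.request.urlopen(host, timeout=5) as resp:
            print(host, "->", resp.read().decode())   # expect "Ollama is running"
    except OSError as exc:
        print(host, "-> unreachable:", exc)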

@dandandujie
Author

I also tried setting $env:OLLAMA_HOST="http://127.0.0.1:11434" in PowerShell, as well as running pip uninstall openui followed by pip install ., but neither solved the problem.
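
To confirm which value the openui process actually sees, a quick check is the sketch below (keep in mind that $env:OLLAMA_HOST only affects the current PowerShell session, while variables set through the Windows system settings only show up in newly opened terminals):

# env_check.py -- print the OLLAMA_HOST value visible to this Python process
import os
print(repr(os.getenv("OLLAMA_HOST")))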

@markofrain

Just patch the code manually; I don't know the reason for this either.
1. Change the OLLAMA_HOST variable in backend/config.py so that it no longer reads from the environment variable and instead hard-codes the address of the machine running Ollama on your LAN, for example http://192.168.1.4:11434.
2. Then edit server.py and change the code on line 85 that creates the Ollama client object to ollama = AsyncClient(host='http://192.168.1.4:11434', timeout=600000) (see the sketch after this list). Restart and it should work.
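
Putting the two changes together, a rough sketch of the hard-coded version (the address http://192.168.1.4:11434 and the exact file paths are taken from the steps above and may differ in your checkout):

# backend/config.py -- hard-code the Ollama address instead of reading the env var
OLLAMA_HOST = "http://192.168.1.4:11434"

# server.py, around line 85 -- build the client with an explicit host and a long timeout
from ollama import AsyncClient
ollama = AsyncClient(host="http://192.168.1.4:11434", timeout=600000)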

@1WorldCapture

OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434/") means: if OLLAMA_HOST is not set, it falls back to http://127.0.0.1:11434. Since you did set OLLAMA_HOST, the value you set is used instead.
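
In other words, the second argument to os.getenv is only a default. A minimal illustration:

import os

# variable unset -> the fallback is returned
os.environ.pop("OLLAMA_HOST", None)
print(os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434"))  # http://127.0.0.1:11434

# variable set (as in your Windows environment) -> that value wins over the fallback
os.environ["OLLAMA_HOST"] = "0.0.0.0:11434"
print(os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434"))  # 0.0.0.0:11434

That is why the warning reports 0.0.0.0:11434: it is the value the process inherited from your Windows environment variable, not the default in config.py.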
