I have the Ollama service running, and on Windows I set the environment variable OLLAMA_HOST=0.0.0.0:11434. But as soon as I run python -m openui and open the web page on that port, I get the errors below:

WARNING (openui): Couldn't connect to Ollama at https://api.groq.com/openai/v1
WARNING (openui): Couldn't connect to Ollama at 0.0.0.0:11434

In config.py I can see that the OLLAMA_HOST setting is still OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434"), so why does the error report 0.0.0.0:11434 rather than 127.0.0.1:11434? For various reasons my machine cannot run Docker or similar virtualization tools. Could someone help me sort this out?
I also tried setting $env:OLLAMA_HOST="http://127.0.0.1:11434" in PowerShell, as well as pip uninstall openui followed by pip install ., but neither fixed it.
Try changing the code by hand; I don't know what causes this either.
1. In backend/config.py, change the OLLAMA_HOST variable so it no longer reads from the environment variable, and hardcode the address of another machine on your LAN instead, e.g. http://192.168.1.4:11434.
2. Then in server.py, change the code on line 85 that creates the Ollama client to ollama = AsyncClient(host='http://192.168.1.4:11434', timeout=600000).
Restart and it should work.
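Before hardcoding an address in config.py as suggested above, it may help to confirm that an Ollama server actually answers there. A minimal stdlib sketch (the 192.168.1.4 address is just the example from this comment; Ollama's root endpoint responds with HTTP 200 and "Ollama is running"):

```python
import urllib.request
import urllib.error

def ollama_reachable(host: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers at `host`.

    A plain GET of the root URL is enough: a running Ollama
    instance replies 200 with the body "Ollama is running".
    """
    try:
        with urllib.request.urlopen(host, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError, ValueError):
        # ValueError covers scheme-less values like "0.0.0.0:11434",
        # which urllib cannot open as a URL at all.
        return False

# Example probe before editing config.py:
# ollama_reachable("http://192.168.1.4:11434")
```

Note that a value without the http:// scheme fails the check immediately, so this also catches an OLLAMA_HOST set to a bare host:port pair.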
OLLAMA_HOST = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434") means: if the OLLAMA_HOST environment variable is not set, it falls back to http://127.0.0.1:11434. Since you did set OLLAMA_HOST, your value is used instead.
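This fallback behavior of os.getenv can be checked in isolation (a minimal sketch, independent of OpenUI):

```python
import os

# With the variable unset, the second argument is returned as the default.
os.environ.pop("OLLAMA_HOST", None)
default = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434")

# With the variable set, its value takes precedence over the default --
# which is why the warning reports 0.0.0.0:11434, not 127.0.0.1:11434.
os.environ["OLLAMA_HOST"] = "0.0.0.0:11434"
override = os.getenv("OLLAMA_HOST", "http://127.0.0.1:11434")

print(default)   # http://127.0.0.1:11434
print(override)  # 0.0.0.0:11434
```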