Feature: Open-webui support #28
It already works with Ollama; here's an example config:

File: `~/.config/wut/config.json`

```json
{
  "provider": "ollama",
  "ollama": {
    "host": "http://192.168.8.4:11434",
    "default_model": "mistral-nemo:latest"
  }
}
```
That's for Ollama hosted on localhost. I'm talking about Open WebUI hosted on a public domain, like samplexyz.com.
If your Ollama instance is reachable over the internet, you can simply swap in that URL in the configuration. Open WebUI, which I also use personally, is essentially just an interface for Ollama. If your Ollama instance is local and Open WebUI is hosted externally (e.g., on samplexyz.com), you'll need proper network configuration, such as port forwarding or a reverse proxy, to make it accessible.
I think more people expose Open WebUI publicly than Ollama itself, so supporting it is just one line for an alternative URL, and your project can work with OpenAI's Python library. The easiest solution is making the base URL an environment variable, so people can use their custom URL with the same existing libraries.
Most people use Open WebUI together with Ollama, and Open WebUI is typically hosted on a public domain, so all it requires is a small change. Here I'm attaching my demo that I tried with the openai library to be more compatible with Open WebUI.
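The attached demo itself isn't shown in the thread, but a minimal sketch of the idea might look like the following, assuming Open WebUI's OpenAI-compatible endpoint lives under `/api` and the base URL is read from an environment variable (the `WUT_BASE_URL` and `WUT_API_KEY` names, the endpoint path, and the model name are illustrative assumptions, not part of the project):

```python
import os

from openai import OpenAI

# Base URL comes from an environment variable, so the same code works
# against Ollama, Open WebUI, or any other OpenAI-compatible server.
# WUT_BASE_URL is a hypothetical variable name used here for illustration;
# e.g. "https://samplexyz.com/api" for a publicly hosted Open WebUI,
# or Ollama's own OpenAI-compatible endpoint as the local default.
base_url = os.environ.get("WUT_BASE_URL", "http://localhost:11434/v1")

client = OpenAI(
    base_url=base_url,
    # Open WebUI expects an API key; Ollama ignores it, so a
    # placeholder default is fine for the local case.
    api_key=os.environ.get("WUT_API_KEY", "ollama"),
)

response = client.chat.completions.create(
    model="mistral-nemo:latest",  # model name taken from the config example above
    messages=[{"role": "user", "content": "What does this error mean?"}],
)
print(response.choices[0].message.content)
```

With something like this in place, pointing the tool at a public Open WebUI instance would presumably be a matter of exporting `WUT_BASE_URL` (and an API key) before running it, with no other code changes.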