
Feature: Open-webui support #28

Open
ALIENvsROBOT opened this issue Dec 21, 2024 · 4 comments

Comments

@ALIENvsROBOT

Most people use open-webui together with ollama, and open-webui is often hosted on a public domain. All it requires is a small change. Here I'm attaching a demo I tried with the openai library, to make the project more compatible with open-webui:

from openai import OpenAI

# Open WebUI exposes an OpenAI-compatible API under /api
base_url = "http://localhost:3000/api"  # Replace with your base URL; can also be "https://somerandomapi.com"
client = OpenAI(base_url=base_url, api_key="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")


response = client.chat.completions.create(
    model="qwen2.5:7b-instruct",
    messages=[
        {
            "role": "user",
            "content": "Why is the sky blue?"
        }
    ]
)

print(response.choices[0].message.content)

@cybersholt

It already works with Ollama, here's an example config:

file name: ~/.config/wut/config.json

{
  "provider": "ollama",
  "ollama": {
    "host": "http://192.168.8.4:11434",
    "default_model": "mistral-nemo:latest"
  }
}
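For reference, here is a minimal sketch of how a config like the one above could be read in Python. This is a hypothetical loader for illustration only; `load_ollama_settings` is not wut's actual implementation, and the project's real config handling may differ.

```python
import json
from pathlib import Path

# Default location of the wut config file, as shown in the comment above.
CONFIG_PATH = Path.home() / ".config" / "wut" / "config.json"

def load_ollama_settings(path=CONFIG_PATH):
    """Return (host, default_model) from a wut-style config file.

    Hypothetical helper, not the project's real loader.
    """
    with open(path) as f:
        config = json.load(f)
    if config.get("provider") != "ollama":
        raise ValueError("config does not select the ollama provider")
    ollama = config["ollama"]
    return ollama["host"], ollama["default_model"]
```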

@ALIENvsROBOT
Author

> It already works with Ollama, here's an example config:
>
> file name: ~/.config/wut/config.json
>
> {
>   "provider": "ollama",
>   "ollama": {
>     "host": "http://192.168.8.4:11434",
>     "default_model": "mistral-nemo:latest"
>   }
> }

That's for an Ollama instance hosted on localhost. I'm talking about open-webui hosted on a public domain, like samplexyz.com.

@cybersholt

If your Ollama instance is reachable over the internet, you can simply swap the URL in Open WebUI’s configuration. Open Web UI, which I also use personally, is essentially just an interface for Ollama.

If your Ollama instance is local and Open Web UI is hosted externally (e.g., on samplexyz.com), you’ll need proper network configuration, such as port forwarding or a reverse proxy, to make it accessible.

@ALIENvsROBOT
Author

I think more people expose open-webui publicly than Ollama itself, so it's just one line for an alternative URL, and then your project can work with OpenAI's Python library. The easiest solution is making the base URL an environment variable, so people can use their custom URL with the same existing libraries.
