
Ollama models not loading #201

Open
ZeyoYT opened this issue Sep 10, 2024 · 2 comments

ZeyoYT commented Sep 10, 2024

I'm going to use Ollama with this, since I don't own an OpenAI or Anthropic key. I still added them to the environment variables in the Docker Compose file, but it keeps asking for login via GitHub, and even when I try to log in, it won't let me; the GitHub page gives a 404.

I have it running in production mode with a domain attached to it, and it's also behind basic auth.

ZeyoYT commented Sep 10, 2024

Update: after looking through some issues, I removed OPENUI_ENVIRONMENT=production and that fixed the login problem. I can't access the UI through the machine's IP (it's running on a remote machine), but I can access it through the domain, probably because nginx and OpenUI run on the same machine. The remaining issue is that no models are being loaded.

Ollama is running on a remote machine and it's accessible, since my other apps that use Ollama are working. The environment variable I'm using in Docker for Ollama is:

OLLAMA_HOST=http://192.168.0.121:11434
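
For reference, here is roughly what the relevant part of the compose file looks like; the service name, image tag, and port mapping below are illustrative and should be matched to the actual setup:

```yaml
services:
  openui:
    # Image tag and port mapping are assumptions; match your actual compose file
    image: ghcr.io/wandb/openui:latest
    ports:
      - "7878:7878"
    environment:
      # Point OpenUI at the remote Ollama instance
      - OLLAMA_HOST=http://192.168.0.121:11434
      # OPENUI_ENVIRONMENT=production was removed, per the update above
```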

@ZeyoYT changed the title from "Locally hosted container asking for Login Via github for quota" to "Ollama models not loading" on Sep 10, 2024
@raman0c17

Hi @ZeyoYT,

It seems the issue arises from the OLLAMA_HOST environment variable. Make sure the IP address you're using is reachable from inside the Docker container, not just from the host:

- If the container runs on the same host as Ollama, set OLLAMA_HOST to http://host.docker.internal:11434.
- If Ollama runs on a different host, verify network connectivity between the container and the Ollama server, as in the check below.

Also confirm that the Ollama server is running and listening on port 11434.
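
A quick way to test that from inside the container (assuming the container is named openui; if curl isn't available in the image, run the same curl from the Docker host instead):

```sh
# Ask the Ollama API to list its installed models, from inside the OpenUI container
docker exec openui curl -s http://192.168.0.121:11434/api/tags
# A JSON response listing models means the container can reach Ollama;
# a connection error points at Docker networking or the OLLAMA_HOST value.
```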
