
Model loading problem with Ollama #218
Open · RaemyS opened this issue Dec 1, 2024 · 4 comments

Comments

@RaemyS

RaemyS commented Dec 1, 2024

Hi there,

I am using openui via Pinokio (https://pinokio.computer/item?uri=https://github.com/pinokiofactory/openui).
As an LLM backend, I'm using Ollama in its current version 0.4.6.

When I try to use your tool, I get the error message "Error! 404 Error code: 404 - {'error': {'message': 'model "undefined" not found, try pulling it first', 'type': 'api_error', 'param': None, 'code': None}}" after sending a prompt.

When I try to set a different model, I notice in the settings window that the select box does not show any model names, only empty entries:
[screenshot]

No matter which of those entries I choose, the error persists.

If I quit Ollama and try to resolve the installed models, the selection is empty:
[screenshot]

So the model resolution from Ollama seems to work at least partially (the 6 empty entries match the 6 currently installed models).
My guess is that openui is not able to read the information from the Ollama model list response correctly, and that this then leads to the error message above.
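For reference, the raw Ollama model list can be checked directly against its HTTP API. A minimal verification sketch, assuming Ollama is running on its default port 11434 (this is not part of openui, just a way to see what Ollama itself reports):

```python
# Minimal check, assuming Ollama runs on its default port 11434:
# the /api/tags endpoint lists the locally installed models,
# each entry carrying a "name" field that the model dropdown should show.
import requests

resp = requests.get("http://localhost:11434/api/tags", timeout=5)
resp.raise_for_status()
for model in resp.json().get("models", []):
    print(model.get("name"))
```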

Do you have any ideas how to solve this problem?

Thx :)

@hashemAlsaggaf

Exact same issue here, with a pre-existing Ollama install.

@alexandregodard

Hello,
The name property is missing from the JSON returned by the ollama Python library.
Even though the Ollama endpoint http://localhost:11434/api/tags returns it:
[screenshot]
the openui endpoint http://127.0.0.1:7878/v1/models strips it out on its side:
[screenshot]

In the frontend, the name property is used in several places:

  • frontend/src/components/Settings.tsx (lines 109, 232-233)

It can be fixed by updating the frontend Settings.tsx.
A quick-and-dirty alternative is to update the object returned by the backend in server.py (function get_ollama_models); see the sketch below.

A PR has been created for the Settings fix.
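To illustrate the quick-and-dirty backend idea, here is a minimal sketch. It is not the actual openui server.py: the response shape the frontend expects and the default Ollama address are assumptions; the point is only to show that every entry handed to the frontend should keep a name.

```python
# Minimal sketch of the quick-and-dirty backend fix described above.
# Assumptions: Ollama runs on its default port 11434, and the frontend only
# needs an id/name per model; the real get_ollama_models in server.py may differ.
import requests

OLLAMA_HOST = "http://localhost:11434"  # assumed default Ollama address

def get_ollama_models():
    resp = requests.get(f"{OLLAMA_HOST}/api/tags", timeout=5)
    resp.raise_for_status()
    models = resp.json().get("models", [])
    # Preserve "name"; fall back to "model" if a client library dropped it.
    return [
        {"id": m.get("name") or m.get("model"),
         "name": m.get("name") or m.get("model")}
        for m in models
    ]
```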

@RaemyS
Author

RaemyS commented Dec 8, 2024

Nice, thanks for the fix @alexandregodard.
Unfortunately, I'm not skilled enough with the current tech stack, so I'm not able to build and test the changes locally without further research.

Hope your fix finds its way into the app soon, so I can update the tool and try again :)

@Fox-Me

Fox-Me commented Dec 20, 2024

Please, how can this be solved? I'm having the same issue with Ollama:

[screenshot]
