Model loading problem with Ollama #218
Comments
Same exact issue here, with a pre-existing Ollama install.
Hello, inside the frontend the model name is used in several places:
It can be fixed by updating the frontend Settings.tsx. A PR has been created for the settings.
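For illustration, here is a minimal sketch of the kind of parsing such a fix involves. Only the `/api/tags` endpoint is Ollama's documented API; the function name and the `name`/`model` field fallback are assumptions, not the actual OpenUI code:

```typescript
// Hypothetical sketch, not the actual OpenUI code: read the model list
// from Ollama's documented /api/tags endpoint and fall back between the
// "name" and "model" fields so the select box never shows empty entries.
interface OllamaModel {
  name?: string;  // e.g. "llama3:latest"
  model?: string; // newer Ollama releases return this field as well
}

interface OllamaTagsResponse {
  models: OllamaModel[];
}

async function fetchOllamaModels(
  baseUrl = "http://localhost:11434"
): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama returned HTTP ${res.status}`);
  }
  const data = (await res.json()) as OllamaTagsResponse;
  // Drop any entry that has neither field, so no "undefined" reaches the UI.
  return data.models
    .map((m) => m.name ?? m.model)
    .filter((n): n is string => typeof n === "string" && n.length > 0);
}
```

Falling back between the two fields guards against version differences in Ollama's list response, which matches the symptom here: the right number of entries, but each rendering as undefined.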
Nice, thanks for the fix @alexandregodard. Hope your fix finds its way into the app soon, so I can update the tool and try again :)
Hi there,
I am using openui through Pinokio (https://pinokio.computer/item?uri=https://github.com/pinokiofactory/openui).
As the LLM backend, I'm using Ollama at its current version, 0.4.6.
When I try your tool and send a prompt, I get the error message "Error! 404 Error code: 404 - {'error': {'message': 'model "undefined" not found, try pulling it first', 'type': 'api_error', 'param': None, 'code': None}}".
When I try to set a different model, I notice that the select box in the settings window does not show any model names, only empty entries:
No matter which of those entries I choose, the error persists.
If I quit Ollama and try to resolve the installed models, the selection is empty:
So the model resolution from Ollama seems to work at least partially (the 6 entries match the 6 currently installed models).
My guess is that openui is not able to parse the model names from the Ollama model list response correctly, and this then leads to the error message above.
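One way to verify this, assuming only Ollama's documented `/api/tags` endpoint and nothing about openui's internals, is to query the model list directly and inspect the fields (the helper name below is hypothetical):

```typescript
// Minimal check of what Ollama actually returns for the model list.
// /api/tags is Ollama's documented endpoint for listing installed models.
async function dumpOllamaTags(baseUrl = "http://localhost:11434"): Promise<void> {
  const res = await fetch(`${baseUrl}/api/tags`);
  const data = (await res.json()) as { models?: Array<Record<string, unknown>> };
  // If these log as undefined, the frontend would also end up with
  // empty select entries and a model named "undefined".
  for (const m of data.models ?? []) {
    console.log(m["name"], m["model"]);
  }
}

dumpOllamaTags().catch(console.error);
```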
Do you have any ideas how to solve this problem?
Thx :)