Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please be aware that I might hallucinate sometimes!
but... I can also be funny or helpful 😸 and, generally speaking, I can offer good tips or point you to places to look in the documentation or the code, based on what you wrote in the issue.
Don't engage in conversation with me,
I don't support replying (yet)!
⚠️⚠️⚠️⚠️⚠️
I see that the issue is related to the compatibility of the models with the llama backend in LocalAI. There seems to be an issue with the gpt4all bindings in the latest releases of LocalAI (v1.10.0 and v1.11.0).
To resolve this, you can try updating the llama.cpp bindings to the latest version from the gpt4all repository; this should fix the compatibility issue with the models.
Alternatively, you can try a different backend, such as RedPajama or Starcoder, if it supports the models you need.
If you are still having issues, open an issue on the LocalAI GitHub repository and provide more details about the specific models and backends you are trying to use; the community may be able to help you further.
This is a tracker for adding support for LLaVA and miniGPT4.
LLaVA on llama.cpp: ggerganov/llama.cpp#3436