Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
but.... I can also be funny or helpful 😸 and, generally speaking, I can offer good tips or point to places in the documentation or the code based on what you wrote in the issue.
Don't engage in conversation with me;
I don't support replying (yet)!
⚠️⚠️⚠️⚠️⚠️
I understand that the llama-cpp backend is missing some features compared to go-llama. However, I can see that there are ongoing efforts to improve and add new backends in LocalAI. The team is also working on integrating with langchain, which could help with the missing features.
If you need a specific feature that is not available in llama-cpp, you may want to consider using the go-llama backend instead. You can select which backend a model uses by updating your LocalAI model configuration file.
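As a minimal sketch of what such a backend switch could look like: LocalAI lets you pin a backend per model via a YAML model configuration file. The model name and file below are placeholders, and the exact backend identifier accepted by your LocalAI version may differ, so treat this as an illustration rather than a verified config:

```yaml
# Example model config (hypothetical names; check your LocalAI
# version's docs for the exact backend identifiers it accepts)
name: my-model
backend: llama        # e.g. the go-llama backend
parameters:
  model: ggml-model.bin   # placeholder model file in your models directory
```

Changing the `backend` value here is how you would move a model between backends such as go-llama and llama-cpp, assuming both are available in your build.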
Alternatively, you could also try to contribute to the development of llama-cpp and help improve its features. The team is always looking for contributions from the community to make LocalAI better.
The llama-cpp backend (not go-llama) misses features compared to go-llama, such as: