Can't import vicuna models: (bad f16 value 5) #7

When I try to load the vicuna models downloaded from this page, I have the following error: bad f16 value 5. I do not have this problem when using the gpt4all models. Running the vicuna models with the latest version of llama.cpp works just fine.
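For context, the failure happens at model-load time: the ggml file header carries an ftype field (historically called "f16"), and a value the bundled llama.cpp does not recognize (here, 5) makes it reject the file, which is why the same file can still load in a newer standalone llama.cpp. Below is a minimal sketch of the failing path, assuming the pyllamacpp 2.x API (a `Model` class taking a `model_path` keyword, per its README) and a hypothetical model filename:

```python
# Minimal sketch, assuming the pyllamacpp 2.x API; the model
# filename below is hypothetical.
from pyllamacpp.model import Model

# The "bad f16 value" error is reported while the ggml header is
# parsed, i.e. during this constructor call, before any generation.
model = Model(model_path="./models/ggml-vicuna-7b-4bit.bin")

# If loading succeeds, tokens stream back as they are generated.
for token in model.generate("Hello, how are you?"):
    print(token, end="", flush=True)
```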
Comments
Same for me with my model.
Output from llama.cpp:
Output from pyllamacpp:
The original page has been archived, but the links are still available here: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-chat
Thanks @mh4ckt3mh4ckt1c4s for reporting the issue.
Hi guys, I pushed a new release.
Hello, I tested with Vicuna and it works with 2.2.0 but not with the latest 2.3.0. Is that normal?
@mh4ckt3mh4ckt1c4s Yes, it is normal as …
Okay, so from my point of view this issue is closed. Thanks for your work!
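For anyone landing here later with an old Vicuna file: per the comments above, pinning the package to the last release reported working (`pip install pyllamacpp==2.2.0`) is a quick workaround, while re-converting or re-downloading the model in the newer ggml format is the likely longer-term fix if you want to stay on current releases.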