LLamaSharp 0.11.2 Exception #660
The fundamental error:
@martindevans Got it! Thanks.
I'm currently having the exact same problem after upgrading from 0.10 to 0.11.2. I did not change the model parameters. I tried reducing the number of tokens as advised, but the error remained.
That's odd. The NoKvSlot error is passed pretty much straight through from llama.cpp; there's not a lot going on on the C# side that could be a problem there. Can you tell us some more details: what are your loading parameters? What model are you using? How much text are you evaluating? Edit: also, is this .NET 7.0 only, or are you using a different version?
Model: open-chat-3.5-0106
Model Params:
Inference Params:
With these parameters everything worked fine in previous versions. I'm using .NET 8.0 :)
Those settings look fine :/ As far as I'm aware, the only way this should raise … I assume you're getting this well short of 1024 tokens?
I tried increasing the …
I tried to change …
It might help if you're careful with your numbers. For example:
If expanding the context size fixes it, it sounds to me like you're simply using up all your available token space. |
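The budgeting being suggested is simple arithmetic: every token in the prompt, the accumulated chat history, and the tokens still to be generated each needs a slot in a context of fixed size, and NoKvSlot is what surfaces when no slot is free. A minimal C# sketch of that budget check, assuming LLamaSharp's `ModelParams.ContextSize` and `InferenceParams.MaxTokens`; the model path and all numbers here are illustrative, not taken from this thread:

```csharp
using LLama.Common;

// Hypothetical configuration; values are for illustration only.
uint contextSize = 2048; // KV slots available to the context
var modelParams = new ModelParams("open-chat-3.5-0106.gguf")
{
    ContextSize = contextSize
};

int maxNewTokens = 256; // upper bound on tokens generated per call
var inferenceParams = new InferenceParams
{
    MaxTokens = maxNewTokens
};

// Rough pre-flight check: tokens already evaluated into the context, plus
// the new prompt, plus the worst-case generation must all fit, or llama.cpp
// may fail to find a free KV slot mid-inference.
int tokensInCache = 900;   // prompt + history evaluated so far
int newPromptTokens = 150; // tokens about to be added
bool mayOverflow =
    tokensInCache + newPromptTokens + maxNewTokens > contextSize;
```

With numbers like these (900 + 150 + 256 = 1306 against a 2048-token context) there is headroom; against the 1024-token context mentioned above, the same workload would overflow, which matches the "using up all your available token space" diagnosis.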
This issue is expected to be fixed in the current master branch. Could you please try again with the master branch? |
In version 0.12.0 the problem resolved itself :|
Thank you for your feedback, closing this issue as completed now. Please feel free to comment here if the problem reappears. |
Hi, when I run LLamaSharp 0.11.2 with .NET 7, I suddenly get an exception as below. How can I fix it?