Examples don't run with CUDA12 #599
You have probably added both backends, CPU and CUDA; that is likely the reason for the crash. Remove one so that only a single backend package is referenced.
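A minimal sketch of what the project file should look like, assuming the package names and the 0.10.0 version mentioned later in this thread (adjust to your actual versions):

```xml
<!-- Reference exactly ONE backend package. Having both Cpu and Cuda12
     installed at once can make the native-library load crash at startup. -->
<ItemGroup>
  <PackageReference Include="LLamaSharp" Version="0.10.0" />
  <!-- Keep one of the following, not both: -->
  <PackageReference Include="LLamaSharp.Backend.Cuda12" Version="0.10.0" />
  <!-- <PackageReference Include="LLamaSharp.Backend.Cpu" Version="0.10.0" /> -->
</ItemGroup>
```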
llava_shared.dll is missing from the distribution for CUDA v12. Try downloading it from llama.cpp and placing it manually into the right runtime folder.
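A sketch of that manual workaround. The paths are illustrative, taken from the folder layout mentioned below; point `SRC` at wherever you extracted the llama.cpp release zip:

```shell
# Stage llava_shared.dll into the CUDA12 runtime folder that the
# examples project loads native libraries from.
SRC="llava_shared.dll"   # extracted from a llama.cpp release zip
DST="LLama.Examples/bin/Debug/net8.0/runtimes/win-x64/native/cuda12"
mkdir -p "$DST"
if [ -f "$SRC" ]; then
  cp "$SRC" "$DST/"
  echo "copied $SRC into $DST"
else
  echo "extract llava_shared.dll from the llama.cpp release first"
fi
```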
I took llama-b2418-bin-win-cublas-cu12.2.0-x64.zip from llama.cpp, extracted the llava_shared.dll file, and put it in LLama.Examples\bin\Debug\net8.0\runtimes\win-x64\native\cuda12. Same problem.
Maybe try the right version: https://github.com/ggerganov/llama.cpp/tree/d71ac90985854b0905e1abba778e407e17f9f887
I will introduce the libraries in the "Update Binary" artifacts ASAP.
Even after today's update, this issue persists.
Hi, this can be confirmed as a bug, since it persists in v0.11.1. Could you please provide some information to help us find the problem? @KieranFoot @EtienneT
This seems to be fixed for me now in the latest version. Thanks!
@AsakusaRinne Apologies, it isn't made clear in the repo's docs that additional files are needed to use CUDA12. I assumed it would work out of the box, as CUDA11 does. Possibly the documentation could be improved to reflect this.
@KieranFoot Is it because you installed CUDA12 instead of CUDA11?
@AsakusaRinne I never installed CUDA11 manually; it just worked. So, when I switched the code to use CUDA12, I wrongly assumed it would also work out of the box.
It's weird that the CUDA11 backend could work without CUDA installed. Have you ever installed cuBLAS?
You need to update your display driver. Here is a reference: https://tech.amikelive.com/node-930/cuda-compatibility-of-nvidia-display-gpu-drivers/comment-page-1/
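A quick way to check which driver is actually installed before digging further. This is a sketch; the exact driver floor for CUDA 12 is in the compatibility table linked above, and `nvidia-smi` must be on your PATH:

```shell
# Print the installed NVIDIA driver version, or a hint if no driver
# tooling is present. CUDA 12 requires a newer driver than CUDA 11.
if command -v nvidia-smi >/dev/null 2>&1; then
  DRIVER="$(nvidia-smi --query-gpu=driver_version --format=csv,noheader)"
  echo "installed driver: $DRIVER"
else
  DRIVER=""
  echo "nvidia-smi not found; install or update the NVIDIA display driver"
fi
```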
@martindevans If I'm not misunderstanding it, we could append some cuBLAS files to the same folder of
Yes, thank you for the clarification. I'll look into this issue. :)
I have CUDA 12; I know it works because I've run custom PyTorch models for other projects.
When I run LLamaSharp.Examples with LLamaSharp.Backend.Cpu, it works fine. But when I try to use it with LLamaSharp.Backend.Cuda12, it crashes right away with the following error:
I tried running the project in Debug, also with GPU selected in the configuration manager, and under .NET 8, .NET 6, and all combinations, but I always get the same error. I am running the latest version of the NuGet packages, 0.10.0.