Running llama.cpp with sycl in Docker fails with "Unknown PI error" #5400
Comments
./build/bin/ls-sycl-device
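For context, ls-sycl-device is produced by llama.cpp's SYCL build and lists the devices the SYCL runtime can see. A minimal sketch of building and running it, assuming the README-sycl.md flow at the time (flag names may have changed since):

```bash
# Build llama.cpp with the SYCL backend, then enumerate SYCL devices.
# Assumes oneAPI is installed under /opt/intel/oneapi.
source /opt/intel/oneapi/setvars.sh
mkdir -p build && cd build
cmake .. -DLLAMA_SYCL=ON -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx
cmake --build . --config Release
./bin/ls-sycl-device
```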
Note: we only resolve issues in llama.cpp itself. If your issue is in code migrated from llama.cpp to another project, we can't make sure it works well. You need to reproduce the same issue in llama.cpp.
Re-running the same steps again now, this time it worked ... so it seems it was something sporadic.
No, it works from the host just fine
Well, thanks. LocalAI was one of the first projects supporting llama.cpp, and it always will be. We use the same llama.cpp server code, just with a gRPC server on top. However, I find this attitude quite "unfriendly" from a downstream-project perspective: if LocalAI cannot consume llama.cpp, other projects might face the same issues, and having documented errors and solutions is helpful and in line with how collaboration between open-source projects generally works. As I see it, LocalAI brings users to llama.cpp, and since llama.cpp is a library that can be imported, integration problems open up that make sense to solve in the llama.cpp codebase. Going to close this one; the problem I had with LocalAI I've solved by using the Intel images directly. Unfortunately, using Ubuntu 22.04 and following the Intel steps to install the required dependencies didn't work out and was just a loss of time.
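For anyone hitting the same wall, a minimal sketch of the "Intel images directly" route, assuming Docker Hub's intel/oneapi-basekit image (the tag is illustrative):

```bash
# Start from Intel's oneAPI base image with the GPU device nodes passed
# through, then build llama.cpp inside it (see the build sketch above).
docker run -it --rm --device /dev/dri \
  intel/oneapi-basekit:devel-ubuntu22.04 bash
```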
I have an Intel Arc 770. I'm trying to run llama.cpp with Docker, following https://github.com/ggerganov/llama.cpp/blob/master/README-sycl.md, but it fails with:
I've also been trying to run it with:
I'm also trying this with LocalAI, where I have been manually creating a container with sycl; however, there the error is different (PR mudler/LocalAI#1689):
Cards detected:
I'm trying this on Ubuntu 22.04 LTS Server (fresh install).
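For reference, this is roughly how GPU visibility can be checked on the host, assuming the oneAPI runtime is installed (sycl-ls ships with it):

```bash
# Devices visible to the SYCL runtime on the host.
source /opt/intel/oneapi/setvars.sh
sycl-ls
# The Arc card should also show up at the PCI level.
lspci | grep -iE 'vga|display'
```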
ping (sorry to bug you): @NeoZhangJianyu, @airMeng, @luoyu-intel, @abhilash1910, @ggerganov
Is Docker supposed to work? Am I doing something wrong here?
To note: everything works here directly from the host, without Docker.
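Since everything works on the host but not in the container, the usual suspect is GPU device access inside the container. A run command in the spirit of the README-sycl.md Docker section, with the render node passed through; the image name, model path, and render node index are illustrative and can differ per machine:

```bash
# Pass the Intel GPU's render node through to the container.
docker run -it --rm \
  --device /dev/dri/renderD128:/dev/dri/renderD128 \
  -v "$(pwd)/models:/app/models" \
  llama-cpp-sycl \
  -m /app/models/model.gguf -p "Hello" -n 32 -ngl 33
```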