Unable to use ollama in the ipex-llm docker container #12654
Comments
I noticed some abnormal logs:
@ca1ic0 I can't reproduce your problem on your
I got this; it seems the volume mapping '-v ~/.ollama/models:/root/models' is wrong, and ipex-llm works fine.
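A possible explanation for the broken mapping above: as root, ollama reads its model cache from `/root/.ollama/models` inside the container, so a host directory mounted at `/root/models` is invisible to it unless `OLLAMA_MODELS` points there. A minimal sketch of the two ways to reconcile this (the image name is illustrative; substitute the ipex-llm image you actually use):

```shell
# Option 1: mount the host cache at the path ollama reads by default.
docker run -d --name ipex-ollama \
    --device=/dev/dri \
    -v ~/.ollama/models:/root/.ollama/models \
    intelanalytics/ipex-llm-inference-cpp-xpu:latest

# Option 2: keep the original mount target, but tell ollama where to
# look via the OLLAMA_MODELS environment variable.
docker run -d --name ipex-ollama \
    --device=/dev/dri \
    -v ~/.ollama/models:/root/models \
    -e OLLAMA_MODELS=/root/models \
    intelanalytics/ipex-llm-inference-cpp-xpu:latest
```

Either way the host's downloaded models survive container restarts; mismatching the mount target and `OLLAMA_MODELS` is what makes `ollama` appear to "lose" models or re-download them.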
That is weird; I reproduced the problem again. Is there any difference between your operations and mine?
This time I directly pulled qwen2.5 without mapping the models directory.
Do you still hit the error? I used your script to start Docker on root@calico-B450M-HDV-R4-0, followed the steps below to run it, and got normal results.
Okay, now I see the point 😳. It might actually be caused by the command execution in the container:
If I don't execute the command and instead start ollama directly in the ipex container, it works well.
Thanks for your feedback. We did not encounter any problems on an ARC770 with an Intel CPU; maybe it is caused by the AMD CPU.
Hi, maybe I have a similar issue, but I'm not sure. I have an Ubuntu 24 LTS VM as the base, and inside it I run Docker with the ollama instance. I mounted the ARC770 into the VM with passthrough and can see it inside the VM. I gave the container access to the GPU with /dev/dri:/dev/dri, but it is not working.
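When passthrough plus `/dev/dri` still doesn't work, it helps to confirm the GPU is actually visible at each layer before blaming ipex-llm. A sketch of the checks I would run inside the container (tool availability depends on the image; `sycl-ls` ships with the oneAPI runtime that ipex-llm images include, and the render group id is an assumption you must read off your own host):

```shell
# 1. The render node must exist inside the container.
ls -l /dev/dri/            # expect card0 and a renderD12x node

# 2. The SYCL runtime must enumerate the GPU as a Level Zero device.
sycl-ls                    # expect an entry like [level_zero:gpu] ... Arc ...

# 3. Permissions: the container user must be able to open the render
#    node. If step 2 shows no GPU but step 1 shows the node, check the
#    group owning /dev/dri/renderD* on the HOST and pass it explicitly:
#      docker run --device=/dev/dri --group-add $(stat -c %g /dev/dri/renderD128) ...
```

If `sycl-ls` shows only CPU/OpenCL devices, the problem is below ipex-llm (passthrough, driver, or permissions), not the container's ollama setup.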
I used PVE as the hypervisor before, and passthrough of an A750 works fine, even when the VM OS is Windows.
On the host, I could use ollama and ipex with an Arc750 GPU, but in the container I got a failure. The steps are:
0. Start the container
output: