NPU support? #6882
sebastienbo started this conversation in Ideas
Laptops usually don't have discrete GPUs; however, these days they often ship with NPUs, which are much better suited than the CPU for traversing LLM layers.
It would be cool if llama.cpp could take advantage of them.
Here is the documentation:
https://intel.github.io/intel-npu-acceleration-library/llm.html
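For context, here is a minimal sketch of what the linked library's workflow looks like, adapted from its documentation: a standard Hugging Face model is loaded on the CPU and then handed to the library's compile() entry point, which offloads supported layers to the NPU. The model id and generation settings below are just illustrative assumptions, not anything llama.cpp would use directly.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
import intel_npu_acceleration_library

# Illustrative model choice; any causal LM from the Hugging Face hub works the same way.
model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

# Load the model on CPU first, as in the library's published examples.
model = AutoModelForCausalLM.from_pretrained(model_id).eval()
tokenizer = AutoTokenizer.from_pretrained(model_id)

# compile() quantizes the model and maps supported layers onto the NPU
# (per the library's docs; exact behavior depends on the library version).
model = intel_npu_acceleration_library.compile(model, dtype=torch.int8)

prompt = "Explain what an NPU is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The point is that the NPU offload happens at the layer level, which is why it maps naturally onto how llama.cpp already dispatches work to different backends.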