Add llama-cpp-python #686
Conversation
Please take a look at the requested changes, and use the Ready for review button when you are done, thanks 👍
@@ -5,3 +5,4 @@ Shapely
smbus_cffi
spidev
aiortc
llama-cpp-python
It seems that upstream already publishes musllinux wheels; why does it need to be added here as well?
Can you provide a link to the upstream musl wheels? PyPI still only provides a source tarball and no binaries at all.
Hmm, it seems I was mistaken; they are indeed not on PyPI. Maybe I looked at the wrong package.
Have you contacted the author to ask if they are willing to publish wheels?
Looks like they're still working on it and don't plan to publish them to PyPI. Not sure if that is compatible with how Home Assistant installs packages, so I'll figure something else out. Thanks.
Hmm, that definitely didn't exist a month ago. Thanks for the heads up!
Hey @acon96 @frenck, with abetlen/llama-cpp-python#1247 now closed, it's possible to install pre-built binary llama-cpp-python wheels using:

# requirements.txt
--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
llama-cpp-python
Adds wheels for llama-cpp-python to allow running Llama AI models as part of custom integrations.
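As context for what the wheels enable: a minimal sketch of calling llama-cpp-python once installed. The model path, context size, and helper function here are hypothetical illustrations, not part of this PR, and running it requires a locally downloaded GGUF model file.

```python
# Hedged sketch: assumes llama-cpp-python is installed and a GGUF model
# file has already been downloaded to some local path.
try:
    from llama_cpp import Llama
except ImportError:
    # llama-cpp-python not installed (e.g. no wheel available for this platform)
    Llama = None


def ask(model_path: str, prompt: str, max_tokens: int = 32) -> str:
    """Load a GGUF model and return a single completion for `prompt`."""
    if Llama is None:
        raise RuntimeError("llama-cpp-python is not installed")
    # n_ctx is the context window size; verbose=False silences load logs.
    llm = Llama(model_path=model_path, n_ctx=2048, verbose=False)
    out = llm(prompt, max_tokens=max_tokens)
    # Completions are returned in an OpenAI-style response dict.
    return out["choices"][0]["text"]
```

A custom integration would call something like `ask("/config/models/model.gguf", "Q: turn on the lights\nA:")`; the point of shipping wheels is that this import works on musllinux without compiling llama.cpp on the device.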