Add LocalAI to local-apps #833
Conversation
This adds LocalAI (https://localai.io) to the Hugging Face local-apps. It natively supports models directly from the Hugging Face Hub.
Conceptually LGTM! I haven't tested this yet; does it require any specific setup for Mac or CUDA devices?
Only if using Docker: you would need to pull specific images, e.g. for CUDA. With the single binary, both Mac and CUDA work; the install script handles that automatically!
Hi @mudler
Sure! There you are: logo.zip (credits to @enricoros 🫶). Had to zip it because for some reason GH wasn't liking it 🤷‍♂️
packages/tasks/src/local-apps.ts
Outdated
@@ -99,6 +99,18 @@ const snippetLlamacpp = (model: ModelData, filepath?: string): LocalAppSnippet[]
	];
};

const snippetLocalAI = (model: ModelData, filepath?: string): string[] => {
	return [
		`# Option 1: use LocalAI with docker
One last comment on this:
Let's switch the order of the snippets; I assume far fewer people will use this with Docker than directly with the binaries.
Question for @enzostvs: do you think we should do something like we did for llama.cpp here?
const snippetLlamacpp = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
Yes, I agree.
It would be better to split them into two objects, as we did for llama.cpp 👍
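For reference, a rough sketch of what such a split could look like, modeled on the llama.cpp snippet. The LocalAppSnippet field names and the exact install/run commands below are assumptions for illustration, not the final code of this PR:

const snippetLocalAI = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
	// LocalAI addresses Hub models via its huggingface:// URI scheme.
	const modelUri = `huggingface://${model.id}/${filepath ?? "{{GGUF_FILE}}"}`;
	return [
		{
			title: "Install from binary",
			// Assumed install one-liner; check https://localai.io for the canonical command.
			setup: "curl https://localai.io/install.sh | sh",
			content: `local-ai run ${modelUri}`,
		},
		{
			title: "Use Docker images",
			// Assumed image tag; CUDA users would pull a GPU-specific image instead.
			setup: "docker pull localai/localai:latest-cpu",
			content: `docker run -p 8080:8080 localai/localai:latest-cpu ${modelUri}`,
		},
	];
};

Splitting it this way lets the Hub render each install path as its own tab, the same as the llama.cpp entry.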
Inverted the order as requested in 3fca3d6!
Perfect! @mudler, sorry for the back and forth: do you think you'll be able to split the snippets the way I linked above? That way users will have a much better reading experience on the Hub.
I can try, JavaScript isn't really my cup of tea 😅
Updated in 1d5064c - let me know if that looks good!
address comments from review Signed-off-by: mudler <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>
packages/tasks/src/local-apps.ts
Outdated
const snippetLocalAI = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
	const command = (binary: string) =>
		[
			`huggingface://${model.id}/${filepath ?? "{{GGUF_FILE}}"}`,
The actual commands seem to be missing (i.e., binary is unused) :)
Oops! Thanks, fixed in 501c1e9.
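For anyone following along, the fix is essentially to make the helper prepend the command it is building, so the binary argument is actually used. A minimal sketch of the idea, inside snippetLocalAI where model and filepath are in scope (not necessarily the literal change in 501c1e9):

const command = (binary: string) =>
	[
		"# Load and run the model:",
		`${binary} huggingface://${model.id}/${filepath ?? "{{GGUF_FILE}}"}`,
	].join("\n");
// e.g. command("local-ai run") for the binary snippet, or a docker run invocation for the Docker one.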
Signed-off-by: mudler <mudler@localai.io>
Nice! LGTM! Can you run the linter on this so that the code is formatted properly? 🙏
If you are using VS Code, you can just install the Prettier extension and it will automatically format the file for you on save! 🤗
Signed-off-by: mudler <mudler@localai.io>
That did the trick, thank you!
Thanks again for the contribution @mudler 🤗
Thank you for the quick review! 🫶
Hi @mudler
Thank you! This is awesome!! Thanks for the ping, will definitely do that 🫶
Hello 👋
This PR adds LocalAI (https://localai.io) to the Hugging Face local-apps. It natively supports models directly from Hugging Face in GGUF format by running LocalAI with a URI such as
huggingface://repo-id/file
(docs: https://localai.io/docs/getting-started/models/#run-models-via-uri). It also supports transformers models, but those need a configuration file and can't be used natively, so this piggybacks on GGUF files only for now.
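For illustration, this is how such a URI is assembled in the snippet; the repo id and file name below are made-up example values, not part of this PR:

// Hypothetical example values:
const repoId = "TheBloke/Mistral-7B-Instruct-v0.2-GGUF";
const ggufFile = "mistral-7b-instruct-v0.2.Q4_K_M.gguf";
const exampleUri = `huggingface://${repoId}/${ggufFile}`;
// => "huggingface://TheBloke/Mistral-7B-Instruct-v0.2-GGUF/mistral-7b-instruct-v0.2.Q4_K_M.gguf"
// LocalAI is then pointed at this URI; see the getting-started docs linked above.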