
Add LocalAI to local-apps #833

Merged
merged 9 commits into huggingface:main from mudler:patch-1 on Aug 9, 2024
Conversation

mudler
Contributor

@mudler mudler commented Aug 4, 2024

Hello 👋

This PR adds LocalAI (https://localai.io) to the huggingface local-apps. It natively supports models in GGUF format directly from huggingface, e.g. by running LocalAI with a huggingface://repo-id/file URI (docs: https://localai.io/docs/getting-started/models/#run-models-via-uri).

It also supports transformers; however, those need a configuration file and can't be used natively, so this PR covers only GGUF files for now.
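The URI-based flow described above can be sketched like this (a rough illustration: the repo and file names are placeholders, not from this PR, and commands should be checked against the LocalAI docs):

```shell
# Install LocalAI via the official install script (Linux/macOS).
curl https://localai.io/install.sh | sh

# Run a GGUF model straight from the Hugging Face Hub by URI; LocalAI
# downloads the file and serves it on its OpenAI-compatible API.
local-ai run huggingface://some-user/some-model-GGUF/some-model.Q4_K_M.gguf
```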

This adds LocalAI (https://localai.io) to the huggingface local-apps. It natively supports models directly from huggingface.
@mudler mudler changed the title feat: add LocalAI to local-apps Add LocalAI to local-apps Aug 4, 2024
Member

@Vaibhavs10 Vaibhavs10 left a comment

Conceptually LGTM! I haven't tested this yet; does it require any specific setup for Mac or CUDA devices?

@mudler
Contributor Author

mudler commented Aug 7, 2024

Conceptually LGTM! I haven't tested this yet; does it require any specific setup for Mac or CUDA devices?

Only when using Docker: you would need to pull specific images, e.g. for CUDA. If using the single binary, both Mac and CUDA work; that is handled automatically by the install script!
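For reference, the Docker route mentioned above might look roughly like this; the image tags are assumptions based on LocalAI's published naming conventions and should be checked against the docs before use:

```shell
# CPU-only image (tag names are illustrative; check LocalAI's docs/Docker Hub).
docker run -p 8080:8080 localai/localai:latest-cpu

# NVIDIA CUDA image; requires the NVIDIA container toolkit on the host.
docker run -p 8080:8080 --gpus all localai/localai:latest-gpu-nvidia-cuda-12
```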

@enzostvs
Member

enzostvs commented Aug 9, 2024

Hi @mudler
Can you send us your logo in SVG format, please? 🤗

@mudler
Contributor Author

mudler commented Aug 9, 2024

Hi @mudler Can you send us your logo in SVG format, please? 🤗

Sure! Here you go: logo.zip (credits to @enricoros 🫶)

Had to zip it because for some reason GH wasn't liking it 🤷‍♂️

@@ -99,6 +99,18 @@ const snippetLlamacpp = (model: ModelData, filepath?: string): LocalAppSnippet[]
];
};

const snippetLocalAI = (model: ModelData, filepath?: string): string[] => {
return [
`# Option 1: use LocalAI with docker
Member

One last comment on this:

Let's switch the order of the snippets, as I assume far fewer people will use this with Docker than directly from the binaries.

Question for @enzostvs - do you think doing something like we did for llama.cpp here:

const snippetLlamacpp = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
(since we have two snippets, it makes sense)

Member

Yes, I agree.
It would be better to split them into two objects, as we did for llama.cpp 👍
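The split being discussed might look roughly like this. This is only a sketch: the `ModelData` and `LocalAppSnippet` shapes are simplified stand-ins for the real huggingface.js types, and the setup/run commands are illustrative, not the merged implementation.

```typescript
// Simplified stand-ins for the real huggingface.js types (assumption).
interface ModelData {
	id: string;
}
interface LocalAppSnippet {
	title: string;
	setup: string;
	content: string;
}

// One snippet object per install method, mirroring the llama.cpp approach.
const snippetLocalAI = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
	// Shared command builder, parameterized by how LocalAI is invoked.
	const command = (binary: string) =>
		`${binary} run huggingface://${model.id}/${filepath ?? "{{GGUF_FILE}}"}`;
	return [
		{
			title: "Install from binary",
			setup: "curl https://localai.io/install.sh | sh",
			content: command("local-ai"),
		},
		{
			title: "Use Docker images",
			setup: "# Check the LocalAI docs for images matching your hardware",
			content: command("docker run -p 8080:8080 localai/localai"),
		},
	];
};
```

Returning an array of titled objects (rather than one flat `string[]`) is what lets the Hub render each install method separately, as it already does for llama.cpp.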

Contributor Author

Inverted as requested in 3fca3d6!

Member

Perfect! @mudler, sorry for the back and forth: do you think you'll be able to split the snippets the way I linked above? This way users will have a much better readability experience on the hub.

Contributor Author

Perfect! @mudler, sorry for the back and forth: do you think you'll be able to split the snippets the way I linked above? This way users will have a much better readability experience on the hub.

I can try; JavaScript isn't really my cup of tea 😅

Contributor Author

Updated in 1d5064c; let me know if that looks good!

mudler added 2 commits August 9, 2024 16:32
address comments from review

Signed-off-by: mudler <mudler@localai.io>
Signed-off-by: mudler <mudler@localai.io>
const snippetLocalAI = (model: ModelData, filepath?: string): LocalAppSnippet[] => {
const command = (binary: string) =>
[
`huggingface://${model.id}/${filepath ?? "{{GGUF_FILE}}"}`,
Member

The actual commands seem to be missing (i.e., binary is unused) :)

Contributor Author

oops! thanks, fixed in 501c1e9

@enzostvs enzostvs self-requested a review August 9, 2024 14:50
Signed-off-by: mudler <mudler@localai.io>
Member

@Vaibhavs10 Vaibhavs10 left a comment

Nice! LGTM! Can you run the linter on this so that the code is formatted properly? 🙏

If you are using VS code then you can just install the prettier extension and it'd automatically format it for you upon save! 🤗

Signed-off-by: mudler <mudler@localai.io>
@mudler
Contributor Author

mudler commented Aug 9, 2024

Nice! LGTM! Can you run the linter on this so that the code is formatted properly? 🙏

If you are using VS code then you can just install the prettier extension and it'd automatically format it for you upon save! 🤗

That did the trick, thank you!

@Vaibhavs10 Vaibhavs10 merged commit fd1d98a into huggingface:main Aug 9, 2024
4 checks passed
@Vaibhavs10
Member

Thanks again for the contribution @mudler 🤗

@mudler
Contributor Author

mudler commented Aug 9, 2024

Thanks again for the contribution @mudler 🤗

Thank you for the quick review! 🫶

@mudler mudler deleted the patch-1 branch August 9, 2024 16:56
@enzostvs
Member

enzostvs commented Aug 14, 2024

Hi @mudler
LocalAI has been added to the local apps on the HF hub 🤗
You could communicate about it too, that would be really nice! 🥳
Thanks again for your contribution!
[Screenshot 2024-08-14 at 9:35:49 AM]

[Screenshot 2024-08-14 at 9:39:47 AM]

@mudler
Contributor Author

mudler commented Aug 14, 2024

Hi @mudler LocalAI has been added to the local apps on the HF hub 🤗 You could communicate on it too, would be really nice! 🥳 Thanks again for your contribution! [screenshots]

Thank you! This is awesome!! Thanks for the ping, will definitely do that 🫶
