
feat: split core and extra images with additional backends #1162

Closed
9876691 opened this issue Oct 11, 2023 · 5 comments · Fixed by #1309
Labels: enhancement, roadmap

Comments

@9876691

9876691 commented Oct 11, 2023

The image is so large that I can't run a GitHub Action with it, as the runner runs out of disk space.

quay.io/go-skynet/local-ai v1.30.0  ba9a2b1bff11   2 weeks ago   14GB

Using dive I can see the image has CUDA installed, and this takes up a lot of space.
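For reference, the layer breakdown can also be checked with a couple of commands (the image tag is the one from the listing above):

```sh
# Show the size of each layer in the published image
docker history quay.io/go-skynet/local-ai:v1.30.0

# Or browse the layers and their contents interactively with dive
dive quay.io/go-skynet/local-ai:v1.30.0
```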

Possible solution

Have a CPU-only image with no CUDA?

@9876691 added the bug label Oct 11, 2023
@mudler
Owner

mudler commented Oct 11, 2023

Fair point, the images got larger as backends were added, and I guess they will only get bigger due to #1144.

We could, at least, split the images into a core version and an extra version, where the latter also comes with all the Python backends.
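A rough sketch of how such a split could be wired up with a multi-stage Dockerfile and a build argument (the stage names and the `IMAGE_TYPE` values here are hypothetical, not the project's actual build setup):

```dockerfile
# Hypothetical sketch: select the final image variant via a build argument
ARG IMAGE_TYPE=extras

FROM ubuntu:22.04 AS core
# ... build the Go/C++ core and the default backends here ...

FROM core AS extras
# ... add CUDA and the Python backends on top of the core image here ...

# Pick the published variant with: docker build --build-arg IMAGE_TYPE=core .
FROM ${IMAGE_TYPE} AS final
```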

@mudler changed the title from "The docker image is very large 14GB on disk" to "feat: split core and extra images with additional backends" Oct 11, 2023
@mudler added the enhancement and roadmap labels and removed the bug label Oct 11, 2023
@Aisuko
Collaborator

Aisuko commented Oct 11, 2023

Can we split the backends by type, like C++, Rust, and Python? These backends could then be plugins.

@localai-bot
Contributor

⚠️⚠️⚠️⚠️⚠️

Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!

but... I can also be funny or helpful 😸 and, based on what you wrote in the issue, I can generally give good tips or point to places in the documentation or the code worth looking at.

Don't engage in conversation with me,
I don't support (yet) replying!

⚠️⚠️⚠️⚠️⚠️

Yes, that is a possible solution. You could create a new Docker image without CUDA and upload it to quay.io. Then you can use that image in your docker-compose file instead of the current one. This will reduce the size of the image and let you run GitHub Actions without running out of space.


@mudler
Owner

mudler commented Nov 3, 2023

This is still very much wanted as of today. We now have a flag to build smaller images (IMAGE_TYPE=core), but it needs to be added to the pipelines.
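For reference, a hedged example of how the flag might be passed when building locally (the exact build-arg wiring and any Makefile target are assumptions, so check the repository's Dockerfile/Makefile):

```sh
# Assumed invocation: request the smaller core variant via a build argument
docker build --build-arg IMAGE_TYPE=core -t local-ai:core .
```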

@mudler
Copy link
Owner

mudler commented Nov 20, 2023

#1309 introduces -core images; let's check what builds out of master and see if we can improve their size.
