
chore(deps): update container image docker.io/localai/localai to v2.20.0 by renovate #25426

Merged
merged 1 commit into master on Aug 23, 2024

Conversation

truecharts-admin
Collaborator

This PR contains the following updates:

Package                   | Update | Change
docker.io/localai/localai | minor  | v2.19.4-aio-cpu -> v2.20.0-aio-cpu
docker.io/localai/localai | minor  | v2.19.4-aio-gpu-nvidia-cuda-12 -> v2.20.0-aio-gpu-nvidia-cuda-12
docker.io/localai/localai | minor  | v2.19.4-cublas-cuda11-ffmpeg-core -> v2.20.0-cublas-cuda11-ffmpeg-core
docker.io/localai/localai | minor  | v2.19.4-cublas-cuda11-core -> v2.20.0-cublas-cuda11-core
docker.io/localai/localai | minor  | v2.19.4-cublas-cuda12-core -> v2.20.0-cublas-cuda12-core
docker.io/localai/localai | minor  | v2.19.4-ffmpeg-core -> v2.20.0-ffmpeg-core
docker.io/localai/localai | minor  | v2.19.4 -> v2.20.0

Warning

Some dependencies could not be looked up. Check the Dependency Dashboard for more information.


Release Notes

mudler/LocalAI (docker.io/localai/localai)

v2.20.0

Compare Source


TL;DR
  • 🌍 Explorer & Community: Explore global community pools at explorer.localai.io
  • 👀 Demo instance available: Test out LocalAI at demo.localai.io
  • 🤗 Integration: Hugging Face Local apps now include LocalAI
  • 🐛 Bug Fixes: Diffusers and hipblas issues resolved
  • 🎨 New Feature: FLUX-1 image generation support
  • 🏎️ Strict Mode: Stay compliant with OpenAI’s latest API changes
  • 💪 Multiple P2P Clusters: Run multiple clusters within the same network
  • 🧪 Deprecation Notice: gpt4all.cpp and petals backends deprecated

🌍 Explorer and Global Community Pools

Now you can share your LocalAI instance with the global community or explore available instances by visiting explorer.localai.io. This decentralized network powers our demo instance, creating a truly collaborative AI experience.

[Image: Explorer Global Community Pools]

How It Works

Using the Explorer, you can easily share or connect to clusters. For detailed instructions on creating new clusters or connecting to existing ones, check out our documentation.

👀 Demo Instance Now Available

Curious about what LocalAI can do? Dive right in with our live demo at demo.localai.io! Thanks to our generous sponsors, this instance is publicly available and configured via peer-to-peer (P2P) networks. If you'd like to connect, follow the instructions here.

🤗 Hugging Face Integration

I am excited to announce that LocalAI is now integrated within Hugging Face’s local apps! This means you can select LocalAI directly within Hugging Face to build and deploy models with the power and flexibility of our platform. Experience seamless integration with a single click!

[Image: Hugging Face integration in action]

This integration was made possible through this PR.

🎨 FLUX-1 Image Generation Support

FLUX-1 lands in LocalAI! With this update, LocalAI can now generate stunning images using FLUX-1, even in federated mode. Whether you're experimenting with new designs or creating production-quality visuals, FLUX-1 has you covered.

Try it out at demo.localai.io and see what LocalAI + FLUX-1 can do!

[Image: FLUX-1 image generation example]
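As an illustrative sketch (the endpoint follows LocalAI's OpenAI-compatible image API; the port and the model name "flux.1" are placeholders, not values taken from these release notes), an image-generation request could look like:

```shell
# Hypothetical request to a locally running LocalAI instance.
# Port 8080 and the model name "flux.1" are illustrative placeholders.
curl http://localhost:8080/v1/images/generations \
  -H "Content-Type: application/json" \
  -d '{
        "model": "flux.1",
        "prompt": "a lighthouse at dawn, watercolor style",
        "size": "1024x1024"
      }'
```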

🐛 Diffusers and hipblas Fixes

Great news for AMD users! If you’ve encountered issues with the Diffusers backend or hipblas, those bugs have been resolved. We’ve transitioned to uv for managing Python dependencies, ensuring a smoother experience. For more details, check out Issue #1592.

🏎️ Strict Mode for API Compliance

To stay up to date with OpenAI’s latest changes, LocalAI now also supports Strict Mode ( https://openai.com/index/introducing-structured-outputs-in-the-api/ ). This new feature ensures compatibility with the most recent API updates, enforcing stricter JSON outputs using BNF grammar rules.

To activate it, set strict: true in your API calls; this takes effect even if strict mode is disabled in your configuration.

Key Notes:
  • Setting strict: true enables grammar enforcement, even if disabled in your config.
  • If format_type is set to json_schema, BNF grammars will be automatically generated from the schema.
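As a sketch (the endpoint and model name are placeholders, and the placement of the strict field follows the wording above rather than a verified request schema), a strict-mode call might look like:

```shell
# Hypothetical chat request to a local LocalAI instance; the model name
# "my-model" is a placeholder. "strict": true enforces grammar-constrained
# JSON output even if strict mode is disabled in the configuration.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "my-model",
        "messages": [{"role": "user", "content": "List three colors as JSON."}],
        "strict": true
      }'
```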
🛑 Disable Gallery

Need to streamline your setup? You can now disable the gallery endpoint using LOCALAI_DISABLE_GALLERY_ENDPOINT. For more options, check out the full list of commands with --help.
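For example (the variable name comes from the release notes; the local-ai run invocation is illustrative):

```shell
# Turn off the gallery endpoint before starting LocalAI.
export LOCALAI_DISABLE_GALLERY_ENDPOINT=true
local-ai run
```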

🌞 P2P and Federation Enhancements

Several enhancements have been made to improve your experience with P2P and federated clusters:

  • Load Balancing by Default: This feature is now enabled by default (disable it with LOCALAI_RANDOM_WORKER if needed).
  • Target Specific Workers: Directly target workers in federated mode using LOCALAI_TARGET_WORKER.
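A minimal sketch of both settings (the variable names come from the notes above; the values shown are assumptions about their expected format):

```shell
# Opt out of the default load balancing by picking workers at random.
export LOCALAI_RANDOM_WORKER=true

# Or route requests to one specific worker in the federation; the worker
# name "worker-1" is an illustrative placeholder.
export LOCALAI_TARGET_WORKER=worker-1
```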
💪 Run Multiple P2P Clusters in the Same Network

You can now run multiple clusters within the same network by specifying a network ID via CLI. This allows you to logically separate clusters while using the same shared token. Just set LOCALAI_P2P_NETWORK_ID to a UUID that matches across instances.

Please note, while this offers segmentation, it’s not fully secure—anyone with the network token can view available services within the network.
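For instance, every instance that should join the same logical cluster exports the same network ID alongside the shared token (the UUID below is just an example value):

```shell
# Instances with matching LOCALAI_P2P_NETWORK_ID values form one logical
# cluster; a second cluster on the same network would use a different UUID.
export LOCALAI_P2P_NETWORK_ID=123e4567-e89b-12d3-a456-426614174000
```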

🧪 Deprecation Notice: gpt4all.cpp and petals Backends

As we continue to evolve, we are officially deprecating the gpt4all.cpp and petals backends. The newer llama.cpp offers a superior set of features and better performance, making it the preferred choice moving forward.

From this release onward, gpt4all models in ggml format are no longer compatible. The petals backend is likewise deprecated; LocalAI’s new P2P capabilities offer a comprehensive replacement for its functionality.

What's Changed
Breaking Changes 🛠
Bug fixes 🐛
Exciting New Features 🎉
🧠 Models
📖 Documentation and examples
👒 Dependencies
Other Changes
New Contributors

Full Changelog: mudler/LocalAI@v2.19.4...v2.20.0


Configuration

📅 Schedule: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 Automerge: Enabled.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about these updates again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.

@truecharts-admin truecharts-admin added the automerge Categorises a PR or issue that references a new App. label Aug 23, 2024
@truecharts-admin truecharts-admin enabled auto-merge (squash) August 23, 2024 00:35

📝 Linting results:

✔️ Linting [charts/stable/local-ai]: Passed - Took 1 seconds
Total Charts Linted: 1
Total Charts Passed: 1
Total Charts Failed: 0

✅ Linting: Passed - Took 1 seconds

@truecharts-admin truecharts-admin merged commit 2facf81 into master Aug 23, 2024
14 checks passed
@truecharts-admin truecharts-admin deleted the renovate/docker.io-localai-localai-2.x branch August 23, 2024 00:39