chore(deps): Update Dockerfile #2532

Merged
mudler merged 1 commit into mudler:master on Jun 10, 2024

Conversation

reneleonhardt
Contributor

Description

Updates

  • Update Go inside Dockerfile to 1.22.4 (security fixes to the go command and crypto/x509, html/template, net/http, net/http/cookiejar, net/mail, net/netip and archive/zip)
  • Update protoc-gen-go to 1.34.1 and protoc-gen-go-grpc to 1.4.0 (fixes CVE-2024-24786)
  • Update CUDA to 11.8
  • Update grpc to 1.64.2 (fixes CVE-2023-44487)
  • Update protoc to 27.1
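Version pins like the ones above can be sanity-checked mechanically before merging; a minimal sketch, where the Dockerfile fragment and the `ARG` names are illustrative assumptions rather than the PR's actual diff:

```shell
# Write an illustrative Dockerfile fragment (ARG names are assumed,
# not copied from the real LocalAI Dockerfile) and grep the pins.
cat > /tmp/Dockerfile.sample <<'EOF'
ARG GO_VERSION=1.22.4
ARG CUDA_MAJOR_VERSION=11
ARG CUDA_MINOR_VERSION=8
ARG GRPC_VERSION=v1.64.2
EOF
for pin in 'GO_VERSION=1.22.4' 'GRPC_VERSION=v1.64.2'; do
  # grep -q exits non-zero if a pin is missing, so a CI step fails early.
  grep -q "$pin" /tmp/Dockerfile.sample && echo "pin OK: $pin"
done
```

A check like this can run as a pre-merge CI step so that a bump in one place is never silently missed in another.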

Notes for Reviewers

Signed commits

  • Yes, I signed my commits.


netlify bot commented Jun 10, 2024

Deploy Preview for localai canceled.

| Name | Link |
|---|---|
| 🔨 Latest commit | 15af920 |
| 🔍 Latest deploy log | https://app.netlify.com/sites/localai/deploys/66669a15960928000868954c |

Signed-off-by: Rene Leonhardt <65483435+reneleonhardt@users.noreply.github.com>
Owner

@mudler left a comment:

Thanks @reneleonhardt!

@mudler mudler enabled auto-merge (squash) June 10, 2024 07:15
@reneleonhardt (Contributor, Author)

You're welcome!
The next one will be much bigger 😅
29 security fixes, not counting other Dockerfiles and upstream projects which have not accepted update PRs yet 🔒

@reneleonhardt (Contributor, Author)

@mudler Can you explain briefly why all AI projects use the two-year-old CUDA 11.7 instead of 12.5?
Wouldn't it allow better and faster acceleration? Which AI developer wouldn't update their CUDA stack to the exciting latest version? 🤔
I would expect at least the official Dockerfiles to use it, even if local builds keep the ancient version (for compatibility reasons?)...

@mudler (Owner) commented Jun 10, 2024

> @mudler Can you explain briefly why all AI projects use the two-year-old CUDA 11.7 instead of 12.5? Wouldn't it allow better and faster acceleration? Which AI developer wouldn't update their CUDA stack to the exciting latest version? 🤔 I would expect at least the official Dockerfiles to use it, even if local builds keep the ancient version (for compatibility reasons?)...

Compatibility reasons, mostly. There is a lot of Nvidia hardware around that isn't compatible with CUDA 12 at all; for instance, I even have a Jetson Nano here that is stuck at CUDA 10 (!).

However, it would make sense to default builds to CUDA 12 while keeping CUDA 11 releases for those with older hardware.
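The cuda11/cuda12 image split is easy to script on the consumer side; a minimal sketch assuming the project's published tag naming (the detection value is hard-coded for illustration; on a real host it would come from `nvidia-smi`):

```shell
# Pick a LocalAI image tag from the host's CUDA major version.
# Tag names follow the cuda11/cuda12 scheme used in the releases;
# CUDA_MAJOR is hard-coded here as an example value.
CUDA_MAJOR=11
case "$CUDA_MAJOR" in
  12) TAG="latest-cublas-cuda12-ffmpeg-core" ;;
  11) TAG="latest-cublas-cuda11-ffmpeg-core" ;;
  *)  TAG="latest-ffmpeg-core" ;;  # CPU-only fallback for older hardware
esac
echo "docker pull localai/localai:$TAG"
```

This keeps a single entry point for users while still shipping both CUDA generations.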

> You're welcome! The next one will be much bigger 😅 29 security fixes, not counting other Dockerfiles and upstream projects which have not accepted update PRs yet 🔒

Thank you! Much appreciated!

@mudler mudler merged commit b4d4c0a into mudler:master Jun 10, 2024
32 checks passed
truecharts-admin referenced this pull request in truecharts/public Jun 18, 2024
…7.0 by renovate (#23480)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-aio-cpu` -> `v2.17.0-aio-cpu` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-cublas-cuda11-ffmpeg-core` -> `v2.17.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-cublas-cuda11-core` -> `v2.17.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-cublas-cuda12-ffmpeg-core` -> `v2.17.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-cublas-cuda12-core` -> `v2.17.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0-ffmpeg-core` -> `v2.17.0-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.16.0` -> `v2.17.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

###
[`v2.17.0`](https://github.com/mudler/LocalAI/releases/tag/v2.17.0)

[Compare
Source](https://github.com/mudler/LocalAI/compare/v2.16.0...v2.17.0)

![local-ai-release-2 17-shadow](https://github.com/mudler/LocalAI/assets/2420543/69025f5a-96bd-4ffa-9862-6e651c71345d)

Ahoj! This new release of LocalAI comes with tons of updates and enhancements behind the scenes!

##### 🌟 Highlights TLDR;

-   Automatic identification of GGUF models
-   New WebUI page to talk with an LLM!
-   https://models.localai.io is live! 🚀
-   Better arm64 and Apple silicon support
-   More models in the gallery!
-   New quickstart installer script
-   Enhancements to mixed grammar support
-   Major improvements to transformers
-   Linux single binary now supports rocm, nvidia, and intel

##### 🤖 Automatic model identification for llama.cpp-based models

Just drop your GGUF files into the model folders, and let LocalAI handle
the configurations. YAML files are now reserved for those who love to
tinker with advanced setups.

##### 🔊 Talk to your LLM!

Introduced a new page that allows direct interaction with the LLM using
audio transcription and TTS capabilities. This feature is great fun: you
can now talk with any LLM in just a couple of clicks.
![Screenshot from 2024-06-08 12-44-41](https://github.com/mudler/LocalAI/assets/2420543/c7926eb9-b91f-47dd-be32-68fdb10e6bc7)

##### 🍏 Apple single-binary

Experience enhanced support for the Apple ecosystem with a comprehensive
single binary that packs all necessary libraries, ensuring LocalAI runs
smoothly on macOS and ARM64 architectures.

##### ARM64

Expanded our support for ARM64 with new Docker images and single binary
options, ensuring better compatibility and performance on ARM-based
systems.

Note: currently we support only ARM core images, for instance:
`localai/localai:master-ffmpeg-core`,
`localai/localai:latest-ffmpeg-core`,
`localai/localai:v2.17.0-ffmpeg-core`.

##### 🐞 Bug Fixes and small enhancements

We’ve ironed out several issues, including image endpoint response types
and other minor problems, boosting the stability and reliability of our
applications. It is now also possible to enable CSRF when starting
LocalAI, thanks to
[@&#8203;dave-gray101](https://github.com/dave-gray101).

##### 🌐 Models and Galleries

Enhanced the model gallery with new additions like Mirai Nova and Mahou,
plus several updates to existing models, ensuring better performance and
accuracy.

You can now also browse new models at https://models.localai.io, without
running LocalAI!

##### Installation and Setup

A new install.sh script is now available for quick and hassle-free
installations, streamlining the setup process for new users.

    curl https://localai.io/install.sh | sh

Installation can be configured with environment variables, for example:

    curl https://localai.io/install.sh | VAR=value sh

List of the environment variables:

-   DOCKER_INSTALL: Set to "true" to enable the installation of Docker images.
-   USE_AIO: Set to "true" to use the all-in-one LocalAI Docker image.
-   API_KEY: Specify an API key for accessing LocalAI, if required.
-   CORE_IMAGES: Set to "true" to download core LocalAI images.
-   PORT: Specifies the port on which LocalAI will run (default is 8080).
-   THREADS: Number of processor threads the application should use. Defaults to the number of logical cores minus one.
-   VERSION: Specifies the version of LocalAI to install. Defaults to the latest available version.
-   MODELS_PATH: Directory path where LocalAI models are stored (default is /usr/share/local-ai/models).
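Putting a few of these together, a sketch of a pinned, non-interactive install invocation (the values are illustrative; the variables are the documented ones above, and the command string is only printed here, not executed):

```shell
# Compose the installer invocation with a pinned version, a custom port
# and a custom models path. We build and print the command string only;
# on a real host you would run it instead of echoing it.
INSTALL_ENV="VERSION=2.17.0 PORT=9090 MODELS_PATH=/data/models"
CMD="curl https://localai.io/install.sh | $INSTALL_ENV sh"
echo "$CMD"
```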

We are looking into improving the installer, and as this is a first
iteration, any feedback is welcome! Open an
[issue](https://github.com/mudler/LocalAI/issues/new/choose) if
something doesn't work for you!

##### Enhancements to mixed grammar support

Mixed grammar support continues to receive improvements behind the
scenes.

##### 🐍  Transformers backend enhancements

-   Temperature = 0 is correctly handled as greedy search
-   Custom words are handled as stop words
-   Implemented KV cache
-   Phi 3 no longer requires the `trust_remote_code: true` flag

Shout-out to [@&#8203;fakezeta](https://github.com/fakezeta) for these
enhancements!

##### Install models with the CLI

Now the CLI can install models directly from the gallery. For instance:

    local-ai run <model_name_in_gallery>

This command ensures the model is installed in the model folder at
startup.
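Together with `local-ai models install` (added in this release for gallery installs), pre-seeding a model folder can be scripted; a sketch using hypothetical model names (check https://models.localai.io for real gallery entries):

```shell
# Queue a few gallery installs; the model names below are hypothetical
# placeholders. Drop the `echo` to actually run the commands.
for model in hypothetical-model-a hypothetical-model-b; do
  echo "local-ai models install $model"
done
```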

##### 🐧 Linux single binary now supports rocm, nvidia, and intel

Single binaries for Linux now contain Intel, AMD GPU, and NVIDIA
support. Note that you need to install the dependencies separately in
the system to leverage these features. In upcoming releases, this
requirement will be handled by the installer script.

##### 📣 Let's Make Some Noise!

A gigantic THANK YOU to everyone who’s contributed—your feedback, bug
squashing, and feature suggestions are what make LocalAI shine. To all
our heroes out there supporting other users and sharing their expertise,
you’re the real MVPs!

Remember, LocalAI thrives on community support—not big corporate bucks.
If you love what we're building, show some love! A shoutout on social
(@&#8203;LocalAI_OSS and @&#8203;mudler_it on twitter/X), joining our
sponsors, or simply starring us on GitHub makes all the difference.

Also, if you haven't yet joined our Discord, come on over! Here's the
link: https://discord.gg/uJAeKSAGDy

Thanks a ton, and.. enjoy this release!

##### What's Changed

##### Bug fixes 🐛

- fix: gpu fetch device info by
[@&#8203;sozercan](https://github.com/sozercan) in
[https://github.com/mudler/LocalAI/pull/2403](https://github.com/mudler/LocalAI/pull/2403)
- fix(watcher): do not emit fatal errors by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2410](https://github.com/mudler/LocalAI/pull/2410)
- fix: install pytorch from proper index for hipblas builds by
[@&#8203;cryptk](https://github.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2413](https://github.com/mudler/LocalAI/pull/2413)
- fix: pin version of setuptools for intel builds to work around
[#&#8203;2406](https://github.com/mudler/LocalAI/issues/2406) by
[@&#8203;cryptk](https://github.com/cryptk) in
[https://github.com/mudler/LocalAI/pull/2414](https://github.com/mudler/LocalAI/pull/2414)
- bugfix: CUDA acceleration not working by
[@&#8203;fakezeta](https://github.com/fakezeta) in
[https://github.com/mudler/LocalAI/pull/2475](https://github.com/mudler/LocalAI/pull/2475)
- fix: `pkg/downloader` should respect basePath for `file://` urls by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2481](https://github.com/mudler/LocalAI/pull/2481)
- fix: chat webui response parsing by
[@&#8203;sozercan](https://github.com/sozercan) in
[https://github.com/mudler/LocalAI/pull/2515](https://github.com/mudler/LocalAI/pull/2515)
- fix(stream): do not break channel consumption by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2517](https://github.com/mudler/LocalAI/pull/2517)
- fix(Makefile): enable STATIC on dist by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2569](https://github.com/mudler/LocalAI/pull/2569)

##### Exciting New Features 🎉

- feat(images): do not install python deps in the core image by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2425](https://github.com/mudler/LocalAI/pull/2425)
- feat(hipblas): extend default hipblas GPU_TARGETS by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2426](https://github.com/mudler/LocalAI/pull/2426)
- feat(build): add arm64 core containers by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2421](https://github.com/mudler/LocalAI/pull/2421)
- feat(functions): allow parallel calls with mixed/no grammars by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2432](https://github.com/mudler/LocalAI/pull/2432)
- feat(image): support `response_type` in the OpenAI API request by
[@&#8203;prajwalnayak7](https://github.com/prajwalnayak7) in
[https://github.com/mudler/LocalAI/pull/2347](https://github.com/mudler/LocalAI/pull/2347)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2436](https://github.com/mudler/LocalAI/pull/2436)
- feat(functions): better free string matching, allow to expect strings
after JSON by [@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2445](https://github.com/mudler/LocalAI/pull/2445)
- build(Makefile): add back single target to build native llama-cpp by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2448](https://github.com/mudler/LocalAI/pull/2448)
- feat(functions): allow `response_regex` to be a list by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2447](https://github.com/mudler/LocalAI/pull/2447)
- TTS API improvements by [@&#8203;blob42](https://github.com/blob42)
in
[https://github.com/mudler/LocalAI/pull/2308](https://github.com/mudler/LocalAI/pull/2308)
- feat(transformers): various enhancements to the transformers backend
by [@&#8203;fakezeta](https://github.com/fakezeta) in
[https://github.com/mudler/LocalAI/pull/2468](https://github.com/mudler/LocalAI/pull/2468)
- feat(webui): enhance card visibility by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2473](https://github.com/mudler/LocalAI/pull/2473)
- feat(default): use number of physical cores as default by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2483](https://github.com/mudler/LocalAI/pull/2483)
- feat: fiber CSRF by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2482](https://github.com/mudler/LocalAI/pull/2482)
- feat(amdgpu): try to build in single binary by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2485](https://github.com/mudler/LocalAI/pull/2485)
- feat:`OpaqueErrors` to hide error information by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2486](https://github.com/mudler/LocalAI/pull/2486)
- build(intel): bundle intel variants in single-binary by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2494](https://github.com/mudler/LocalAI/pull/2494)
- feat(install): add install.sh for quick installs by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2489](https://github.com/mudler/LocalAI/pull/2489)
- feat(llama.cpp): guess model defaults from file by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2522](https://github.com/mudler/LocalAI/pull/2522)
- feat(ui): add page to talk with voice, transcription, and tts by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2520](https://github.com/mudler/LocalAI/pull/2520)
- feat(arm64): enable single-binary builds by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2490](https://github.com/mudler/LocalAI/pull/2490)
- feat(util): add util command to print GGUF informations by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2528](https://github.com/mudler/LocalAI/pull/2528)
- feat(defaults): add defaults for Command-R models by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2529](https://github.com/mudler/LocalAI/pull/2529)
- feat(detection): detect by template in gguf file, add qwen2, phi,
mistral and chatml by [@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2536](https://github.com/mudler/LocalAI/pull/2536)
- feat(gallery): show available models in website, allow `local-ai
models install` to install from galleries by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2555](https://github.com/mudler/LocalAI/pull/2555)
- feat(gallery): uniform download from CLI by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2559](https://github.com/mudler/LocalAI/pull/2559)
- feat(guesser): identify gemma models by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2561](https://github.com/mudler/LocalAI/pull/2561)
- feat(binary): support extracted bundled libs on darwin by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2563](https://github.com/mudler/LocalAI/pull/2563)
- feat(darwin): embed grpc libs by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2567](https://github.com/mudler/LocalAI/pull/2567)
- feat(build): bundle libs for arm64 and x86 linux binaries by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2572](https://github.com/mudler/LocalAI/pull/2572)
- feat(libpath): refactor and expose functions for external library
paths by [@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2578](https://github.com/mudler/LocalAI/pull/2578)

##### 🧠 Models

- models(gallery): add Mirai Nova by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2405](https://github.com/mudler/LocalAI/pull/2405)
- models(gallery): add Mahou by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2411](https://github.com/mudler/LocalAI/pull/2411)
- models(gallery): add minicpm by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2412](https://github.com/mudler/LocalAI/pull/2412)
- models(gallery): add poppy porpoise 0.85 by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2415](https://github.com/mudler/LocalAI/pull/2415)
- models(gallery): add alpha centauri by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2416](https://github.com/mudler/LocalAI/pull/2416)
- models(gallery): add cream-phi-13b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2417](https://github.com/mudler/LocalAI/pull/2417)
- models(gallery): add stheno-mahou by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2418](https://github.com/mudler/LocalAI/pull/2418)
- models(gallery): add iterative-dpo, fix minicpm by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2422](https://github.com/mudler/LocalAI/pull/2422)
- models(gallery): add una-thepitbull by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2435](https://github.com/mudler/LocalAI/pull/2435)
- models(gallery): add halu by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2434](https://github.com/mudler/LocalAI/pull/2434)
- models(gallery): add neuraldaredevil by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2439](https://github.com/mudler/LocalAI/pull/2439)
- models(gallery): add Codestral by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2442](https://github.com/mudler/LocalAI/pull/2442)
- models(gallery): add mopeymule by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2449](https://github.com/mudler/LocalAI/pull/2449)
- models(gallery): ⬆️ update checksum by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2451](https://github.com/mudler/LocalAI/pull/2451)
- models(gallery): add anjir by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2454](https://github.com/mudler/LocalAI/pull/2454)
- models(gallery): add llama3-11b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2455](https://github.com/mudler/LocalAI/pull/2455)
- models(gallery): add ultron by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2456](https://github.com/mudler/LocalAI/pull/2456)
- models(gallery): add poppy porpoise 1.0 by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2459](https://github.com/mudler/LocalAI/pull/2459)
- models(gallery): add Neural SOVLish Devil by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2460](https://github.com/mudler/LocalAI/pull/2460)
- models(gallery): add all whisper variants by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2462](https://github.com/mudler/LocalAI/pull/2462)
- models(gallery): ⬆️ update checksum by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2463](https://github.com/mudler/LocalAI/pull/2463)
- models(gallery): add gemma-2b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2466](https://github.com/mudler/LocalAI/pull/2466)
- models(gallery): add fimbulvetr iqmatrix version by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2470](https://github.com/mudler/LocalAI/pull/2470)
- models(gallery): add new poppy porpoise versions by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2471](https://github.com/mudler/LocalAI/pull/2471)
- models(gallery): add dolphin-2.9.2-Phi-3-Medium by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2492](https://github.com/mudler/LocalAI/pull/2492)
- models(gallery): add dolphin-2.9.2-phi-3-Medium-abliterated by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2495](https://github.com/mudler/LocalAI/pull/2495)
- models(gallery): add nyun by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2496](https://github.com/mudler/LocalAI/pull/2496)
- models(gallery): add phi-3-4x4b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2497](https://github.com/mudler/LocalAI/pull/2497)
- models(gallery): add llama-3-instruct-8b-SimPO-ExPO by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2498](https://github.com/mudler/LocalAI/pull/2498)
- models(gallery): add Llama-3-Yggdrasil-2.0-8B by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2499](https://github.com/mudler/LocalAI/pull/2499)
- models(gallery): add l3-8b-stheno-v3.2-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2500](https://github.com/mudler/LocalAI/pull/2500)
- models(gallery): add llama3-8B-aifeifei-1.0-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2509](https://github.com/mudler/LocalAI/pull/2509)
- models(gallery): add rawr_llama3\_8b-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2510](https://github.com/mudler/LocalAI/pull/2510)
- models(gallery): add llama3-8b-feifei-1.0-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2511](https://github.com/mudler/LocalAI/pull/2511)
- models(gallery): ⬆️ update checksum by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2519](https://github.com/mudler/LocalAI/pull/2519)
- models(gallery): add llama3-8B-aifeifei-1.2-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2544](https://github.com/mudler/LocalAI/pull/2544)
- models(gallery): add hathor-l3-8b-v.01-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2545](https://github.com/mudler/LocalAI/pull/2545)
- models(gallery): add l3-aethora-15b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2546](https://github.com/mudler/LocalAI/pull/2546)
- models(gallery): add llama-salad-8x8b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2547](https://github.com/mudler/LocalAI/pull/2547)
- models(gallery): add average_normie_v3.69\_8b-iq-imatrix by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2548](https://github.com/mudler/LocalAI/pull/2548)
- models(gallery): add duloxetine by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2549](https://github.com/mudler/LocalAI/pull/2549)
- models(gallery): add badger-lambda-llama-3-8b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2550](https://github.com/mudler/LocalAI/pull/2550)
- models(gallery): add firefly-gemma-7b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2576](https://github.com/mudler/LocalAI/pull/2576)
- models(gallery): add dolphin-qwen by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2580](https://github.com/mudler/LocalAI/pull/2580)
- models(gallery): add tess-v2.5-phi-3-medium-128k-14b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2581](https://github.com/mudler/LocalAI/pull/2581)
- models(gallery): add hathor_stable-v0.2-l3-8b by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2582](https://github.com/mudler/LocalAI/pull/2582)
- models(gallery): add samantha-qwen2 by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2586](https://github.com/mudler/LocalAI/pull/2586)
- models(gallery): add gemma-1.1-7b-it by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2588](https://github.com/mudler/LocalAI/pull/2588)

##### 📖 Documentation and examples

- Update quickstart.md by [@&#8203;mudler](https://github.com/mudler)
in
[https://github.com/mudler/LocalAI/pull/2404](https://github.com/mudler/LocalAI/pull/2404)
- docs: fix p2p commands by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2472](https://github.com/mudler/LocalAI/pull/2472)
- README: update sponsors list by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2476](https://github.com/mudler/LocalAI/pull/2476)
- Add integrations by [@&#8203;reid41](https://github.com/reid41) in
[https://github.com/mudler/LocalAI/pull/2535](https://github.com/mudler/LocalAI/pull/2535)
- docs(gallery): lazy-load images by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2557](https://github.com/mudler/LocalAI/pull/2557)
- Fix standard image latest Docker tags by
[@&#8203;nwithan8](https://github.com/nwithan8) in
[https://github.com/mudler/LocalAI/pull/2574](https://github.com/mudler/LocalAI/pull/2574)

##### 👒 Dependencies

- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2399](https://github.com/mudler/LocalAI/pull/2399)
- ⬆️ Update docs version mudler/LocalAI by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2398](https://github.com/mudler/LocalAI/pull/2398)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2408](https://github.com/mudler/LocalAI/pull/2408)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2409](https://github.com/mudler/LocalAI/pull/2409)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2419](https://github.com/mudler/LocalAI/pull/2419)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2427](https://github.com/mudler/LocalAI/pull/2427)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2428](https://github.com/mudler/LocalAI/pull/2428)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2433](https://github.com/mudler/LocalAI/pull/2433)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2437](https://github.com/mudler/LocalAI/pull/2437)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2438](https://github.com/mudler/LocalAI/pull/2438)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2444](https://github.com/mudler/LocalAI/pull/2444)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2443](https://github.com/mudler/LocalAI/pull/2443)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2452](https://github.com/mudler/LocalAI/pull/2452)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2453](https://github.com/mudler/LocalAI/pull/2453)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2465](https://github.com/mudler/LocalAI/pull/2465)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2467](https://github.com/mudler/LocalAI/pull/2467)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2477](https://github.com/mudler/LocalAI/pull/2477)
- toil: bump grpc version by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2480](https://github.com/mudler/LocalAI/pull/2480)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2487](https://github.com/mudler/LocalAI/pull/2487)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2493](https://github.com/mudler/LocalAI/pull/2493)
- deps(whisper): update, add libcufft-dev by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2501](https://github.com/mudler/LocalAI/pull/2501)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2507](https://github.com/mudler/LocalAI/pull/2507)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2508](https://github.com/mudler/LocalAI/pull/2508)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2518](https://github.com/mudler/LocalAI/pull/2518)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2524](https://github.com/mudler/LocalAI/pull/2524)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2531](https://github.com/mudler/LocalAI/pull/2531)
- chore(deps): Update Dockerfile by
[@&#8203;reneleonhardt](https://github.com/reneleonhardt) in
[https://github.com/mudler/LocalAI/pull/2532](https://github.com/mudler/LocalAI/pull/2532)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2539](https://github.com/mudler/LocalAI/pull/2539)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2540](https://github.com/mudler/LocalAI/pull/2540)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2552](https://github.com/mudler/LocalAI/pull/2552)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2551](https://github.com/mudler/LocalAI/pull/2551)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2554](https://github.com/mudler/LocalAI/pull/2554)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2564](https://github.com/mudler/LocalAI/pull/2564)
- ⬆️ Update ggerganov/whisper.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2565](https://github.com/mudler/LocalAI/pull/2565)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2570](https://github.com/mudler/LocalAI/pull/2570)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2575](https://github.com/mudler/LocalAI/pull/2575)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2584](https://github.com/mudler/LocalAI/pull/2584)
- ⬆️ Update ggerganov/llama.cpp by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2587](https://github.com/mudler/LocalAI/pull/2587)

##### Other Changes

- ci: fix sd release by
[@&#8203;sozercan](https://github.com/sozercan) in
[https://github.com/mudler/LocalAI/pull/2400](https://github.com/mudler/LocalAI/pull/2400)
- ci(grpc-cache): also arm64 by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2423](https://github.com/mudler/LocalAI/pull/2423)
- ci: push test images when building PRs by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2424](https://github.com/mudler/LocalAI/pull/2424)
- ci: pin build-time protoc by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2461](https://github.com/mudler/LocalAI/pull/2461)
- feat(swagger): update swagger by
[@&#8203;localai-bot](https://github.com/localai-bot) in
[https://github.com/mudler/LocalAI/pull/2464](https://github.com/mudler/LocalAI/pull/2464)
- ci: run release build on self-hosted runners by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2505](https://github.com/mudler/LocalAI/pull/2505)
- experiment: `-j4` for `build-linux:` by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2514](https://github.com/mudler/LocalAI/pull/2514)
- test: e2e /reranker endpoint by
[@&#8203;dave-gray101](https://github.com/dave-gray101) in
[https://github.com/mudler/LocalAI/pull/2211](https://github.com/mudler/LocalAI/pull/2211)
- ci: pack less libs inside the binary by
[@&#8203;mudler](https://github.com/mudler) in
[https://github.com/mudler/LocalAI/pull/2579](https://github.com/mudler/LocalAI/pull/2579)

##### New Contributors

- [@&#8203;prajwalnayak7](https://github.com/prajwalnayak7) made their
first contribution in
[https://github.com/mudler/LocalAI/pull/2347](https://github.com/mudler/LocalAI/pull/2347)
- [@&#8203;reneleonhardt](https://github.com/reneleonhardt) made their
first contribution in
[https://github.com/mudler/LocalAI/pull/2532](https://github.com/mudler/LocalAI/pull/2532)
- [@&#8203;reid41](https://github.com/reid41) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2535](https://github.com/mudler/LocalAI/pull/2535)
- [@&#8203;nwithan8](https://github.com/nwithan8) made their first
contribution in
[https://github.com/mudler/LocalAI/pull/2574](https://github.com/mudler/LocalAI/pull/2574)

**Full Changelog**:
mudler/LocalAI@v2.16.0...v2.17.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined),
Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these
updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR has been generated by [Renovate
Bot](https://github.com/renovatebot/renovate).
