feat(parler-tts): Add new backend #2027
Merged
Conversation
This reverts commit bd5941d. Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Problems with using protobuf trace back to descriptinc/audiotools#101.
dave-gray101 approved these changes (Apr 13, 2024):
@mudler snagged on [...]. Otherwise, it looks good - I think an extra environment is a worthy price for another backend in this case.
truecharts-admin referenced this pull request in truecharts/public (Apr 27, 2024):
…3.0 by renovate (#21421)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda11-ffmpeg-core` -> `v2.13.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda11-core` -> `v2.13.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda12-ffmpeg-core` -> `v2.13.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4-cublas-cuda12-core` -> `v2.13.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4-ffmpeg-core` -> `v2.13.0-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.12.4` -> `v2.13.0` |

---

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

---

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

### [`v2.13.0`](https://github.com/mudler/LocalAI/releases/tag/v2.13.0): 🖼️ v2.13.0 - Model gallery edition

[Compare Source](https://github.com/mudler/LocalAI/compare/v2.12.4...v2.13.0)

Hello folks, Ettore here - I'm happy to announce the v2.13.0 LocalAI release is out, with many features!

Below is a small breakdown of the hottest features introduced in this release - however, there are many other improvements (especially from the community) as well, so don't miss the changelog! Check out the full changelog below for an overview of all the changes that went into this release (this one is quite packed).

##### 🖼️ Model gallery

This is the first release with a model gallery in the WebUI: there is now a "Model" button in the WebUI which lands on a selection of models:

![output](https://github.com/mudler/LocalAI/assets/2420543/7b16676e-d5b1-4c97-89bd-9fa5065c21ad)

You can now choose models across stablediffusion, llama3, tts, embeddings and more! The gallery is growing steadily and is kept up-to-date.

The models are simple YAML files hosted in this repository: https://github.com/mudler/LocalAI/tree/master/gallery - you can host your own repository with your model index, or contribute to LocalAI.

If you want to contribute models, you can do so by opening a PR in the `gallery` directory: https://github.com/mudler/LocalAI/tree/master/gallery.

##### Rerankers

I'm excited to introduce a new backend for `rerankers`. LocalAI now implements the Jina API (https://jina.ai/reranker/#apiform) as a compatibility layer, and you can use existing Jina clients and point them at the LocalAI address. Under the hood, it uses https://github.com/AnswerDotAI/rerankers.
![output](https://github.com/mudler/LocalAI/assets/2420543/ede67b25-fac4-4833-ae4f-78290e401e60)

You can test this by using container images with Python (this does **NOT** work with `core` images) and a model config file like this, or by installing `cross-encoder` from the gallery in the UI:

```yaml
name: jina-reranker-v1-base-en
backend: rerankers
parameters:
  model: cross-encoder
```

and test it with:

```bash
curl http://localhost:8080/v1/rerank \
  -H "Content-Type: application/json" \
  -d '{
    "model": "jina-reranker-v1-base-en",
    "query": "Organic skincare products for sensitive skin",
    "documents": [
      "Eco-friendly kitchenware for modern homes",
      "Biodegradable cleaning supplies for eco-conscious consumers",
      "Organic cotton baby clothes for sensitive skin",
      "Natural organic skincare range for sensitive skin",
      "Tech gadgets for smart homes: 2024 edition",
      "Sustainable gardening tools and compost solutions",
      "Sensitive skin-friendly facial cleansers and toners",
      "Organic food wraps and storage solutions",
      "All-natural pet food for dogs with allergies",
      "Yoga mats made from recycled materials"
    ],
    "top_n": 3
  }'
```

##### Parler-tts

There is a new backend available for TTS now, `parler-tts` (https://github.com/huggingface/parler-tts). It is possible to install and configure the model directly from the gallery.

##### 🎈 Lots of small improvements behind the scenes!

Thanks to our outstanding community, we have enhanced the performance and stability of LocalAI across various modules. From backend optimizations to front-end adjustments, every tweak helps make LocalAI smoother and more robust.

##### 📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @LocalAI_OSS and @mudler_it or joining our sponsors can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and here's to more exciting times ahead with LocalAI!
##### What's Changed

##### Bug fixes 🐛

- fix(autogptq): do not use_triton with qwen-vl by [@thiner](https://github.com/thiner) in [https://github.com/mudler/LocalAI/pull/1985](https://github.com/mudler/LocalAI/pull/1985)
- fix: respect concurrency from parent build parameters when building GRPC by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2023](https://github.com/mudler/LocalAI/pull/2023)
- ci: fix release pipeline missing dependencies by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2025](https://github.com/mudler/LocalAI/pull/2025)
- fix: remove build path from help text documentation by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2037](https://github.com/mudler/LocalAI/pull/2037)
- fix: previous CLI rework broke debug logging by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2036](https://github.com/mudler/LocalAI/pull/2036)
- fix(fncall): fix regression introduced in [#1963](https://github.com/mudler/LocalAI/issues/1963) by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2048](https://github.com/mudler/LocalAI/pull/2048)
- fix: adjust some sources names to match the naming of their repositories by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2061](https://github.com/mudler/LocalAI/pull/2061)
- fix: move the GRPC cache generation workflow into it's own concurrency group by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2071](https://github.com/mudler/LocalAI/pull/2071)
- fix(llama.cpp): set -1 as default for max tokens by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2087](https://github.com/mudler/LocalAI/pull/2087)
- fix(llama.cpp-ggml): fixup `max_tokens` for old backend by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2094](https://github.com/mudler/LocalAI/pull/2094)
- fix missing TrustRemoteCode in OpenVINO model load by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2114](https://github.com/mudler/LocalAI/pull/2114)
- Incl ocv pkg for diffsusers utils by [@jtwolfe](https://github.com/jtwolfe) in [https://github.com/mudler/LocalAI/pull/2115](https://github.com/mudler/LocalAI/pull/2115)

##### Exciting New Features 🎉

- feat: kong cli refactor fixes [#1955](https://github.com/mudler/LocalAI/issues/1955) by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/1974](https://github.com/mudler/LocalAI/pull/1974)
- feat: add flash-attn in nvidia and rocm envs by [@golgeek](https://github.com/golgeek) in [https://github.com/mudler/LocalAI/pull/1995](https://github.com/mudler/LocalAI/pull/1995)
- feat: use tokenizer.apply_chat_template() in vLLM by [@golgeek](https://github.com/golgeek) in [https://github.com/mudler/LocalAI/pull/1990](https://github.com/mudler/LocalAI/pull/1990)
- feat(gallery): support ConfigURLs by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2012](https://github.com/mudler/LocalAI/pull/2012)
- fix: dont commit generated files to git by [@cryptk](https://github.com/cryptk) in
  [https://github.com/mudler/LocalAI/pull/1993](https://github.com/mudler/LocalAI/pull/1993)
- feat(parler-tts): Add new backend by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2027](https://github.com/mudler/LocalAI/pull/2027)
- feat(grpc): return consumed token count and update response accordingly by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2035](https://github.com/mudler/LocalAI/pull/2035)
- feat(store): add Golang client by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/1977](https://github.com/mudler/LocalAI/pull/1977)
- feat(functions): support models with no grammar, add tests by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2068](https://github.com/mudler/LocalAI/pull/2068)
- refactor(template): isolate and add tests by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2069](https://github.com/mudler/LocalAI/pull/2069)
- feat: fiber logs with zerlog and add trace level by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2082](https://github.com/mudler/LocalAI/pull/2082)
- models(gallery): add gallery by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2078](https://github.com/mudler/LocalAI/pull/2078)
- Add tensor_parallel_size setting to vllm setting items by [@Taikono-Himazin](https://github.com/Taikono-Himazin) in [https://github.com/mudler/LocalAI/pull/2085](https://github.com/mudler/LocalAI/pull/2085)
- Transformer Backend: Implementing use_tokenizer_template and stop_prompts options by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2090](https://github.com/mudler/LocalAI/pull/2090)
- feat: Galleries UI by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2104](https://github.com/mudler/LocalAI/pull/2104)
- Transformers Backend: max_tokens adherence to OpenAI API by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2108](https://github.com/mudler/LocalAI/pull/2108)
- Fix cleanup sonarqube findings by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2106](https://github.com/mudler/LocalAI/pull/2106)
- feat(models-ui): minor visual enhancements by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2109](https://github.com/mudler/LocalAI/pull/2109)
- fix(gallery): show a fake image if no there is no icon by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2111](https://github.com/mudler/LocalAI/pull/2111)
- feat(rerankers): Add new backend, support jina rerankers API by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2121](https://github.com/mudler/LocalAI/pull/2121)

##### 🧠 Models

- models(llama3): add llama3 to embedded models by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2074](https://github.com/mudler/LocalAI/pull/2074)
- feat(gallery): add llama3, hermes, phi-3, and others by [@mudler](https://github.com/mudler) in
  [https://github.com/mudler/LocalAI/pull/2110](https://github.com/mudler/LocalAI/pull/2110)
- models(gallery): add new models to the gallery by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2124](https://github.com/mudler/LocalAI/pull/2124)
- models(gallery): add more models by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2129](https://github.com/mudler/LocalAI/pull/2129)

##### 📖 Documentation and examples

- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1988](https://github.com/mudler/LocalAI/pull/1988)
- docs: fix stores link by [@adrienbrault](https://github.com/adrienbrault) in [https://github.com/mudler/LocalAI/pull/2044](https://github.com/mudler/LocalAI/pull/2044)
- AMD/ROCm Documentation update + formatting fix by [@jtwolfe](https://github.com/jtwolfe) in [https://github.com/mudler/LocalAI/pull/2100](https://github.com/mudler/LocalAI/pull/2100)

##### 👒 Dependencies

- deps: Update version of vLLM to add support of Cohere Command_R model in vLLM inference by [@holyCowMp3](https://github.com/holyCowMp3) in [https://github.com/mudler/LocalAI/pull/1975](https://github.com/mudler/LocalAI/pull/1975)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/1991](https://github.com/mudler/LocalAI/pull/1991)
- build(deps): bump google.golang.org/protobuf from 1.31.0 to 1.33.0 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/1998](https://github.com/mudler/LocalAI/pull/1998)
- build(deps): bump github.com/docker/docker from 20.10.7+incompatible to 24.0.9+incompatible by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/1999](https://github.com/mudler/LocalAI/pull/1999)
- build(deps): bump github.com/gofiber/fiber/v2 from 2.52.0 to 2.52.1 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2001](https://github.com/mudler/LocalAI/pull/2001)
- build(deps): bump actions/checkout from 3 to 4 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2002](https://github.com/mudler/LocalAI/pull/2002)
- build(deps): bump actions/setup-go from 4 to 5 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2003](https://github.com/mudler/LocalAI/pull/2003)
- build(deps): bump peter-evans/create-pull-request from 5 to 6 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2005](https://github.com/mudler/LocalAI/pull/2005)
- build(deps): bump actions/cache from 3 to 4 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2006](https://github.com/mudler/LocalAI/pull/2006)
- build(deps): bump actions/upload-artifact from 3 to 4 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2007](https://github.com/mudler/LocalAI/pull/2007)
- build(deps): bump github.com/charmbracelet/glamour from 0.6.0 to 0.7.0 by [@dependabot](https://github.com/dependabot) in
  [https://github.com/mudler/LocalAI/pull/2004](https://github.com/mudler/LocalAI/pull/2004)
- build(deps): bump github.com/gofiber/fiber/v2 from 2.52.0 to 2.52.4 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2008](https://github.com/mudler/LocalAI/pull/2008)
- build(deps): bump github.com/opencontainers/runc from 1.1.5 to 1.1.12 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2000](https://github.com/mudler/LocalAI/pull/2000)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2014](https://github.com/mudler/LocalAI/pull/2014)
- build(deps): bump the pip group across 4 directories with 8 updates by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2017](https://github.com/mudler/LocalAI/pull/2017)
- build(deps): bump follow-redirects from 1.15.2 to 1.15.6 in /examples/langchain/langchainjs-localai-example by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2020](https://github.com/mudler/LocalAI/pull/2020)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2024](https://github.com/mudler/LocalAI/pull/2024)
- build(deps): bump softprops/action-gh-release from 1 to 2 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2039](https://github.com/mudler/LocalAI/pull/2039)
- build(deps): bump dependabot/fetch-metadata from 1.3.4 to 2.0.0 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2040](https://github.com/mudler/LocalAI/pull/2040)
- build(deps): bump github/codeql-action from 2 to 3 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2041](https://github.com/mudler/LocalAI/pull/2041)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2043](https://github.com/mudler/LocalAI/pull/2043)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2042](https://github.com/mudler/LocalAI/pull/2042)
- build(deps): bump the pip group across 4 directories with 8 updates by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2049](https://github.com/mudler/LocalAI/pull/2049)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2050](https://github.com/mudler/LocalAI/pull/2050)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2060](https://github.com/mudler/LocalAI/pull/2060)
- build(deps): bump aiohttp from 3.9.2 to 3.9.4 in /examples/langchain/langchainpy-localai-example in the pip group across 1 directory by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2067](https://github.com/mudler/LocalAI/pull/2067)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in
  [https://github.com/mudler/LocalAI/pull/2089](https://github.com/mudler/LocalAI/pull/2089)
- deps(llama.cpp): update, use better model for function call tests by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2119](https://github.com/mudler/LocalAI/pull/2119)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2122](https://github.com/mudler/LocalAI/pull/2122)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2123](https://github.com/mudler/LocalAI/pull/2123)
- build(deps): bump pydantic from 1.10.7 to 1.10.13 in /examples/langchain/langchainpy-localai-example in the pip group across 1 directory by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2125](https://github.com/mudler/LocalAI/pull/2125)
- feat(swagger): update swagger by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2128](https://github.com/mudler/LocalAI/pull/2128)

##### Other Changes

- ci: try to build on macos14 by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2011](https://github.com/mudler/LocalAI/pull/2011)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2013](https://github.com/mudler/LocalAI/pull/2013)
- refactor: backend/service split, channel-based llm flow by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/1963](https://github.com/mudler/LocalAI/pull/1963)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2028](https://github.com/mudler/LocalAI/pull/2028)
- fix - correct checkout versions by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2029](https://github.com/mudler/LocalAI/pull/2029)
- Revert "build(deps): bump the pip group across 4 directories with 8 updates" by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2030](https://github.com/mudler/LocalAI/pull/2030)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2032](https://github.com/mudler/LocalAI/pull/2032)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2033](https://github.com/mudler/LocalAI/pull/2033)
- fix: action-tmate back to upstream, dead code removal by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2038](https://github.com/mudler/LocalAI/pull/2038)
- Revert [#1963](https://github.com/mudler/LocalAI/issues/1963) by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2056](https://github.com/mudler/LocalAI/pull/2056)
- feat: refactor the dynamic json configs for api_keys and external_backends by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2055](https://github.com/mudler/LocalAI/pull/2055)
- tests: add template tests by
  [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2063](https://github.com/mudler/LocalAI/pull/2063)
- feat: better control of GRPC docker cache by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2070](https://github.com/mudler/LocalAI/pull/2070)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2051](https://github.com/mudler/LocalAI/pull/2051)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2080](https://github.com/mudler/LocalAI/pull/2080)
- feat: enable polling configs for systems with broken fsnotify (docker volumes on windows) by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2081](https://github.com/mudler/LocalAI/pull/2081)
- fix: action-tmate: use connect-timeout-sections and limit-access-to-actor by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2083](https://github.com/mudler/LocalAI/pull/2083)
- refactor(routes): split routes registration by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2077](https://github.com/mudler/LocalAI/pull/2077)
- fix: action-tmate detached by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2092](https://github.com/mudler/LocalAI/pull/2092)
- fix: rename fiber entrypoint from http/api to http/app by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2096](https://github.com/mudler/LocalAI/pull/2096)
- fix: typo in models.go by [@eltociear](https://github.com/eltociear) in [https://github.com/mudler/LocalAI/pull/2099](https://github.com/mudler/LocalAI/pull/2099)
- Update text-generation.md by [@Taikono-Himazin](https://github.com/Taikono-Himazin) in [https://github.com/mudler/LocalAI/pull/2095](https://github.com/mudler/LocalAI/pull/2095)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2105](https://github.com/mudler/LocalAI/pull/2105)
- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2113](https://github.com/mudler/LocalAI/pull/2113)

##### New Contributors

- [@holyCowMp3](https://github.com/holyCowMp3) made their first contribution in [https://github.com/mudler/LocalAI/pull/1975](https://github.com/mudler/LocalAI/pull/1975)
- [@dependabot](https://github.com/dependabot) made their first contribution in [https://github.com/mudler/LocalAI/pull/1998](https://github.com/mudler/LocalAI/pull/1998)
- [@adrienbrault](https://github.com/adrienbrault) made their first contribution in [https://github.com/mudler/LocalAI/pull/2044](https://github.com/mudler/LocalAI/pull/2044)
- [@Taikono-Himazin](https://github.com/Taikono-Himazin) made their first contribution in [https://github.com/mudler/LocalAI/pull/2085](https://github.com/mudler/LocalAI/pull/2085)
- [@eltociear](https://github.com/eltociear) made their first contribution in
  [https://github.com/mudler/LocalAI/pull/2099](https://github.com/mudler/LocalAI/pull/2099)
- [@jtwolfe](https://github.com/jtwolfe) made their first contribution in [https://github.com/mudler/LocalAI/pull/2100](https://github.com/mudler/LocalAI/pull/2100)

**Full Changelog**: mudler/LocalAI@v2.12.4...V2.13.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these updates again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://github.com/renovatebot/renovate).
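The gallery and parler-tts notes quoted above describe LocalAI models as plain YAML config files. Purely as an illustration of that point, a config for the new `parler-tts` backend could look roughly like the sketch below; it mirrors the field layout of the rerankers example in the release notes, and the model name and identifier are assumptions, not values taken from this PR or from the gallery:

```yaml
# Hypothetical model config for the parler-tts backend.
# The file layout mirrors the rerankers example quoted above; the model
# identifier below is an assumption, not a value defined in this PR.
name: parler-tts-mini
backend: parler-tts
parameters:
  model: parler-tts/parler_tts_mini_v0.1
```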
This PR adds a new backend, `parler-tts` (https://github.com/huggingface/parler-tts), which seems to have great quality overall. Sadly, it requires a new environment:
https://github.com/descriptinc/audiotools/blob/7776c296c711db90176a63ff808c26e0ee087263/setup.py#L56C44-L56C100
protocolbuffers/protobuf#10051
Part of #1126
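As a hedged usage sketch (not part of the PR itself): once a model is configured for this backend, speech generation would go through LocalAI's usual `/tts` endpoint. The model name below refers to the illustrative config sketch earlier in this page and is an assumption:

```bash
# Assumed usage: call LocalAI's /tts endpoint with a model configured
# for the parler-tts backend; "parler-tts-mini" is an illustrative name,
# not one defined by this PR.
curl http://localhost:8080/tts \
  -H "Content-Type: application/json" \
  -d '{
        "model": "parler-tts-mini",
        "input": "Hello from the new parler-tts backend!"
      }' \
  --output out.wav
```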