feat(ux): Add chat, tts, and image-gen pages to the WebUI #2222
Conversation
✅ Deploy Preview for localai canceled.
Small things to enhance here and there, but I think for a first pass we should be good! 🥳
truecharts-admin referenced this pull request in truecharts/public on May 5, 2024:
…4.0 by renovate (#21605)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda11-ffmpeg-core` -> `v2.14.0-cublas-cuda11-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda11-core` -> `v2.14.0-cublas-cuda11-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda12-ffmpeg-core` -> `v2.14.0-cublas-cuda12-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0-cublas-cuda12-core` -> `v2.14.0-cublas-cuda12-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0-ffmpeg-core` -> `v2.14.0-ffmpeg-core` |
| [docker.io/localai/localai](https://github.com/mudler/LocalAI) | minor | `v2.13.0` -> `v2.14.0` |

> [!WARNING]
> Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

### Release Notes

<details>
<summary>mudler/LocalAI (docker.io/localai/localai)</summary>

### [`v2.14.0`](https://github.com/mudler/LocalAI/releases/tag/v2.14.0)

[Compare Source](https://github.com/mudler/LocalAI/compare/v2.13.0...v2.14.0)

##### 🚀 AIO Image Update: llama3 has landed!

We're excited to announce that our AIO image has been upgraded with the latest LLM model, llama3, enhancing our capabilities with more accurate and dynamic responses. Behind the scenes it uses https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF, which is ready for function calling, yay!

##### 💬 WebUI enhancements: Updates in Chat, Image Generation, and TTS

| Chat | TTS | Image gen |
|------------|----------------|-------------|
| ![chatui](https://github.com/mudler/LocalAI/assets/2420543/ff71ad02-841d-48a9-99a7-30f024ae3331) | ![ttsui](https://github.com/mudler/LocalAI/assets/2420543/0c137ba5-cb35-426d-ae5d-390679432cf0) | ![image](https://github.com/mudler/LocalAI/assets/2420543/88f8ef30-e06a-454f-b01a-08fcd6917188) |

Our interfaces for Chat, Text-to-Speech (TTS), and Image Generation have finally landed. Enjoy streamlined and simple interactions thanks to the efforts of our team, led by [@mudler](https://github.com/mudler), who have worked tirelessly to enhance your experience. The WebUI serves as a quick way to debug and assess models loaded in LocalAI - there is much to improve, but we now have a small, hackable interface!

##### 🖼️ Many new models in the model gallery!

| ![local-ai-gallery](https://github.com/mudler/LocalAI/assets/2420543/06a06d3c-b91a-472b-892a-a1b69ddc8c56) |
|------------|

The model gallery has received a substantial upgrade with numerous new models, including Einstein v6.1, SOVL, and several specialized Llama3 iterations. These additions are designed to cater to a broader range of tasks, making LocalAI more versatile than ever. Kudos to [@mudler](https://github.com/mudler) for spearheading these exciting updates - now you can select the model you like with a couple of clicks!

##### 🛠️ Robust Fixes and Optimizations

This update brings a series of crucial bug fixes and security enhancements to ensure our platform remains secure and efficient. Special thanks to [@dave-gray101](https://github.com/dave-gray101), [@cryptk](https://github.com/cryptk), and [@fakezeta](https://github.com/fakezeta) for their diligent work in rooting out and resolving these issues 🤗

##### ✨ OpenVINO and more

We're introducing OpenVINO acceleration, and many OpenVINO models in the gallery. You can now enjoy fast-as-hell speed on Intel CPUs and GPUs. Applause to [@fakezeta](https://github.com/fakezeta) for the contributions!

##### 📚 Documentation and Dependency Upgrades

We've updated our documentation and dependencies to keep you equipped with the latest tools and knowledge. These updates ensure that LocalAI remains a robust and dependable platform.

##### 👥 A Community Effort

A special shout-out to our new contributors, [@QuinnPiers](https://github.com/QuinnPiers) and [@LeonSijiaLu](https://github.com/LeonSijiaLu), who have enriched our community with their first contributions. Welcome aboard, and thank you for your dedication and fresh insights! Each update in this release not only enhances our platform's capabilities but also ensures a safer and more user-friendly experience. We are excited to see how our users leverage these new features in their projects - feel free to drop us a line on Twitter or any other social platform; we'd be happy to hear how you use LocalAI!

##### 📣 Spread the word!

First off, a massive thank you (again!) to each and every one of you who've chipped in to squash bugs and suggest cool new features for LocalAI. Your help, kind words, and brilliant ideas are truly appreciated - more than words can say!

And to those of you who've been heroes, giving up your own time to help out fellow users on Discord and in our repo, you're absolutely amazing. We couldn't have asked for a better community.

Just so you know, LocalAI doesn't have the luxury of big corporate sponsors behind it. It's all us, folks. So, if you've found value in what we're building together and want to keep the momentum going, consider showing your support. A little shoutout on your favorite social platforms using @LocalAI_OSS and @mudler_it, or joining our sponsors, can make a big difference.

Also, if you haven't yet joined our Discord, come on over! Here's the link: https://discord.gg/uJAeKSAGDy

Every bit of support, every mention, and every star adds up and helps us keep this ship sailing. Let's keep making LocalAI awesome together!

Thanks a ton, and.. exciting times ahead with LocalAI!
##### What's Changed

##### Bug fixes 🐛

- fix: `config_file_watcher.go` - root all file reads for safety by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2144](https://github.com/mudler/LocalAI/pull/2144)
- fix: github bump_docs.sh regex to drop emoji and other text by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2180](https://github.com/mudler/LocalAI/pull/2180)
- fix: undefined symbol: iJIT_NotifyEvent in import torch [#2153](https://github.com/mudler/LocalAI/issues/2153) by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2179](https://github.com/mudler/LocalAI/pull/2179)
- fix: security scanner warning noise: error handlers part 2 by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2145](https://github.com/mudler/LocalAI/pull/2145)
- fix: ensure GNUMake jobserver is passed through to whisper.cpp build by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2187](https://github.com/mudler/LocalAI/pull/2187)
- fix: bring everything onto the same GRPC version to fix tests by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2199](https://github.com/mudler/LocalAI/pull/2199)

##### Exciting New Features 🎉

- feat(gallery): display job status also during navigation by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2151](https://github.com/mudler/LocalAI/pull/2151)
- feat: cleanup Dockerfile and make final image a little smaller by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2146](https://github.com/mudler/LocalAI/pull/2146)
- fix: swap to WHISPER_CUDA per deprecation message from whisper.cpp by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2170](https://github.com/mudler/LocalAI/pull/2170)
- feat: only keep the build artifacts from the grpc build by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2172](https://github.com/mudler/LocalAI/pull/2172)
- feat(gallery): support model deletion by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2173](https://github.com/mudler/LocalAI/pull/2173)
- refactor(application): introduce application global state by [@dave-gray101](https://github.com/dave-gray101) in [https://github.com/mudler/LocalAI/pull/2072](https://github.com/mudler/LocalAI/pull/2072)
- feat: organize Dockerfile into distinct sections by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2181](https://github.com/mudler/LocalAI/pull/2181)
- feat: OpenVINO acceleration for embeddings in transformer backend by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2190](https://github.com/mudler/LocalAI/pull/2190)
- chore: update go-stablediffusion to latest commit with Make jobserver fix by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2197](https://github.com/mudler/LocalAI/pull/2197)
- feat: user defined inference device for CUDA and OpenVINO by [@fakezeta](https://github.com/fakezeta) in [https://github.com/mudler/LocalAI/pull/2212](https://github.com/mudler/LocalAI/pull/2212)
- feat(ux): Add chat, tts, and image-gen pages to the WebUI by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2222](https://github.com/mudler/LocalAI/pull/2222)
- feat(aio): switch to llama3-based for LLM by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2225](https://github.com/mudler/LocalAI/pull/2225)
- feat(ui): support multiline and style `ul` by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2226](https://github.com/mudler/LocalAI/pull/2226)

##### 🧠 Models

- models(gallery): add Einstein v6.1 by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2152](https://github.com/mudler/LocalAI/pull/2152)
- models(gallery): add SOVL by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2154](https://github.com/mudler/LocalAI/pull/2154)
- models(gallery): add average_normie by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2155](https://github.com/mudler/LocalAI/pull/2155)
- models(gallery): add solana by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2157](https://github.com/mudler/LocalAI/pull/2157)
- models(gallery): add poppy porpoise by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2158](https://github.com/mudler/LocalAI/pull/2158)
- models(gallery): add Undi95/Llama-3-LewdPlay-8B-evo-GGUF by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2160](https://github.com/mudler/LocalAI/pull/2160)
- models(gallery): add biomistral-7b by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2161](https://github.com/mudler/LocalAI/pull/2161)
- models(gallery): add llama3-32k by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2183](https://github.com/mudler/LocalAI/pull/2183)
- models(gallery): add openvino models by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2184](https://github.com/mudler/LocalAI/pull/2184)
- models(gallery): add lexifun by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2193](https://github.com/mudler/LocalAI/pull/2193)
- models(gallery): add suzume-llama-3-8B-multilingual-gguf by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2194](https://github.com/mudler/LocalAI/pull/2194)
- models(gallery): add guillaumetell by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2195](https://github.com/mudler/LocalAI/pull/2195)
- models(gallery): add wizardlm2 by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2209](https://github.com/mudler/LocalAI/pull/2209)
- models(gallery): Add Hermes-2-Pro-Llama-3-8B-GGUF by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2218](https://github.com/mudler/LocalAI/pull/2218)

##### 📖 Documentation and examples

- ⬆️ Update docs version mudler/LocalAI by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2149](https://github.com/mudler/LocalAI/pull/2149)
- draft: Update model-gallery.md with correct gallery file by [@QuinnPiers](https://github.com/QuinnPiers) in [https://github.com/mudler/LocalAI/pull/2163](https://github.com/mudler/LocalAI/pull/2163)
- docs: update gallery, add rerankers by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2166](https://github.com/mudler/LocalAI/pull/2166)
- docs: enhance and condense few sections by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2178](https://github.com/mudler/LocalAI/pull/2178)
- [Documentations] Removed invalid numberings from `troubleshooting mac` by [@LeonSijiaLu](https://github.com/LeonSijiaLu) in [https://github.com/mudler/LocalAI/pull/2174](https://github.com/mudler/LocalAI/pull/2174)

##### 👒 Dependencies

- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2150](https://github.com/mudler/LocalAI/pull/2150)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2159](https://github.com/mudler/LocalAI/pull/2159)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2176](https://github.com/mudler/LocalAI/pull/2176)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2177](https://github.com/mudler/LocalAI/pull/2177)
- update go-tinydream to latest commit by [@cryptk](https://github.com/cryptk) in [https://github.com/mudler/LocalAI/pull/2182](https://github.com/mudler/LocalAI/pull/2182)
- build(deps): bump dependabot/fetch-metadata from 2.0.0 to 2.1.0 by [@dependabot](https://github.com/dependabot) in [https://github.com/mudler/LocalAI/pull/2186](https://github.com/mudler/LocalAI/pull/2186)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2189](https://github.com/mudler/LocalAI/pull/2189)
- ⬆️ Update ggerganov/whisper.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2188](https://github.com/mudler/LocalAI/pull/2188)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2203](https://github.com/mudler/LocalAI/pull/2203)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2213](https://github.com/mudler/LocalAI/pull/2213)

##### Other Changes

- Revert ":arrow_up: Update docs version mudler/LocalAI" by [@mudler](https://github.com/mudler) in [https://github.com/mudler/LocalAI/pull/2165](https://github.com/mudler/LocalAI/pull/2165)
- Issue-1720: Updated `Build on mac` documentations by [@LeonSijiaLu](https://github.com/LeonSijiaLu) in [https://github.com/mudler/LocalAI/pull/2171](https://github.com/mudler/LocalAI/pull/2171)
- ⬆️ Update ggerganov/llama.cpp by [@localai-bot](https://github.com/localai-bot) in [https://github.com/mudler/LocalAI/pull/2224](https://github.com/mudler/LocalAI/pull/2224)
##### New Contributors

- [@QuinnPiers](https://github.com/QuinnPiers) made their first contribution in [https://github.com/mudler/LocalAI/pull/2163](https://github.com/mudler/LocalAI/pull/2163)
- [@LeonSijiaLu](https://github.com/LeonSijiaLu) made their first contribution in [https://github.com/mudler/LocalAI/pull/2171](https://github.com/mudler/LocalAI/pull/2171)

**Full Changelog**: mudler/LocalAI@v2.13.0...v2.14.0

</details>

---

### Configuration

📅 **Schedule**: Branch creation - At any time (no schedule defined), Automerge - At any time (no schedule defined).

🚦 **Automerge**: Enabled.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about these updates again.

---

- [ ] If you want to rebase/retry this PR, check this box

---

This PR has been generated by [Renovate Bot](https://github.com/renovatebot/renovate).
You are my hero
Description
This PR is a result of my free time during the national holidays 😅
This is part of #2156
It adds basic chat, image generation, and TTS front ends to the WebUI; a rough sketch of the endpoints these pages sit on follows the screenshots below.
Chat
Image generation
TTS
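The new pages are thin front ends over endpoints LocalAI already exposes, so everything shown in the screenshots can also be driven directly over HTTP. Below is a minimal Python sketch, assuming a local instance on `localhost:8080`; the model names are placeholders for whatever you have installed from the gallery, and the exact TTS route/payload may differ depending on your setup.

```python
# Minimal sketch of the endpoints the new WebUI pages sit on top of.
# Assumptions: LocalAI running on localhost:8080; model names below are
# placeholders, not models shipped with this PR.
import requests

BASE = "http://localhost:8080"

# Chat page -> OpenAI-compatible chat completions endpoint
chat = requests.post(
    f"{BASE}/v1/chat/completions",
    json={
        "model": "gpt-4",  # placeholder: whatever model your instance serves
        "messages": [{"role": "user", "content": "Hello from the WebUI!"}],
    },
)
print(chat.json()["choices"][0]["message"]["content"])

# TTS page -> text-to-speech endpoint (returns raw audio bytes)
tts = requests.post(
    f"{BASE}/tts",
    json={"model": "tts-1", "input": "Hello from LocalAI"},
)
with open("speech.wav", "wb") as f:
    f.write(tts.content)

# Image generation page -> OpenAI-compatible images endpoint
img = requests.post(
    f"{BASE}/v1/images/generations",
    json={"prompt": "a cute baby sea otter", "size": "256x256"},
)
print(img.json()["data"][0]["url"])
```

This is only meant to show what the pages wrap; the WebUI itself is the quick way to poke at models without writing any client code.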
Notes
Changed `DisableWelcome` to `DisableWebUI`, which disables all the UI routes instead. That makes it possible to turn off the WebUI entirely when it's not needed or not wanted, leaving LocalAI API-only (as it has always been!).
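As a rough illustration of what API-only mode means in practice (not part of this PR's diff): with the UI routes disabled, the REST endpoints keep answering while the HTML pages are no longer served. The paths probed below and the expected behaviour of disabled routes are assumptions made for the sketch.

```python
# Hypothetical sanity check for API-only mode (UI routes disabled).
# Assumptions: API on localhost:8080, WebUI normally served at "/",
# and /v1/models available as an OpenAI-compatible API route.
import requests

BASE = "http://localhost:8080"

api = requests.get(f"{BASE}/v1/models")  # REST API should still respond
ui = requests.get(f"{BASE}/")            # WebUI landing page route

print("API reachable:", api.ok)
print("WebUI served:", "text/html" in ui.headers.get("Content-Type", ""))
```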