diff --git a/README.md b/README.md
index a82cd2dc5d14..a8e02ac6a38e 100644
--- a/README.md
+++ b/README.md
@@ -21,10 +21,11 @@ In a nutshell:
 - NO GPU required. NO Internet access is required either
 - Optional, GPU Acceleration is available in `llama.cpp`-compatible LLMs. See also the [build section](https://localai.io/basics/build/index.html).
 - Supports multiple models:
-  - πŸ“– Text generation with GPTs (`llama.cpp`, `gpt4all.cpp`, ... and more)
-  - πŸ—£ Text to Audio πŸŽΊπŸ†•
-  - πŸ”ˆ Audio to Text (Audio transcription with `whisper.cpp`)
-  - 🎨 Image generation with stable diffusion
+  - πŸ“– [Text generation with GPTs](https://localai.io/features/text-generation/) (`llama.cpp`, `gpt4all.cpp`, ... and more)
+  - πŸ—£ [Text to Audio](https://localai.io/features/text-to-audio/)
+  - πŸ”ˆ [Audio to Text](https://localai.io/features/audio-to-text/) (Audio transcription with `whisper.cpp`)
+  - 🎨 [Image generation with stable diffusion](https://localai.io/features/image-generation)
+  - πŸ”₯ [OpenAI functions](https://localai.io/features/openai-functions/) πŸ†•
 - πŸƒ Once loaded the first time, it keep models loaded in memory for faster inference
 - ⚑ Doesn't shell-out, but uses C++ bindings for a faster inference and better performance.
 
@@ -51,14 +52,13 @@ See the [Getting started](https://localai.io/basics/getting_started/index.html)
 - [X] Enable automatic downloading of models from HuggingFace
 - [ ] Upstream our golang bindings to llama.cpp (https://github.com/ggerganov/llama.cpp/issues/351)
 - [ ] Enable gallery management directly from the webui.
-- [ ] πŸ”₯ OpenAI functions: https://github.com/go-skynet/LocalAI/issues/588
+- [x] πŸ”₯ OpenAI functions: https://github.com/go-skynet/LocalAI/issues/588
 
 ## News
 
-- πŸ”₯πŸ”₯πŸ”₯ 28-06-2023: **v1.20.0**: Added text to audio and gallery huggingface repositories! [Release notes](https://localai.io/basics/news/index.html#-28-06-2023-__v1200__-) [Changelog](https://github.com/go-skynet/LocalAI/releases/tag/v1.20.0)
-- πŸ”₯πŸ”₯πŸ”₯ 19-06-2023: **v1.19.0**: CUDA support! [Release notes](https://localai.io/basics/news/index.html#-19-06-2023-__v1190__-) [Changelog](https://github.com/go-skynet/LocalAI/releases/tag/v1.19.0)
-- πŸ”₯πŸ”₯πŸ”₯ 06-06-2023: **v1.18.0**: Many updates, new features, and much more πŸš€, check out the [Release notes](https://localai.io/basics/news/index.html#-06-06-2023-__v1180__-)!
-- 29-05-2023: LocalAI now has a website, [https://localai.io](https://localai.io)! check the news in the [dedicated section](https://localai.io/basics/news/index.html)!
+Check the news and the release notes in the [dedicated section](https://localai.io/basics/news/index.html)
+
+- πŸ”₯πŸ”₯πŸ”₯ 23-07-2023: **v1.22.0**: LLaMa2, huggingface embeddings, and more ! [Changelog](https://github.com/go-skynet/LocalAI/releases/tag/v1.22.0)
 
 For latest news, follow also on Twitter [@LocalAI_API](https://twitter.com/LocalAI_API) and [@mudler_it](https://twitter.com/mudler_it)
 