From f8fa35cf9d591691679d6881fcc203e3411d99aa Mon Sep 17 00:00:00 2001
From: Harshit Mehta
Date: Wed, 26 Apr 2023 13:59:22 +0530
Subject: [PATCH] docs: fix README (#345)

Signed-off-by: Harshit Mehta
Co-authored-by: Matthis <99146727+matthisholleville@users.noreply.github.com>
---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 88a91185b3..ebdd179960 100644
--- a/README.md
+++ b/README.md
@@ -311,6 +311,8 @@ _Analysis with serve mode_
 
 curl -X GET "http://localhost:8080/analyze?namespace=k8sgpt&explain=false"
 ```
+
+
 ## Running local models
 
 To run local models, it is possible to use OpenAI compatible APIs, for instance [LocalAI](https://github.com/go-skynet/LocalAI) which uses [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ggml](https://github.com/ggerganov/ggml) to run inference on consumer-grade hardware. Models supported by LocalAI for instance are Vicuna, Alpaca, LLaMA, Cerebras, GPT4ALL, GPT4ALL-J and koala.