Merge pull request #269 from mudler/local_models
feat: running local models
Aris Boutselis committed Apr 25, 2023
2 parents 252c734 + 9f092f3 commit c365c53
Showing 3 changed files with 41 additions and 1 deletion.
29 changes: 29 additions & 0 deletions README.md
@@ -313,6 +313,35 @@ _Analysis with serve mode_
```
curl -X GET "http://localhost:8080/analyze?namespace=k8sgpt&explain=false"
```

## Running local models

To run local models, you can use any OpenAI-compatible API, for instance [LocalAI](https://github.com/go-skynet/LocalAI), which uses [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ggml](https://github.com/ggerganov/ggml) to run inference on consumer-grade hardware. Models supported by LocalAI include Vicuna, Alpaca, LLaMA, Cerebras, GPT4ALL, GPT4ALL-J and koala.

<details>

To run local inference, you first need to download a model; `ggml`-compatible models (for example vicuna, alpaca and koala) are available on [huggingface.co](https://huggingface.co/models?search=ggml).

### Start the API server

To start the API server, follow the instructions in [LocalAI](https://github.com/go-skynet/LocalAI#example-use-gpt4all-j-model).

### Run k8sgpt

To point k8sgpt at the local server, run `k8sgpt auth` with the `localai` backend:

```
k8sgpt auth --backend localai --model <model_name> --baseurl http://localhost:8080/v1
```

When prompted for an API key, just press Enter.

Now you can analyze with the `localai` backend:

```
k8sgpt analyze --explain --backend localai
```

</details>

## Configuration
4 changes: 3 additions & 1 deletion pkg/ai/iai.go
@@ -36,6 +36,8 @@ func NewClient(provider string) IAI {
	switch provider {
	case "openai":
		return &OpenAIClient{}
	case "localai":
		return &LocalAIClient{}
	case "noopai":
		return &NoOpAIClient{}
	default:
@@ -51,7 +53,7 @@ type AIProvider struct {
	Name     string `mapstructure:"name"`
	Model    string `mapstructure:"model"`
	Password string `mapstructure:"password"`
	BaseURL  string `mapstructure:"base_url"`
	BaseURL  string `mapstructure:"baseurl"`
}

func (p *AIProvider) GetBaseURL() string {
9 changes: 9 additions & 0 deletions pkg/ai/localai.go
@@ -0,0 +1,9 @@
package ai

type LocalAIClient struct {
	OpenAIClient
}

func (a *LocalAIClient) GetName() string {
	return "localai"
}
