feat: Add azure openai in README file
Signed-off-by: Aris Boutselis <aris.boutselis@senseon.io>
Aris Boutselis committed Apr 25, 2023
1 parent 876bbdb commit 4430854
Showing 1 changed file, README.md, with 24 additions and 3 deletions.
@@ -313,8 +313,31 @@ _Analysis with serve mode_
```
curl -X GET "http://localhost:8080/analyze?namespace=k8sgpt&explain=false"
```
</details>

## Additional AI providers

### Azure OpenAI
<em>Prerequisites:</em> an Azure OpenAI deployment is required. Please follow Microsoft's official [documentation](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/how-to/create-resource?pivots=web-portal#create-a-resource) to create your own.
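If you prefer the CLI over the portal, the resource and model deployment can also be created with the Azure CLI. The snippet below is only a sketch: the resource name, resource group, location, model and deployment values are placeholders, and the exact flags may differ between `az` versions.
```
# Create an Azure OpenAI resource (names and location are placeholders)
az cognitiveservices account create \
  --name my-openai-resource \
  --resource-group my-rg \
  --kind OpenAI \
  --sku S0 \
  --location eastus

# Deploy a model into the resource (flags may vary with your Azure CLI version)
az cognitiveservices account deployment create \
  --name my-openai-resource \
  --resource-group my-rg \
  --deployment-name my-gpt35-deployment \
  --model-name gpt-35-turbo \
  --model-version "0301" \
  --model-format OpenAI
```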

To authenticate with k8sgpt, you will need the Azure OpenAI endpoint of your tenant (`https://<your Azure OpenAI endpoint>`), the API key for your deployment, the deployment name of your model, and the model name itself.
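The endpoint and keys can be copied from the Azure portal, or looked up with the Azure CLI as sketched below (resource and group names are placeholders):
```
# Endpoint of the Azure OpenAI resource
az cognitiveservices account show \
  --name my-openai-resource \
  --resource-group my-rg \
  --query "properties.endpoint" --output tsv

# API keys for the resource
az cognitiveservices account keys list \
  --name my-openai-resource \
  --resource-group my-rg
```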
<details>

### Run k8sgpt
To run k8sgpt, run `k8sgpt auth` with the `azureopenai` backend:
```
k8sgpt auth --backend azureopenai --baseurl https://<your Azure OpenAI endpoint> --engine <deployment_name> --model <model_name>
```
Lastly, enter your Azure OpenAI API key when prompted.
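For illustration, a hypothetical invocation with made-up values (your endpoint, deployment name and model will differ) could look like this:
```
k8sgpt auth --backend azureopenai \
  --baseurl https://my-openai-resource.openai.azure.com/ \
  --engine my-gpt35-deployment \
  --model gpt-35-turbo
```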

Now you are ready to analyze with the Azure OpenAI backend:
```
k8sgpt analyze --explain --backend azureopenai
```
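The usual `analyze` flags should work with this backend as well; for example, to scope the analysis to a namespace and resource kind (assuming the standard `--namespace` and `--filter` flags of `k8sgpt analyze`):
```
k8sgpt analyze --explain --backend azureopenai --namespace k8sgpt --filter Pod
```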

</details>

### Running local models

To run local models, it is possible to use OpenAI-compatible APIs, for instance [LocalAI](https://github.com/go-skynet/LocalAI), which uses [llama.cpp](https://github.com/ggerganov/llama.cpp) and [ggml](https://github.com/ggerganov/ggml) to run inference on consumer-grade hardware. Models supported by LocalAI include Vicuna, Alpaca, LLaMA, Cerebras, GPT4ALL, GPT4ALL-J and Koala.
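As a rough sketch of what that can look like (the image tag and flags here are assumptions, so check the LocalAI documentation for current instructions), LocalAI can be started in Docker and then serves the OpenAI-style API that k8sgpt talks to:
```
# Start LocalAI with a local models directory (image and flags are assumptions)
docker run -p 8080:8080 -v $PWD/models:/models quay.io/go-skynet/local-ai:latest --models-path /models

# Verify the OpenAI-compatible endpoint is reachable
curl http://localhost:8080/v1/models
```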

@@ -334,8 +357,6 @@ To run k8sgpt, run `k8sgpt auth` with the `localai` backend:
```
k8sgpt auth --backend localai --model <model_name> --baseurl http://localhost:8080/v1
```

When prompted for an API key, just press enter.

Now you can analyze with the `localai` backend:

```
k8sgpt analyze --explain --backend localai
```
