docs: update readme
lukasrump committed Nov 21, 2024
1 parent 00e4612 commit 7d93f59
Showing 1 changed file with 1 addition and 53 deletions.
54 changes: 1 addition & 53 deletions README.md
@@ -25,8 +25,6 @@ To get started with crllm, follow these simple installation steps:
- **ollama**: https://ollama.com/download
Required if you want to run the models locally; otherwise, you will need the corresponding API keys for your provider.



### Install from GitHub
```sh
pipx install git+https://github.com/lukasrump/crllm.git
```
@@ -44,57 +42,7 @@ CRLLM supports multiple backends for LLM code reviews. You can configure it by a
```sh
crllm -i .
```

This command guides you through the most important settings.
This TOML configuration file is split into four main sections:

### [project]
- **`description`**: Short project summary.

### [crllm]
- **`loader`**: Mechanism to load the source code, `"git"` by default.
- **`provider`**: LLM provider, `"ollama"` by default.
- **`git_main_branch`**: Specifies the main git branch, default is `"main"`.
- **`git_changed_lines`**: If `true`, only reviews changed lines.

#### Loaders
- **file**: Code review for a single source code file
- **git**: Reviews all changed files in the git repository
- **git_compare**: Reviews the difference between the current git branch and the `git_main_branch`
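Putting the `[project]` and `[crllm]` options above together, a minimal `crllm.toml` might look like the following sketch (the `description` value and the comments are illustrative; the keys and defaults are those listed above):

```toml
[project]
# Short summary that gives the review model some context
description = "CLI tool for LLM-assisted code reviews"

[crllm]
loader = "git"              # "file", "git", or "git_compare"
provider = "ollama"         # default provider
git_main_branch = "main"    # used by the git_compare loader
git_changed_lines = true    # review only the changed lines
```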

### [model]
The model settings depend on the provider and are the same as those of the [LangChain](https://python.langchain.com/docs/integrations/chat/) ChatModels. By default, crllm tries to use a locally installed Ollama instance with llama3.1.

#### Ollama Local Setup
- **`model`**: Specifies the model to use, e.g. `"llama3.1"`. Make sure you have pulled the model before using it.
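For example, a local Ollama setup could be configured with a `[model]` section like this sketch (pull the model first with `ollama pull llama3.1`):

```toml
[model]
model = "llama3.1"
```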

#### OpenAI API
- **`model`**: Specifies the model to use, e.g. `"gpt-4o"`.

In addition, you have to define the API key in your environment (`.env`):
```
OPENAI_API_KEY=your_openai_api_key
```
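A matching configuration could then look like the following sketch (the `provider = "openai"` value is an assumption, as the accepted provider names are not listed above):

```toml
[crllm]
provider = "openai"   # assumed provider name

[model]
model = "gpt-4o"
```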
#### Hugging Face API
- **`repo_id`**: Specifies the repository to use, e.g. `"HuggingFaceH4/zephyr-7b-beta"`.
- **`task`**: Specifies the task, e.g. `"text-generation"`.

In addition, you have to define the API token in your environment (`.env`):
```
HUGGINGFACEHUB_API_TOKEN=your_huggingface_api_key
```
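A matching `[model]` section for the Hugging Face backend could look like this sketch, using the example values from above:

```toml
[model]
repo_id = "HuggingFaceH4/zephyr-7b-beta"
task = "text-generation"
```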

#### Azure OpenAI
- **`azure_deployment`**: Specifies the deployment to use, e.g. `"gpt-35-turbo"`.
- **`api_version`**: Specifies the API version to use, e.g. `"2023-06-01-preview"`.

In addition, you have to define some variables in your environment (`.env`):

```
AZURE_OPENAI_API_KEY=your_azure_api_key
AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com
```
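A matching `[model]` section for Azure OpenAI could look like this sketch, using the example values from above:

```toml
[model]
azure_deployment = "gpt-35-turbo"
api_version = "2023-06-01-preview"
```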

### [prompt]
- **`template`**: Override the prompt template that is used (optional).
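For example, the prompt template could be overridden like this (a sketch only; the template text is purely illustrative, and how crllm interpolates the code under review into the template is not documented here):

```toml
[prompt]
template = """
You are an experienced software engineer.
Review the following code and point out bugs,
style issues, and possible improvements.
"""
```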
This command guides you through the most important settings. You can find more information about the configuration options in the [Wiki](https://github.com/lukasrump/crllm/wiki#-configuration).

## ✨Usage
CRLLM is designed to be easy to use right from your terminal. Below are some examples of how you can leverage the tool.
