
Merge pull request #65 from tak-bro/feature/add-exclude-files
feat: add exclude option
tak-bro authored Aug 15, 2024
2 parents bca7c35 + d2b3bea commit a5ee634
Showing 7 changed files with 76 additions and 37 deletions.
46 changes: 29 additions & 17 deletions README.md
@@ -135,12 +135,11 @@ aicommit2 --all # or -a
- If you give this option, **_aicommit2_ will not commit**.
- `--generate` or `-g`: Number of messages to generate (default: **1**)
- **Warning**: This uses more tokens, meaning it costs more.
- `--prompt` or `-p`: System prompt for fine-tuning
- **Warning**: This option is **not recommended**. Please use `systemPrompt` or `systemPromptPath` for each model.
- `--exclude` or `-x`: Files to exclude from AI analysis

Example:
```sh
aicommit2 --locale "jp" --all --type "conventional" --generate 3 --clipboard
aicommit2 --locale "jp" --all --type "conventional" --generate 3 --clipboard --exclude "*.json" --exclude "*.ts"
```

### Git hook
@@ -217,19 +216,20 @@ model[]=codestral
The following settings can be applied to most models, but support may vary.
Please check the documentation for each specific model to confirm which settings are supported.

| Setting | Description | Default |
|--------------------|----------------------------------------------------------------------|--------------|
| `systemPrompt` | System Prompt text | - |
| `systemPromptPath` | Path to system prompt file | - |
| `timeout` | Request timeout (milliseconds) | 10000 |
| `temperature` | Model's creativity (0.0 - 2.0) | 0.7 |
| `maxTokens` | Maximum number of tokens to generate | 1024 |
| `locale` | Locale for the generated commit messages | en |
| `generate` | Number of commit messages to generate | 1 |
| `type` | Type of commit message to generate | conventional |
| `maxLength` | Maximum character length of the Subject of generated commit message | 50 |
| `logging` | Enable logging | true |
| `ignoreBody` | Whether the commit message includes body | true |
| Setting | Description | Default |
|--------------------|---------------------------------------------------------------------|--------------|
| `systemPrompt` | System Prompt text | - |
| `systemPromptPath` | Path to system prompt file | - |
| `exclude` | Files to exclude from AI analysis | - |
| `timeout` | Request timeout (milliseconds) | 10000 |
| `temperature` | Model's creativity (0.0 - 2.0) | 0.7 |
| `maxTokens` | Maximum number of tokens to generate | 1024 |
| `locale` | Locale for the generated commit messages | en |
| `generate` | Number of commit messages to generate | 1 |
| `type` | Type of commit message to generate | conventional |
| `maxLength` | Maximum character length of the Subject of generated commit message | 50 |
| `logging` | Enable logging | true |
| `ignoreBody` | Whether the commit message includes body | true |

> 👉 **Tip:** To set the General Settings for each model, use the following command.
> ```shell
@@ -255,6 +255,18 @@ aicommit2 config set systemPrompt="Generate git commit message."
aicommit2 config set systemPromptPath="/path/to/user/prompt.txt"
```

##### exclude

- Files to exclude from AI analysis
- This setting works together with the CLI `--exclude` option: files matched by either the CLI `--exclude` flags or the `exclude` general setting are excluded from AI analysis.

```sh
aicommit2 config set exclude="*.ts"
aicommit2 config set exclude="*.ts,*.json"
```

> NOTE: The `exclude` option cannot be set per model. It is **only** supported in General Settings.

##### timeout

The timeout for network requests in milliseconds.
@@ -432,7 +444,7 @@ aicommit2 config set OLLAMA.model="llama3,codellama" # for multiple models
aicommit2 config add OLLAMA.model="gemma2" # Only Ollama.model can be added.
```

> OLLAMA.model is only **string array** type to support multiple Ollama. Please see [this section](#loading-multiple-ollama-models).
> OLLAMA.model is **string array** type to support multiple Ollama. Please see [this section](#loading-multiple-ollama-models).
##### OLLAMA.host

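As documented above, the CLI `--exclude` flag and the `exclude` general setting are additive. A minimal sketch of that merge behavior (the helper name `mergeExcludePatterns` is hypothetical, for illustration only):

```typescript
// Hypothetical helper: combine CLI --exclude patterns with the
// `exclude` general setting; both sources contribute, duplicates are dropped.
function mergeExcludePatterns(cliExclude: string[], configExclude: string[]): string[] {
    return [...new Set([...cliExclude, ...configExclude])];
}

const patterns = mergeExcludePatterns(['*.json'], ['*.ts', '*.json']);
console.log(patterns); // ['*.json', '*.ts']
```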
20 changes: 10 additions & 10 deletions src/commands/aicommit2.ts
@@ -34,16 +34,6 @@ export default async (
await execa('git', ['add', '--update']); // NOTE: should be equivalent behavior to `git commit --all`
}

const detectingFilesSpinner = consoleManager.displaySpinner('Detecting staged files');
const staged = await getStagedDiff(excludeFiles);
detectingFilesSpinner.stop();
if (!staged) {
throw new KnownError(
'No staged changes found. Stage your changes manually, or automatically stage all changes with the `--all` flag.'
);
}
consoleManager.printStagedFiles(staged);

const config = await getConfig(
{
locale: locale?.toString() as string,
@@ -62,6 +52,16 @@
}
}

const detectingFilesSpinner = consoleManager.displaySpinner('Detecting staged files');
const staged = await getStagedDiff(excludeFiles, config.exclude);
detectingFilesSpinner.stop();
if (!staged) {
throw new KnownError(
'No staged changes found. Stage your changes manually, or automatically stage all changes with the `--all` flag.'
);
}
consoleManager.printStagedFiles(staged);

const availableAIs: ModelName[] = Object.entries(config)
.filter(([key]) => modelNames.includes(key as ModelName))
.map(([key, value]) => [key, value] as [ModelName, RawConfig])
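The staged-file detection block in this file is not deleted but moved: it now runs after `getConfig`, because `getStagedDiff` needs the resolved `config.exclude` value. A simplified sketch of the reordered flow (types and names reduced for illustration):

```typescript
// Simplified flow: load config first, then compute the staged diff,
// so the configured exclude patterns can be applied to the diff.
type Config = { exclude: string[] };
type GetStagedDiff = (cliExclude?: string[], configExclude?: string[]) => Promise<string | null>;

async function detectStaged(
    cliExclude: string[],
    loadConfig: () => Promise<Config>,
    getStagedDiff: GetStagedDiff
): Promise<string> {
    const config = await loadConfig(); // 1. resolve settings first
    const staged = await getStagedDiff(cliExclude, config.exclude); // 2. then diff
    if (!staged) {
        throw new Error('No staged changes found.');
    }
    return staged;
}
```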
2 changes: 1 addition & 1 deletion src/managers/ai-request.manager.ts
@@ -12,7 +12,7 @@ import { HuggingFaceService } from '../services/ai/hugging-face.service.js';
import { MistralService } from '../services/ai/mistral.service.js';
import { OllamaService } from '../services/ai/ollama.service.js';
import { OpenAIService } from '../services/ai/openai.service.js';
import { PerplexityService } from '../services/ai/perplexity.js';
import { PerplexityService } from '../services/ai/perplexity.service.js';
import { ModelName, ValidConfig } from '../utils/config.js';
import { StagedDiff } from '../utils/git.js';

6 changes: 3 additions & 3 deletions src/services/ai/ollama.service.ts
@@ -5,7 +5,7 @@ import { Observable, catchError, concatMap, from, map, of } from 'rxjs';
import { fromPromise } from 'rxjs/internal/observable/innerFrom';

import { AIService, AIServiceError, AIServiceParams, CommitMessage } from './ai.service.js';
import { DEFAULT_OLLMA_HOST } from '../../utils/config.js';
import { DEFAULT_OLLAMA_HOST } from '../../utils/config.js';
import { KnownError } from '../../utils/error.js';
import { createLogResponse } from '../../utils/log.js';
import { DEFAULT_PROMPT_OPTIONS, PromptOptions, generatePrompt } from '../../utils/prompt.js';
@@ -15,7 +15,7 @@ import { HttpRequestBuilder } from '../http/http-request.builder.js';
export interface OllamaServiceError extends AIServiceError {}

export class OllamaService extends AIService {
private host = DEFAULT_OLLMA_HOST;
private host = DEFAULT_OLLAMA_HOST;
private model = '';
private ollama: Ollama;

@@ -31,7 +31,7 @@ export class OllamaService extends AIService {
.hex(this.colors.secondary)
.bold(`[${capitalizeFirstLetter(this.model)}]`);
this.errorPrefix = chalk.red.bold(`[${capitalizeFirstLetter(this.model)}]`);
this.host = this.params.config.host || DEFAULT_OLLMA_HOST;
this.host = this.params.config.host || DEFAULT_OLLAMA_HOST;
this.ollama = new Ollama({ host: this.host });
}

File renamed without changes.
36 changes: 31 additions & 5 deletions src/utils/config.ts
@@ -13,13 +13,24 @@ import type { TiktokenModel } from '@dqbd/tiktoken';
const commitTypes = ['', 'conventional', 'gitmoji'] as const;
export type CommitType = (typeof commitTypes)[number];

export const DEFAULT_OLLMA_HOST = 'http://localhost:11434';
export const DEFAULT_OLLAMA_HOST = 'http://localhost:11434';

const { hasOwnProperty } = Object.prototype;

export const hasOwn = (object: unknown, key: PropertyKey) => hasOwnProperty.call(object, key);

export const modelNames = ['OPENAI', 'OLLAMA', 'HUGGINGFACE', 'GEMINI', 'ANTHROPIC', 'MISTRAL', 'CODESTRAL', 'COHERE', 'GROQ', 'PERPLEXITY'] as const;
export const modelNames = [
'OPENAI',
'OLLAMA',
'HUGGINGFACE',
'GEMINI',
'ANTHROPIC',
'MISTRAL',
'CODESTRAL',
'COHERE',
'GROQ',
'PERPLEXITY',
] as const;
export type ModelName = (typeof modelNames)[number];

const parseAssert = (name: string, condition: any, message: string) => {
@@ -144,6 +155,13 @@ const generalConfigParsers = {
parseAssert('ignoreBody', /^(?:true|false)$/.test(ignore), 'Must be a boolean(true or false)');
return ignore === 'true';
},
exclude: (exclude?: string | string[]): string[] => {
if (!exclude) {
return [];
}
const excludeFiles = typeof exclude === 'string' ? exclude?.split(',') : exclude;
return excludeFiles.map(file => file.trim()).filter(file => !!file && file.length > 0);
},
} as const;

const modelConfigParsers: Record<ModelName, Record<string, (value: any) => any>> = {
@@ -320,7 +338,7 @@ const modelConfigParsers: Record<ModelName, Record<string, (value: any) => any>>
},
host: (host?: string) => {
if (!host) {
return DEFAULT_OLLMA_HOST;
return DEFAULT_OLLAMA_HOST;
}
parseAssert('OLLAMA.host', /^https?:\/\//.test(host), 'Must be a valid URL');
return host;
@@ -475,8 +493,8 @@ const readConfigFile = async (): Promise<RawConfig> => {

const configString = await fs.readFile(configPath, 'utf8');
let config = ini.parse(configString);
const hasOllmaModel = hasOwn(config, 'OLLAMA') && hasOwn(config['OLLAMA'], 'model');
if (hasOllmaModel) {
const hasOllamaModel = hasOwn(config, 'OLLAMA') && hasOwn(config['OLLAMA'], 'model');
if (hasOllamaModel) {
config = {
...config,
OLLAMA: {
Expand All @@ -485,6 +503,14 @@ const readConfigFile = async (): Promise<RawConfig> => {
},
};
}

const hasExclude = hasOwn(config, 'exclude');
if (hasExclude) {
config = {
...config,
exclude: typeof config.exclude === 'string' ? [config.exclude] : config.exclude,
};
}
return config;
};

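The `exclude` parser added in this file accepts either a comma-separated string or a string array. A standalone re-implementation, extracted for illustration:

```typescript
// Mirrors the `exclude` general-setting parser: normalize a comma-separated
// string or a string array into trimmed, non-empty patterns.
function parseExclude(exclude?: string | string[]): string[] {
    if (!exclude) {
        return [];
    }
    const files = typeof exclude === 'string' ? exclude.split(',') : exclude;
    return files.map(file => file.trim()).filter(file => file.length > 0);
}

console.log(parseExclude('*.ts, *.json,,')); // ['*.ts', '*.json']
console.log(parseExclude(['*.md']));         // ['*.md']
```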
3 changes: 2 additions & 1 deletion src/utils/git.ts
@@ -28,13 +28,14 @@ const filesToExclude = [
'*.png',
].map(excludeFromDiff);

export const getStagedDiff = async (excludeFiles?: string[]): Promise<StagedDiff | null> => {
export const getStagedDiff = async (excludeFiles?: string[], exclude?: string[]): Promise<StagedDiff | null> => {
const diffCached = ['diff', '--cached', '--diff-algorithm=minimal'];
const { stdout: files } = await execa('git', [
...diffCached,
'--name-only',
...filesToExclude,
...(excludeFiles ? excludeFiles.map(excludeFromDiff) : []),
...(exclude ? exclude.map(excludeFromDiff) : []),
]);

if (!files) {
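`excludeFromDiff` is defined outside this hunk; judging from its use here, it presumably wraps each pattern in git's `:(exclude)` pathspec magic. A sketch under that assumption:

```typescript
// Assumption: excludeFromDiff maps a pattern to git's exclude pathspec.
const excludeFromDiff = (path: string): string => `:(exclude)${path}`;

// Building the argument list for `git diff --cached --name-only`:
const args = ['diff', '--cached', '--name-only', ...['*.png', '*.lock'].map(excludeFromDiff)];
console.log(args);
// ['diff', '--cached', '--name-only', ':(exclude)*.png', ':(exclude)*.lock']
```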
