update OPENAI_API_BASE to OPEN_LLM_API_BASE #27

Merged 1 commit on Nov 23, 2023
7 changes: 3 additions & 4 deletions src/guide/tutorials/integration_with_open_llm.md
@@ -179,12 +179,11 @@ Such as LLaMA-Factory, FastChat, vllm openai compatible interface

**config/key.yaml**
```diff
-OPENAI_API_BASE: "http://0.0.0.0:8000/v1"
-OPENAI_API_KEY: "sk-xxx"
-OPENAI_API_MODEL: "llama2-13b"
+OPEN_LLM_API_BASE: "http://106.75.10.65:8001/v1"
+OPEN_LLM_API_MODEL: "llama2-13b"
```

The complete route of the openapi interface is `http://0.0.0.0:8000/v1/chat/completions`. `OPENAI_API_BASE` only needs to be configured to `http://0.0.0.0:8000/v1`; the openai sdk fills in the remaining path itself. `OPENAI_API_KEY` must be set to any value starting with `sk-`. `OPENAI_API_MODEL` is the actual value of the request parameter `model`.

The complete route of the openapi interface is `http://0.0.0.0:8000/v1/chat/completions`. `OPEN_LLM_API_BASE` only needs to be configured to `http://0.0.0.0:8000/v1`; the openai sdk fills in the remaining path itself. `OPEN_LLM_API_MODEL` is the actual value of the request parameter `model`.
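The base-plus-route relationship described above can be sketched in a few lines of Python. The `chat_completions_url` helper is purely illustrative (the real path-joining happens inside the openai sdk); it only shows how the configured base and the `/chat/completions` route compose into the full request URL.

```python
# Values as they would appear in config/key.yaml. The helper below is
# illustrative only — in practice the openai sdk appends the endpoint
# path to the configured base itself.
OPEN_LLM_API_BASE = "http://0.0.0.0:8000/v1"
OPEN_LLM_API_MODEL = "llama2-13b"  # actual value of the request parameter "model"

def chat_completions_url(base: str) -> str:
    """Join the configured API base with the chat-completions route."""
    return base.rstrip("/") + "/chat/completions"

print(chat_completions_url(OPEN_LLM_API_BASE))
# -> http://0.0.0.0:8000/v1/chat/completions
```

Note that the base stops at `/v1`: configuring the full `/chat/completions` path would double the route once the sdk appends it.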

#### ollama api interface

7 changes: 3 additions & 4 deletions src/zhcn/guide/tutorials/integration_with_open_llm.md
@@ -179,12 +179,11 @@ curl -X POST http://localhost:11434/api/generate -d '{

**config/key.yaml**
```diff
-OPENAI_API_BASE: "http://0.0.0.0:8000/v1"
-OPENAI_API_KEY: "sk-xxx"
-OPENAI_API_MODEL: "llama2-13b"
+OPEN_LLM_API_BASE: "http://106.75.10.65:8001/v1"
+OPEN_LLM_API_MODEL: "llama2-13b"
```

The complete route of the openapi interface is `http://0.0.0.0:8000/v1/chat/completions`. `OPENAI_API_BASE` only needs to be configured to `http://0.0.0.0:8000/v1`; the openai sdk fills in the remaining path itself. `OPENAI_API_KEY` must be set to any value starting with `sk-`. `OPENAI_API_MODEL` is the actual value of the request parameter `model`.

The complete route of the openapi interface is `http://0.0.0.0:8000/v1/chat/completions`. `OPEN_LLM_API_BASE` only needs to be configured to `http://0.0.0.0:8000/v1`; the openai sdk fills in the remaining path itself. `OPEN_LLM_API_MODEL` is the actual value of the request parameter `model`.


#### ollama api interface