[Hotfix] Fix link issue in tutorial #294

Merged 2 commits on Jun 14, 2024
4 changes: 2 additions & 2 deletions docs/sphinx_doc/en/source/tutorial/104-usecase.md
@@ -10,13 +10,13 @@ Let the adventure begin to unlock the potential of multi-agent applications with

## Getting Started

- Firstly, ensure that you have installed and configured AgentScope properly. Besides, we will involve the basic concepts of `Model API`, `Agent`, `Msg`, and `Pipeline,` as described in [Tutorial-Concept](101-agentscope).
+ Firstly, ensure that you have installed and configured AgentScope properly. Besides, we will involve the basic concepts of `Model API`, `Agent`, `Msg`, and `Pipeline,` as described in [Tutorial-Concept](101-agentscope.md).

**Note**: all the configurations and code for this tutorial can be found in `examples/game_werewolf`.

### Step 1: Prepare Model API and Set Model Configs

- As we discussed in the last tutorial, you need to prepare your model configurations into a JSON file for standard OpenAI chat API, FastChat, and vllm. More details and advanced usages such as configuring local models with POST API are presented in [Tutorial-Model-API](203-model).
+ As we discussed in the last tutorial, you need to prepare your model configurations into a JSON file for standard OpenAI chat API, FastChat, and vllm. More details and advanced usages such as configuring local models with POST API are presented in [Tutorial-Model-API](203-model.md).

```json
[
```
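The configuration the tutorial points to is a JSON list of model configs; the diff above only shows its opening bracket. A minimal sketch of what such a file might look like is given below — the field names (`config_name`, `model_type`, `model_name`, `api_key`, `generate_args`) follow AgentScope's model-config convention, and the concrete values are placeholders rather than the actual contents of `examples/game_werewolf`:

```json
[
  {
    "config_name": "gpt-4",
    "model_type": "openai_chat",
    "model_name": "gpt-4",
    "api_key": "your-openai-api-key",
    "generate_args": {
      "temperature": 0.5
    }
  }
]
```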
4 changes: 2 additions & 2 deletions docs/sphinx_doc/zh_CN/source/tutorial/104-usecase.md
@@ -11,13 +11,13 @@

## Getting Started

- First, make sure you have installed and configured AgentScope correctly. In addition, this section involves the concepts of `Model API`, `Agent`, `Msg`, and `Pipeline` (see [About AgentScope](101-agentscope) for details). Below is an overview of this section of the tutorial.
+ First, make sure you have installed and configured AgentScope correctly. In addition, this section involves the concepts of `Model API`, `Agent`, `Msg`, and `Pipeline` (see [About AgentScope](101-agentscope.md) for details). Below is an overview of this section of the tutorial.

**Note**: All the configuration and code files for this tutorial can be found in `examples/game_werewolf`.

### Step 1: Prepare the Model API and Set Model Configs

- As shown in the previous tutorial, you need to prepare a JSON-format model configuration file for your chosen OpenAI chat API, FastChat, or vllm. For more details and advanced usages, such as configuring local models via the POST API, see [About Models](203-model).
+ As shown in the previous tutorial, you need to prepare a JSON-format model configuration file for your chosen OpenAI chat API, FastChat, or vllm. For more details and advanced usages, such as configuring local models via the POST API, see [About Models](203-model.md).

```json
[
```