
Extract OpenAI module & Add max_tokens setting #10

Merged 1 commit into master on Mar 17, 2023

Conversation

@unixzii (Member) commented Mar 16, 2023

Since we are going to support the DALL·E feature soon, the OpenAI client will be used across multiple modules. In this PR, we extract the client from the Chat module into a separate module.

Additionally, due to capacity issues with the OpenAI APIs, this PR adds a max_tokens configuration item to limit the maximum number of tokens in a response, which can alleviate the problem.
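As a rough sketch of the idea (not this PR's actual code), an optional max_tokens setting can be applied when building an OpenAI chat completion request. The `OpenAIConfig` type, field names, and default model below are illustrative assumptions:

```python
# Illustrative sketch: applying a configurable max_tokens cap when
# building an OpenAI chat completion request body. OpenAIConfig and
# its defaults are assumptions, not the PR's actual implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OpenAIConfig:
    api_key: str
    max_tokens: Optional[int] = None  # None means "let the API decide"

def build_chat_request(config: OpenAIConfig, prompt: str) -> dict:
    """Build the JSON body for POST /v1/chat/completions."""
    body = {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }
    # Only cap the response length when the user configured a limit.
    if config.max_tokens is not None:
        body["max_tokens"] = config.max_tokens
    return body

config = OpenAIConfig(api_key="sk-...", max_tokens=256)
print(build_chat_request(config, "Hello")["max_tokens"])  # → 256
```

Keeping the cap optional preserves the previous behavior for users who never set the new configuration item.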

@unixzii unixzii requested a review from ktiays March 16, 2023 16:06
@ktiays ktiays merged commit d0386f3 into master Mar 17, 2023
@unixzii unixzii deleted the feat/config branch March 17, 2023 09:38