OpenAI: explicit value for MaxToken and Temp
When k8sgpt talks to vLLM, the default MaxTokens is 16, which is far
too small. Since most models support at least 2048 tokens (e.g. Llama 1),
set an explicit, safe value here.

Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
panpan0000 committed Sep 15, 2023
1 parent abfb584 commit 8e5325e
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions pkg/ai/openai.go
@@ -66,6 +66,8 @@ func (c *OpenAIClient) GetCompletion(ctx context.Context, prompt string, promptT
Content: fmt.Sprintf(promptTmpl, c.language, prompt),
},
},
+			MaxTokens:   2048,
+			Temperature: 0.7,
})
if err != nil {
return "", err
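The reason an explicit value is needed is Go's zero-value semantics: a struct field left unset is 0, which the client serializes as "not provided", so the server (vLLM here) falls back to its own default of 16 tokens. A minimal, self-contained sketch of that behavior, using a hypothetical `ChatRequest` struct rather than the actual go-openai `ChatCompletionRequest` type:

```go
package main

import "fmt"

// ChatRequest mirrors the shape of a chat completion request
// (hypothetical field layout, not the actual go-openai type).
type ChatRequest struct {
	MaxTokens   int     // zero value means "unset": the server applies its own default (16 in vLLM)
	Temperature float32 // zero value is likewise treated as "unset" by some backends
}

// withSafeDefaults returns a request carrying the explicit values from the patch.
func withSafeDefaults() ChatRequest {
	return ChatRequest{MaxTokens: 2048, Temperature: 0.7}
}

func main() {
	unset := ChatRequest{} // before the patch: fields left at their zero values
	safe := withSafeDefaults()
	fmt.Println(unset.MaxTokens, safe.MaxTokens) // 0 2048
}
```

Setting `MaxTokens: 2048` pins the request to a value most models can honor, instead of inheriting whatever default the serving backend happens to use.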
