[BUG]: topP is not set from the config yaml of the openai provider #1052

Closed · 3 of 4 tasks
muscionig opened this issue Apr 4, 2024 · 1 comment · Fixed by #1053

muscionig (Contributor) commented Apr 4, 2024

Checklist

  • I've searched for similar issues and couldn't find anything matching
  • I've included steps to reproduce the behavior

Affected Components

  • K8sGPT (CLI)
  • K8sGPT Operator

K8sGPT Version

v0.3.29

Kubernetes Version

v1.27

Host OS and its Version

macOS

Steps to reproduce

  1. Configure an openai provider with a topP value different from 1.
  2. Log the request: the parameter is hardcoded to 1.0 directly in the code; see the line `topP = 1.0` in the openai client (a sketch of the problem follows this list).
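
For illustration only, here is a minimal Go sketch of the pattern described in step 2, assuming the client builds its chat request with the sashabaranov/go-openai library; the helper name and the `configuredTopP` parameter are hypothetical and not taken from the k8sgpt source:

```go
package ai

import (
	openai "github.com/sashabaranov/go-openai"
)

// hardcodedTopP mirrors the reported behaviour: the value is fixed in code,
// so whatever topP the user puts in the provider config never takes effect.
const hardcodedTopP = 1.0

// buildChatRequest is a hypothetical helper showing where the bug sits.
func buildChatRequest(model, prompt string, configuredTopP float32) openai.ChatCompletionRequest {
	return openai.ChatCompletionRequest{
		Model: model,
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: prompt},
		},
		// Bug: the literal constant is used here; the fix would pass
		// configuredTopP (read from the config yaml) instead.
		TopP: hardcodedTopP,
	}
}
```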

Expected behaviour

Since topP is a valid parameter in the OpenAI chat completions spec (https://platform.openai.com/docs/api-reference/chat/create#chat-create-top_p), a user should be able to set it.

In addition, there are many OpenAI-compatible backends that could take advantage of a user-defined topP.
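
As a hypothetical illustration of the expected behaviour, the provider entry in the k8sgpt config yaml could carry the parameter next to the existing fields; the exact field names below are an assumption, not the confirmed schema:

```yaml
ai:
  providers:
    - name: openai
      model: gpt-3.5-turbo   # assumed default model
      temperature: 0.7
      # Desired: picked up by the client instead of the hardcoded 1.0
      topp: 0.5
```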

Actual behaviour

topP is hardcoded to 1.0 in the client and is not read from the config file.

Additional Information

No response

muscionig (Author) commented Apr 4, 2024

I have opened a PR that should fix this issue, assuming it is considered a bug.

Please let me know, and feel free to discard the PR if this fix is not needed.
