
feat: openAI explicit value for maxToken and temperature #659

Merged: 2 commits, Sep 18, 2023

Commits on Sep 18, 2023

  1. feat: openAI explicit value for maxToken and temp

    When k8sgpt talks to vLLM, the default MaxToken is 16, which is
    too small.
    Since most models (like Llama 1, etc.) support 2048 tokens, set an
    explicit safe value here.
    
    Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
    panpan0000 committed Sep 18, 2023
    Commit SHA: 30f066d
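The commit above replaces an implicit backend default with explicit request values. A minimal Go sketch of the idea follows; the struct mirrors the common go-openai field names (`MaxTokens`, `Temperature`), but the type, helper, and defaults here are illustrative assumptions, not the exact k8sgpt code.

```go
package main

import "fmt"

// ChatCompletionRequest is a hypothetical mirror of the request fields
// the commit sets explicitly. When MaxTokens is omitted, vLLM falls
// back to a default of 16 completion tokens.
type ChatCompletionRequest struct {
	Model       string
	MaxTokens   int
	Temperature float32
}

// newRequest builds a request with explicit safe values instead of
// relying on backend defaults. 2048 is chosen because most models
// (e.g. Llama 1) support at least a 2048-token context.
func newRequest(model string) ChatCompletionRequest {
	return ChatCompletionRequest{
		Model:       model,
		MaxTokens:   2048,
		Temperature: 0.7,
	}
}

func main() {
	req := newRequest("gpt-3.5-turbo")
	fmt.Printf("maxTokens=%d temperature=%.1f\n", req.MaxTokens, req.Temperature)
}
```

With explicit values in the request body, vLLM no longer truncates completions at its 16-token default.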
  2. feat: make temperature a flag

    Signed-off-by: Peter Pan <Peter.Pan@daocloud.io>
    panpan0000 committed Sep 18, 2023
    Commit SHA: dcec8f4
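The second commit exposes temperature as a CLI flag rather than a hard-coded constant. A hedged sketch using the standard `flag` package is below; the flag name `--temperature` and the 0.7 default are assumptions for illustration, not necessarily the exact k8sgpt flag.

```go
package main

import (
	"flag"
	"fmt"
)

// parseTemperature reads a --temperature flag from the given args,
// falling back to 0.7 when the flag is absent. Using a FlagSet keeps
// the parsing testable and independent of os.Args.
func parseTemperature(args []string) float64 {
	fs := flag.NewFlagSet("k8sgpt", flag.ContinueOnError)
	t := fs.Float64("temperature", 0.7, "sampling temperature passed to the AI backend")
	if err := fs.Parse(args); err != nil {
		return 0.7 // keep the default on a parse error
	}
	return *t
}

func main() {
	fmt.Printf("temperature=%.1f\n", parseTemperature([]string{"--temperature", "0.2"}))
}
```

A user could then run something like `k8sgpt analyze --temperature 0.2` to get more deterministic output, instead of recompiling to change the value.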