-
Hi guys, I'm trying to find a way to send parameters such as temperature and max_new_tokens when sending a query to a langcorn server. Something like this: {"question": "tell me a story about dogs.", "temperature": 0.7, "max_new_tokens": 256}. The server doesn't complain when you include extra fields, so it looks like they could be added. Is this possible, and if so, how do you do it? Thanks!
-
Hi @mstensmo! Thanks for your suggestion! It's not supported in the current version, but I can add it in the new release, similarly to the existing x-llm-api-key HTTP header, e.g.:
x-llm-temperature: 0.7
x-max-tokens: 256
x-model-name: '...'
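Once the release lands, a client call could look roughly like the sketch below. This is a minimal sketch, not the official API: the header names are taken from the reply above, while the endpoint route, port, and the helper function `llm_headers` are hypothetical assumptions for illustration.

```python
def llm_headers(temperature, max_tokens, model_name=None):
    """Build per-request LLM override headers for a langcorn server.

    Header names follow the reply above (x-llm-temperature, x-max-tokens,
    x-model-name); whether the server honors them depends on the release.
    """
    headers = {
        "x-llm-temperature": str(temperature),  # per-request sampling temperature
        "x-max-tokens": str(max_tokens),        # cap on generated tokens
    }
    if model_name is not None:
        headers["x-model-name"] = model_name    # optional model override
    return headers

# Example call with the third-party `requests` library (uncomment to use;
# the route below is hypothetical -- adjust it to your deployed chain):
# import requests
# response = requests.post(
#     "http://localhost:8718/my_chain/run",
#     json={"question": "tell me a story about dogs."},
#     headers=llm_headers(0.7, 256),
# )
# print(response.json())
```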
-
I just released this change to PyPI.