Ollama local models context token length always displays 4096 even when LLMs support more #1297
mysterium-coniunctionis started this conversation in Help (1 comment, 1 reply)
Original post (mysterium-coniunctionis):

For some reason, local Ollama models that support contexts larger than 4096 tokens do not seem to adjust to the larger context length in Chatbot-UI. Has this been addressed anywhere? If so, can you point me in the right direction to fix it on my localhost instance?
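A minimal sketch of one possible local workaround, assuming you can call Ollama's /api/chat endpoint directly (or adjust the request your UI sends): Ollama accepts a num_ctx value in the request's options field, so the context window can be raised per request regardless of what the UI reports. The function name chatWithLargerContext and the 8192 value are illustrative, not from this discussion; use whatever window your model actually supports.

```ts
// Sketch of a local workaround: request a larger context window per call
// by passing options.num_ctx to Ollama's /api/chat endpoint.
// Assumes Ollama is running on its default port (11434).
async function chatWithLargerContext(model: string, prompt: string) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: false,
      // Example value: ask for an 8k context window for this request.
      options: { num_ctx: 8192 },
    }),
  })
  const data = await res.json()
  return data.message?.content
}
```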
Reply:

Do you know if the Ollama API gives us the model context length? If so, I can fix this in like 5 minutes.
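For reference, a hedged sketch of reading a model's context length from a local Ollama server. It assumes a recent Ollama build whose /api/show response includes a model_info object with an "<architecture>.context_length" key; older builds may only expose a num_ctx override in the parameters text, so the sketch falls back to that. The function name getOllamaContextLength is illustrative.

```ts
// Sketch: ask a local Ollama server for a model's context length via /api/show.
// Returns null if the server doesn't report one.
async function getOllamaContextLength(model: string): Promise<number | null> {
  const res = await fetch("http://localhost:11434/api/show", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: model }),
  })
  if (!res.ok) return null
  const data = await res.json()

  // Newer responses: model_info contains e.g. "llama.context_length".
  const info = data.model_info ?? {}
  for (const [key, value] of Object.entries(info)) {
    if (key.endsWith(".context_length") && typeof value === "number") {
      return value
    }
  }

  // Fallback: look for an explicit "num_ctx" line in the parameters text.
  const match = (data.parameters ?? "").match(/num_ctx\s+(\d+)/)
  return match ? parseInt(match[1], 10) : null
}
```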