I see that Codestral is recommended for both chat and completion. Can Tabby use the same instance of Codestral for both of these tasks? Also, can I use a quantized model (like this one)?
Replies: 1 comment
Hi @abceleung,

Thank you for trying Tabby!

Yes, you can use the same instance when you specify the same model in both options. A simple example is running Tabby with the same model designated for both completion and chat:

```shell
tabby serve --model Codestral-22B --chat-model Codestral-22B
```

By default, Tabby uses the Q8 quantized model. If you need a different quantization, you can fork the repository at https://github.com/tabbyml/registry-tabby to create your own registry.
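For the custom-registry route, the rough workflow might look like the following. This is a hedged sketch, not official instructions: the exact registry file layout and the model-naming scheme for custom registries are assumptions here, so check the registry-tabby repository's README for the authoritative details.

```
# Hypothetical outline (placeholder names; verify against the registry-tabby README):
1. Fork https://github.com/tabbyml/registry-tabby under your own GitHub account.
2. In your fork, add an entry pointing at the quantized model artifact you want to serve.
3. Launch Tabby referencing the model from your registry, e.g.:
   tabby serve --model <your-org>/<your-model> --chat-model <your-org>/<your-model>
```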