I tested Mistral-Large-2407 with v0.6.3.post1 and got really strange results when using a long context; with a small context it worked well.
With v0.6.1.post2 everything works as expected.
The output looks like this:
Abbetzeilen: OFFENDELETZuswelternationalis von dem Aufgültigkeitschiff OFFENDE OFFENDEUTBereichter: OFFENDE ENDE OFFENGebsowie die Aufgaben: OFFENDETAUmeter: OFFENFOR OFFEN OFFENDESTELLIPS-
Stätzugennt-
Stellungskomitestellung durch den Aufgeler:
22
• räus-
</ AufgStell OFFENG
</
</contextsowie geordnungsfür
</output OFFENDE
(Aufungsl.zugenä OFFENDESpezioffnung durch den Aufgabarbei. OFFENDE OFFENF OFFENDE OFFENDE Dokritänder:
</textAufgabenstellt AufgAbbbittungabeitzei. OFFENDESto Aufgabus OFFENDEKontroller: Aufgaben: Aufgessen Aufgaben OFFENDE – OFFEN) OFFENDE ENDETAusallem OFFEN DE -
</ Aufgaben sicherweiteration: Aufgabeiten-
</
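The garbled output above only shows up once the prompt is long; a minimal way to trigger it is to pad a question with filler text and send it to the server defined in the compose file below. This is a hedged sketch: the filler text and repeat count are made up, while the port (8081), API key (kitch_vllm), and served model name (mistral-large-123b) are taken from the compose configuration.

```python
def build_long_prompt(n_repeats: int = 2000) -> str:
    """Pad a question with filler so the prompt is well past a 'small' context."""
    filler = "Dies ist ein Fülltext. " * n_repeats  # arbitrary German filler
    return filler + "\nFasse den obigen Text in einem Satz zusammen."

def run_repro() -> str:
    """Send the long prompt to the vLLM OpenAI-compatible server from the compose file."""
    from openai import OpenAI  # imported lazily; requires the `openai` package

    client = OpenAI(base_url="http://localhost:8081/v1", api_key="kitch_vllm")
    resp = client.chat.completions.create(
        model="mistral-large-123b",
        messages=[{"role": "user", "content": build_long_prompt()}],
        max_tokens=200,
    )
    return resp.choices[0].message.content

# Calling run_repro() against the v0.6.3.post1 container reproduces the
# garbled output; against v0.6.1.post2 the summary comes back as expected.
```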
Your current environment
vllm-tgi:
  container_name: vllm-tgi
  image: vllm/vllm-openai:v0.6.3.post1
  restart: always
  shm_size: "16gb"
  command: "--model /model --served-model-name mistral-large-123b --tensor-parallel-size 4 --port 8081 --api-key kitch_vllm --tokenizer_mode mistral --load_format safetensors --config_format mistral"
  ports:
    - "8081:8081"
  environment:
    - HTTP_PROXY=
    - HTTPS_PROXY=
    - http_proxy=
    - https_proxy=
  volumes:
    - ${PATH_MODEL_GENERATION}:/model
  deploy:
    resources:
      reservations:
        devices:
          - driver: nvidia
            device_ids: [ '0', '1', '2', '3' ]
            capabilities: [ gpu ]
Model Input Dumps
No response