[Bug]: ToolCall IDs generated by Mistral tool call parser do not comply with Mistral tool calls and template constraints #9019
Comments
Hi! Yes, this is correct - vLLM's standard tool call ID format is incompatible with Mistral's 9-character (I think?) alphanumeric tool call ID. Instead of trying to alter vLLM's internal standard for tool call IDs to be compatible with Mistral, I opted to transform them in the provided chat templates to be compatible with Mistral's format. Details on Mistral tool calling & chat templates are here in the docs. To resolve this, I recommend using the provided chat template.
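For context, a minimal sketch of the mismatch (assuming Mistral expects exactly 9 alphanumeric characters, as described above). vLLM's default IDs are long and prefixed, so a compliant ID has to be derived from them; the snippet below is illustrative and is not the actual chat template logic.

```python
import re
import uuid

# vLLM-style default ID: long, prefixed, and containing dashes.
vllm_id = f"chatcmpl-tool-{uuid.uuid4()}"

# Assumed Mistral constraint: exactly 9 alphanumeric characters.
# One illustrative way to derive a compliant ID is to strip non-alphanumerics
# and keep the last 9 characters; this is an assumption, not the template's code.
mistral_id = re.sub(r"[^a-zA-Z0-9]", "", vllm_id)[-9:]

assert re.fullmatch(r"[a-zA-Z0-9]{9}", mistral_id)
print(vllm_id, "->", mistral_id)
```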
Since before my implementation, tool calls have been generated using the following:

```python
class ToolCall(OpenAIBaseModel):
    id: str = Field(default_factory=lambda: f"chatcmpl-tool-{random_uuid()}")
    type: Literal["function"] = "function"
    function: FunctionCall
```

I opted not to change this, but it could probably be overridden in the Mistral tool parser for anyone who wants to rely on the default Mistral chat template.
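Building on the `ToolCall` model quoted above, a hedged sketch of what such an override might look like (the subclass name, the 9-character length, and the helper function are assumptions for illustration, not vLLM's actual code):

```python
import random
import string

from pydantic import Field

# Alphabet assumed to satisfy Mistral's "alphanumeric" constraint.
_ALPHANUMERIC = string.ascii_letters + string.digits


def _generate_mistral_tool_id(length: int = 9) -> str:
    """Return a random ID in the short alphanumeric form Mistral templates expect."""
    return "".join(random.choices(_ALPHANUMERIC, k=length))


# Hypothetical override of the ToolCall model shown above: the Mistral tool
# parser could emit this variant so IDs are already compliant and the chat
# template does not need to rewrite them.
class MistralToolCall(ToolCall):
    id: str = Field(default_factory=_generate_mistral_tool_id)
```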
@K-Mistele, thanks for providing the details! I proposed in this PR to use a dedicated …
Your current environment
The output of `python collect_env.py`
Model Input Dumps
No response
🐛 Describe the bug
IDs generated for tool calls parsed by the Mistral parser (flag `--tool-call-parser=mistral`) do not respect the Mistral ToolCall ID naming constraints and therefore cannot be used in a subsequent function-call workflow (the error from the template is raised).
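A small reproduction of the failure mode described here (the constraint enforced by the template is assumed to be exactly 9 alphanumeric characters, per the discussion above):

```python
import re

# Assumed constraint from the Mistral chat template: exactly 9 alphanumeric characters.
MISTRAL_ID_PATTERN = re.compile(r"^[a-zA-Z0-9]{9}$")

def is_valid_mistral_tool_call_id(tool_call_id: str) -> bool:
    return MISTRAL_ID_PATTERN.match(tool_call_id) is not None

# An ID in vLLM's default format fails the check, which is why feeding it back
# into a follow-up request raises the template error.
print(is_valid_mistral_tool_call_id("chatcmpl-tool-8f1a2b3c4d5e"))  # False
print(is_valid_mistral_tool_call_id("Ab3dE9xYz"))                   # True
```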