Regression in support of customized "role" in OpenAI compatible API (v.0.4.2) #4755
This is likely the result of #4355, which made the following change:

```python
class CustomChatCompletionContentPartParam(TypedDict, total=False):
    __pydantic_config__ = ConfigDict(extra="allow")  # type: ignore

    type: Required[str]
    """The type of the content part."""


ChatCompletionContentPartParam = Union[
    openai.types.chat.ChatCompletionContentPartParam,
    CustomChatCompletionContentPartParam]


class CustomChatCompletionMessageParam(TypedDict, total=False):
    """Enables custom roles in the Chat Completion API."""

    role: Required[str]
    """The role of the message's author."""

    content: Union[str, List[ChatCompletionContentPartParam]]
    """The contents of the message."""

    name: str
    """An optional name for the participant.

    Provides the model information to differentiate between participants of
    the same role.
    """


ChatCompletionMessageParam = Union[
    openai.types.chat.ChatCompletionMessageParam,
    CustomChatCompletionMessageParam]


class ChatCompletionRequest(OpenAIBaseModel):
    messages: List[ChatCompletionMessageParam]
    ...  # The rest is the same as the OpenAI API
```
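To illustrate the kind of request the issue is about, here is a minimal sketch of a chat completion request body that uses a non-standard role. The model name `my-model` and the role `agent-2` are purely illustrative assumptions; the point is that a message's `role` field is not limited to `system`/`user`/`assistant` once the custom `TypedDict` variants above are part of the union.

```python
import json


def build_request(model, messages):
    """Serialize an OpenAI-style chat completion request body to JSON."""
    return json.dumps({"model": model, "messages": messages})


body = build_request(
    "my-model",  # illustrative model name
    [
        {"role": "system", "content": "You are a scheduling agent."},
        # A customized role, as described in the issue: rejected by the
        # stricter typing in v0.4.2, accepted again once
        # CustomChatCompletionMessageParam is part of the union.
        {"role": "agent-2", "content": "Propose a meeting time."},
    ],
)
print(body)
```

Such a body would be POSTed to the server's `/v1/chat/completions` endpoint as usual; only the server-side request validation decides whether the custom role is accepted.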
Thank you for the PR @DarkLight1337. I was wondering why my data pipeline stopped working when I upgraded vLLM.
Thank you, @simon-mo, @DarkLight1337 and @Tostino!
Merged. |
Discussed in #4745
Originally posted by tanliboy May 10, 2024
Hi vLLM team,
We have been using vLLM for serving models, and it has gone really well. We have been using the OpenAI-compatible API along with our customized "role" values for different entities. However, when we upgraded to v0.4.2 recently, we found that customized "role" values are no longer supported and the role is limited to "system", "user", and "assistant".
I understand that this tightly aligns with OpenAI's chat completion role definition; however, it limits the ability to define custom roles alongside fine-tuning. Moreover, we are also seeing a trend (including the recent Llama 3 chat template) toward supporting different roles for multi-agent interactions.
Could you bring back the previous support for customized roles in the OpenAI-compatible chat completion API?
Thanks,
Li