Delivering multiple system messages is not supported by the local model deployed by the fschat-wrapped OpenAI API, which only supports a single system message #595
Comments
It looks to me like these APIs would also have a problem with the "name" attribute, which is important for Group Chat scenarios. I'm not sure compatibility will be easy to achieve. @LittleLittleCloud
For the multi-system-message issue, you can override the
Multiple 'system' messages are also causing a problem when using the LiteLLM proxy with Claude 3 models. Is there a workaround for this issue?
Here is a potential fix; would you like to try implementing it? #1861
Increasingly, I think we need middleware for messages. The format AutoGen uses should be able to diverge from the format used by the LLMs, with messages transformed just before making the LLM call.
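A minimal sketch of what such middleware could look like. Everything here is hypothetical (`MessageTransformMiddleware`, `keep_single_system_message`, and the `create` signature are illustrative names, not an actual AutoGen API): a wrapper that runs a list of transforms over the message list just before delegating to the underlying client.

```python
# Hedged sketch of a message-transform middleware; class and function names
# are illustrative, not part of AutoGen's real API.
class MessageTransformMiddleware:
    """Wraps an LLM client and rewrites messages just before the call,
    so the internal message format can diverge from what the model accepts."""

    def __init__(self, client, transforms):
        self._client = client
        self._transforms = transforms  # callables: list[dict] -> list[dict]

    def create(self, messages, **kwargs):
        for transform in self._transforms:
            messages = transform(messages)
        return self._client.create(messages=messages, **kwargs)


def keep_single_system_message(messages):
    """Transform: merge all 'system' messages into one leading message,
    for endpoints (like fschat) that keep only a single system message."""
    system = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if not system:
        return rest
    return [{"role": "system", "content": "\n".join(system)}] + rest
```

With this shape, the fschat and Claude 3 issues above become per-endpoint transforms rather than changes to the core message format.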
I'm getting the following error when using group chats with Claude-3; I believe it might be related to this issue. I followed the configuration example in the documentation.
* Add confirmation for messages produced by Coder
* Change confirmation to be on executor
It's unlikely that this will be implemented for 0.2, and the 0.4 architecture obviates the need, I think. Please reopen if you object.
GroupChatManager sends the model's 'messages', which include multiple system messages. However, for wrapped OpenAI APIs like fschat, each new system message overrides the previous setting, so only one is retained.
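To illustrate the problem and one possible workaround: the sketch below shows the kind of payload GroupChatManager can produce (several `system`-role entries) and a hypothetical helper, `merge_system_messages`, that collapses them into a single leading system message so that fschat-style endpoints don't silently drop all but one. The helper is not part of AutoGen; it's only an assumption about how the fix in #1861 might look.

```python
# Hypothetical helper, not an AutoGen API: collapse all 'system' messages
# into a single leading one, since fschat-style endpoints retain only one.
def merge_system_messages(messages):
    system_parts = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if not system_parts:
        return rest
    merged = {"role": "system", "content": "\n".join(system_parts)}
    return [merged] + rest


# Example: the kind of message list a group chat can send.
messages = [
    {"role": "system", "content": "You are the group chat manager."},
    {"role": "user", "content": "Hello"},
    {"role": "system", "content": "Speak as agent 'coder'."},
]
merged = merge_system_messages(messages)
# `merged` now has exactly one system message, placed at the front.
```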