Portkey Integration with Autogen #3395
Conversation
@microsoft-github-policy-service agree company="Portkey"
Thanks for the PR, @siddharthsambharia-portkey. Could you also provide a notebook example so it
@jackgerrits -- can we please double-check the language for ecosystem?
@siddharthsambharia-portkey, thanks so much for the contribution! I have a few questions, if I may, regarding non-OpenAI providers (such as Anthropic, Cohere, Groq, Mistral, etc.):
I tried the sample code but I couldn't get it to load balance/fallback. I wanted it to use Anthropic and Groq, both in the same group chat of the sample code. Is there any way to test that? At the moment I'm trying to understand which AutoGen client class it is going to use when it load balances / falls back between different providers.
For provider-specific configuration parameters like top_k, you can add these fields as override_params in the config object. They are specified in the targets list of the Portkey config; see the config docs: https://docs.portkey.ai/docs/api-reference/config-object
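As an illustration (a hedged sketch only; the virtual key names and model IDs below are placeholders, and the exact schema is the one described in the config docs linked above), here is what such a config might look like as the Python dict you would hand to Portkey:

```python
# Sketch of a Portkey config with provider-specific override_params per target.
# Virtual keys and model names are placeholders, not values from this PR.
portkey_config = {
    "strategy": {"mode": "fallback"},  # or "loadbalance"
    "targets": [
        {
            "virtual_key": "anthropic-vk-placeholder",
            "override_params": {
                "model": "claude-3-5-sonnet-20240620",
                "top_k": 40,  # provider-specific parameter lives in override_params
            },
        },
        {
            "virtual_key": "groq-vk-placeholder",
            "override_params": {"model": "llama-3.1-70b-versatile"},
        },
    ],
}
```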
@gagb @marklysze I have resolved all the comments; is there anything else required to merge the PR? Thanks!
Consider extending the write-up with your answers to the questions that @marklysze asked -- others are likely to have similar questions. Otherwise looks good to go.
Yes, I will update the write-up and the Portkey docs as well!
Yes, appreciate the challenge here (as we've had to create specific client/provider classes to cater for these cases). @siddharthsambharia-portkey and @gagb, do we need to make a note of the preference for OpenAI API-compatible providers to ensure better compatibility?
Yes, that sounds like a good idea (at least for the current version of AutoGen).
Yes, we could do that ideally, but the models keep changing every few months, so you can never be certain. I think the user can decide what model works best for their use case.
@marklysze @gagb Is there anything else you are waiting for to merge this PR?
My only concern with not mentioning a possible incompatibility due to the messaging structure is users raising issues when they get an incompatible role order. I think the wording can be simple, perhaps something like:
You're right; this could impact the user experience. However, Portkey serves as an AI Gateway, allowing users to easily switch between different models without altering the request process. Alternatively, AutoGen can recommend the best models for agents, and Portkey can facilitate using those models. This setup gives users the flexibility to change models as new ones become available or for A/B testing, like switching from OpenAI to Llama 3.1.
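To sketch that idea (hedged; the parameter names follow the Portkey Python SDK and the model/key values are placeholders, not code from this PR): switching providers is a change to the Portkey routing details, while the AutoGen config keeps its OpenAI-compatible shape.

```python
# Sketch: switching an AutoGen agent from OpenAI to Llama 3.1 via Portkey.
# Only the model name and Portkey headers change; the config shape stays the same.
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

llm_config = {
    "config_list": [
        {
            "model": "llama-3.1-70b-versatile",  # previously e.g. an OpenAI model
            "api_key": "placeholder",            # provider credentials are managed by Portkey
            "base_url": PORTKEY_GATEWAY_URL,     # route the OpenAI-compatible call through Portkey
            "api_type": "openai",
            "default_headers": createHeaders(
                api_key="PORTKEY_API_KEY",          # placeholder
                virtual_key="groq-vk-placeholder",  # points Portkey at the new provider
            ),
        }
    ]
}
```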
@siddharthsambharia-portkey -- can you please fix the code formatting errors using
Sure, that's awesome. I will do that right away.
cc @gagb @marklysze
@siddharthsambharia-portkey, Mark's suggested text is still not there? Can we please add that :)
Yes, my bad, added it. @gagb
Looks good @siddharthsambharia-portkey, thanks!
@siddharthsambharia-portkey, can you please:
Done @marklysze, although there was already a blank line at the end of the file.
Hey @gagb @marklysze, LGTM?
@qingyun-wu @marklysze
Why are these changes needed?
Created documentation for integrating Portkey with AutoGen. It provides a brief overview of Portkey's features and explains how it can be used to bring AutoGen agents into production.
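For context, a minimal sketch of the kind of usage the documentation describes (hedged; keys and model names are placeholders and the exact snippet in the docs may differ):

```python
# Sketch: an AutoGen agent whose OpenAI-compatible requests are routed through Portkey.
from autogen import ConversableAgent
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders

llm_config = {
    "config_list": [
        {
            "model": "gpt-4o",
            "api_key": "OPENAI_API_KEY",      # placeholder
            "base_url": PORTKEY_GATEWAY_URL,  # send requests via the Portkey gateway
            "api_type": "openai",
            "default_headers": createHeaders(api_key="PORTKEY_API_KEY", provider="openai"),
        }
    ]
}

assistant = ConversableAgent(name="assistant", llm_config=llm_config)
reply = assistant.generate_reply(messages=[{"role": "user", "content": "Hello!"}])
print(reply)
```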
Related issue number
Closes #3394
Checks