Proxy PR for Long Context Capability 1513 #1591
Conversation
Proxy PR for Long Context
You can ask authors of other contrib-openai tests. They've all gone through this.
I've gone through this process, but I don't remember the details. We should really document it in a prominent place.
Or find a better solution.
Relevant to #1478
* Add new capability to handle long context
* Make print conditional
* Remove superfluous comment
* Fix msg order
* Allow user to specify max_tokens
* Add ability to specify max_tokens per message; improve name
* Improve doc and readability
* Add tests
* Improve documentation and add tests per Erik and Chi's feedback
* Update notebook
* Update doc string of add to agents
* Improve doc string
* Improve notebook
* Update GitHub workflows for context handling
* Update docstring
* Update notebook to use raw config list
* Update contrib-openai.yml: remove _target
* Fix code formatting
* Fix workflow file
* Update .github/workflows/contrib-openai.yml

Co-authored-by: Eric Zhu <ekzhu@users.noreply.github.com>
Co-authored-by: Chi Wang <wang.chi@microsoft.com>
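For context, here is a minimal sketch of how the long-context capability described in these commits might be attached to an agent. The import path, the class name `TransformChatHistory`, and the exact parameter names are assumptions inferred from the commit messages above (max_tokens, max_tokens per message, an "add to agent" method) and may not match the merged code exactly.

```python
# Minimal sketch (assumed API): attach a long-context handling capability
# to an AssistantAgent so chat history is truncated before each LLM call.
import autogen
# Assumed module path and class name; check the merged code for the real ones.
from autogen.agentchat.contrib.capabilities.context_handling import TransformChatHistory

config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)

# Limit both the size of each individual message and the total history
# passed to the model (parameter names are assumptions).
context_handling = TransformChatHistory(
    max_tokens_per_message=1000,  # truncate each message to ~1000 tokens
    max_tokens=4000,              # keep the overall history under ~4000 tokens
)
context_handling.add_to_agent(assistant)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# The capability trims the conversation transparently as it grows.
user_proxy.initiate_chat(assistant, message="Summarize this very long document ...")
```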
Why are these changes needed?
This is a new PR in lieu of #1513 because that one was made from a fork, and @sonichi wanted it to come from the autogen repo.
So I created a new branch in microsoft/autogen, merged my fork into it, and opened this PR.
Related issue number
#1513
Checks