Prepare for the new GPT 3.5 Turbo model #617
Merged
Description of the Change
As part of their dev day last week, OpenAI announced that the base GPT 3.5 Turbo model will be updated to support a 16K context length starting on December 11th. This model can be used now by referencing `gpt-3.5-turbo-1106` instead of the generic `gpt-3.5-turbo`. I considered doing that, but keeping our reference to `gpt-3.5-turbo` ensures we are always using the latest version of that model, which I would prefer.

This PR just changes the max context length we consider, from the previous 4,096 to the new 16,385. It should be merged and released sometime between now and December 11th, when this becomes the new default context length (releasing sooner is not a huge risk, though it could mean users run into max token length errors).
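As a rough illustration of the kind of change involved (the constant and function names below are hypothetical, not the plugin's actual code), the update amounts to raising a context-length constant and letting any token budgeting flow from it:

```python
# Hypothetical sketch of the constant this PR updates.
MODEL = "gpt-3.5-turbo"  # generic alias; always resolves to the latest model version
MAX_CONTEXT_LENGTH = 16_385  # raised from the previous 4_096


def max_completion_tokens(prompt_tokens: int, reserved: int = 0) -> int:
    """Tokens left for the completion once the prompt (and any
    reserved headroom) is subtracted from the context window."""
    return max(MAX_CONTEXT_LENGTH - prompt_tokens - reserved, 0)
```

Until December 11th, requests that rely on the larger budget against the generic `gpt-3.5-turbo` alias could exceed the still-active 4,096 limit, which is the max-token-length error risk mentioned above.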
How to test the Change
Verify any OpenAI ChatGPT features still work as expected
Changelog Entry
Credits
Props @dkotter
Checklist: