Temporary fix: for o1-xx models, convert systemMessage to aiMessage. #850
Conversation
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const chatModel = (ChainManager.chain as any).last.bound;
const modelName = chatModel.modelName || chatModel.model;
const isO1Model = modelName.startsWith("o1");
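Since o1 models reject system prompts, the temporary fix described in the title re-wraps system messages as AI messages before the request is sent. A minimal sketch of that idea, assuming the `isO1Model` flag from the snippet above; the helper name `adaptMessagesForO1` is hypothetical and not the PR's actual code:

```ts
import { AIMessage, BaseMessage, SystemMessage } from "@langchain/core/messages";

// TODO: hack for o1 models, to be removed once they support system prompts.
// Hypothetical helper: when the active model is an o1 model, re-wrap every
// SystemMessage as an AIMessage so the request is accepted by the API.
function adaptMessagesForO1(messages: BaseMessage[], isO1Model: boolean): BaseMessage[] {
  if (!isO1Model) return messages;
  return messages.map((msg) =>
    msg instanceof SystemMessage ? new AIMessage(msg.content as string) : msg
  );
}
```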
Add a TODO: hack for o1 models, to be removed when they support system prompt
Added my comments. Just merged your other PR; this one needs a rebase.
src/LLMProviders/chainManager.ts
Outdated
@@ -317,12 +317,26 @@ export default class ChainManager {
    this.validateChatModel();
    this.validateChainInitialization();

    // eslint-disable-next-line @typescript-eslint/no-explicit-any
    const chatModel = (ChainManager.chain as any).last.bound;
What's the difference between using this vs. "ChatModelManager.getChatModel()"?
@logancyang At first I planned to use ChatModelManager.getChatModel(), but I copied your code for consistency.
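For context, a rough sketch of the two access paths being compared; the import paths and the static call shape are assumptions for illustration, not verified against the repository:

```ts
// Sketch only: both expressions are expected to resolve to the same chat model
// instance, assuming ChatModelManager built the model that was bound into the chain.
// Import paths below are assumed for illustration.
import ChainManager from "./LLMProviders/chainManager";
import ChatModelManager from "./LLMProviders/chatModelManager";

// Path used in this PR: reach through the composed chain, mirroring existing code.
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const chatModelFromChain = (ChainManager.chain as any).last.bound;

// Alternative raised in review: ask the model manager directly.
const chatModelFromManager = ChatModelManager.getChatModel();
```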
We can install the latest @langchain/openai release, and o1 models support streaming now: langchain-ai/langchainjs#7229
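As a hedged illustration of that point, streaming from an o1 model with a recent `@langchain/openai` release might look like the sketch below; the `"o1-mini"` model name and the helper function are illustrative, not code from this repository:

```ts
import { ChatOpenAI } from "@langchain/openai";
import { HumanMessage } from "@langchain/core/messages";

// Sketch assuming a recent @langchain/openai release that includes
// langchain-ai/langchainjs#7229 (streaming support for o1 models).
async function streamO1Reply(prompt: string): Promise<void> {
  const model = new ChatOpenAI({ model: "o1-mini" });
  const stream = await model.stream([new HumanMessage(prompt)]);
  for await (const chunk of stream) {
    process.stdout.write(chunk.content as string);
  }
}
```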
Merging now, with one caveat: Copilot Plus still doesn't support o1 models because its chain runner uses more system messages. Right now I have no intention of enabling o1 models in Copilot Plus because
For the long term, o1 may support system messages; let's see.
related: #828