ChatOpenAi: Streaming blocked w/ response_format "json_schema" #6884
Comments
I found a similar discussion regarding the issue you're facing: ChatOpenAi: Streaming blocked w/ response_format "json_schema" [1]. This discussion highlights that the latest version of LangChain.js does not support streaming with the `json_schema` response format. However, recent tests indicate that streaming with `json_schema` does work, as in this example:

```typescript
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";
import { ChatOpenAI } from "../chat_models.js";
import { concat } from "@langchain/core/utils/stream";
import { AIMessageChunk } from "@langchain/core/messages";

const weatherSchema = z.object({
  city: z.string().describe("The city to get the weather for"),
  state: z.string().describe("The state to get the weather for"),
  zipCode: z.string().describe("The zip code to get the weather for"),
  unit: z.enum(["fahrenheit", "celsius"]).describe("The unit to get the weather in"),
});

const model = new ChatOpenAI({
  model: "gpt-4o-2024-08-06",
}).bind({
  response_format: {
    type: "json_schema",
    json_schema: {
      name: "get_current_weather",
      description: "Get the current weather in a location",
      schema: zodToJsonSchema(weatherSchema),
      strict: true,
    },
  },
});

const stream = await model.stream("What is the weather in San Francisco, 91626 CA?");
let full: AIMessageChunk | undefined;
for await (const chunk of stream) {
  full = !full ? chunk : concat(full, chunk);
}
console.log(full);
```

To restore this functionality, ensure you are using the latest version of LangChain.js and verify that your implementation aligns with the updated tests and examples [2].
Thanks @dosu. A couple notes:
The latest version of LangChain.js blocks streaming with the `json_schema` response format via this check:

```typescript
if (
  params.response_format &&
  params.response_format.type === "json_schema"
) {
  console.warn(
    `OpenAI does not yet support streaming with "response_format" set to "json_schema". Falling back to non-streaming mode.`
  );
  const res = await this._generate(messages, options, runManager);
  const chunk = new ChatGenerationChunk({
    message: new AIMessageChunk({
      ...res.generations[0].message,
    }),
    text: res.generations[0].text,
    generationInfo: res.generations[0].generationInfo,
  });
  yield chunk;
  return runManager?.handleLLMNewToken(
    res.generations[0].text ?? "",
    undefined,
    undefined,
    undefined,
    undefined,
    { chunk }
  );
}
```

This code explicitly checks if the `response_format` type is `"json_schema"` and, if so, falls back to non-streaming mode. If you believe this decision is a mistake and OpenAI does support streaming with `json_schema`, it may be worth raising with the maintainers.
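The guard-and-fallback pattern in the snippet above can be reduced to a self-contained sketch. Everything here (`Params`, `streamOrFallback`, and its arguments) is illustrative, not part of the LangChain API; it only demonstrates the control flow being discussed:

```typescript
// Illustrative shape of the call options relevant to the guard.
interface Params {
  response_format?: { type: string };
}

// Illustrative guard: if the caller requested a response_format the
// streaming path supposedly does not support, fall back to a single
// non-streamed result, yielded as one chunk.
async function* streamOrFallback(
  params: Params,
  streamImpl: () => AsyncGenerator<string>,
  generateImpl: () => Promise<string>
): AsyncGenerator<string> {
  if (params.response_format?.type === "json_schema") {
    console.warn("Falling back to non-streaming mode.");
    yield await generateImpl(); // entire response arrives as one chunk
    return;
  }
  yield* streamImpl(); // normal path: chunks arrive incrementally
}
```

Removing the `if` branch (as the issue author did by editing `dist/chat_models.js`) simply routes `json_schema` requests back through the normal streaming path.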
Hey, yes it looks like this is now supported. Will patch, thanks for flagging!
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
```
OpenAI does not yet support streaming with "response_format" set to "json_schema". Falling back to non-streaming mode.
```
From: https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-openai/src/chat_models.ts#L1213-L1237
Description
Ref: #6821.
My issue is an unexpected, and seemingly unnecessary, reduction in capability with a recent release.
I am using ChatOpenAI with the new option for response_format json_schema. I am attempting to enable this with a streamed output. The .stream() method was previously working with response_format json_schema (meaning the response was streamed in multiple chunks and adhered to the provided schema).
With the latest @langchain/openai I am receiving a warning:

> OpenAI does not yet support streaming with "response_format" set to "json_schema". Falling back to non-streaming mode.

Along with this warning, the streaming sequence is changed to a non-streaming sequence. I can go into the @langchain/openai/dist/chat_models.js dependency and remove the check for json_schema mode, after which streaming is once again working. Therefore the block on this functionality is unnecessary.
System Info
```
npm info @langchain/openai
```