Premature close error thrown from fetch not handled #294
Can you share code to reproduce this? Was that the full traceback, or was it truncated? What happens if you wrap your `for await` loop in a try/catch? The code that you linked to seems to be for Node 14 & lower only; could it have been this block instead? Or could you be running in Node 14?
@rattrayalex you're right, I was referencing the wrong block location. That was the full trace. Let me try to write a snippet to reproduce this error (it's not easy to reproduce against the official OpenAI API endpoint); I'll come back once it's done.
Here are simplified code snippets to reproduce this error; I copied some of the stream-to-async-iterator code from this library:

**server part** (no content-length header)

```ts
import fastify from "fastify";
import { Readable } from "stream";

const server = fastify();

server.post("/v1", async (request, reply) => {
  const buffer = ["1", "2"];
  const response = new Readable({
    read() {
      const buf = buffer.shift();
      if (buf) {
        this.push(buf + "\n");
      } else {
        this.destroy();
      }
    },
  });
  reply
    // .header("content-length", buffer.length)
    .send(response);
});

server.listen({ port: 3000 });
```

**client part**

```ts
import { Readable } from "stream";

async function readStream(body: Readable) {
  const iter = readableStreamAsyncIterable<any>(body);
  for await (const chunk of iter) {
    console.log(chunk);
  }
}

function readableStreamAsyncIterable<T>(stream: any): AsyncIterableIterator<T> {
  if (stream[Symbol.asyncIterator]) return stream;
  // remaining code omitted because streams have the async iterator API on recent Node versions
  // ...
}

async function main() {
  const result = await fetch("http://localhost:3000/v1", { method: "POST" });
  readStream(result.body);
}

main();
```

**the root cause**

The `stream.destroy` function emits an `error` event. When reading a stream with async iteration, the `error` event needs to be handled, or it becomes an uncaught error.
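To illustrate that propagation, here is a minimal sketch, independent of node-fetch (the error message is chosen to mirror the one in this issue): destroying a stream with an error makes the async iterator's `next()` reject, so the error surfaces at the `for await` loop and must be caught there.

```ts
import { Readable } from "stream";

async function demo() {
  const stream = new Readable({
    read() {
      this.push("chunk");
      // Destroy with an error, similar to what node-fetch does on "Premature close".
      this.destroy(new Error("Premature close"));
    },
  });

  try {
    for await (const chunk of stream) {
      console.log("got:", chunk.toString());
    }
  } catch (err) {
    // Without this try/catch, the rejection becomes an uncaught error.
    console.error("caught:", (err as Error).message);
  }
}

demo();
```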
Got it, thank you @themez. I believe this is expected behavior, then - you should be wrapping your `for await` loop in a try/catch, like so:

```ts
const stream = await client.chat.completions.create(params);
try {
  for await (const chunk of stream) {
    console.log(chunk);
  }
  console.log('stream has successfully finished!'); // This should never print if the stream was destroyed prematurely.
} catch (err) {
  console.error('The stream did not finish successfully', err); // Could be due to a "Premature close" or other error event.
}
console.log("One way or another, we're done with the stream now.");
```

Note that the one exception to this is an …
Got it. I thought the error would be caught inside the OpenAI SDK, since it loops over the iterator and converts the chunks to SSE objects; now I understand that reading the async iterator propagates the error. The same applies when piping the reads. In my case, I turn the async generator into a readable via `Readable.from`:

```ts
const stream = await client.chat.completions.create(params);
const readable = Readable.from(stream);
readable.on("error", (e) => {
  console.log('read caught');
  readable.destroy();
});
```
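For the piping case mentioned above, a sketch under the same assumptions (the `sink` writable is illustrative): Node's `stream.pipeline` forwards source errors to the promise it returns, so a single try/catch covers the whole pipe instead of a separate "error" listener.

```ts
import { Readable, Writable } from "stream";
import { pipeline } from "stream/promises";

async function pipeDemo(source: Readable) {
  const sink = new Writable({
    write(chunk, _encoding, callback) {
      console.log("wrote:", chunk.toString());
      callback();
    },
  });

  try {
    await pipeline(source, sink);
  } catch (err) {
    // A "Premature close" on the source rejects the pipeline promise.
    console.error("pipeline failed:", err);
  }
}
```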
Asking for clarification: I feel like this shouldn't happen, no? Does this mean that the OpenAI API server ends the streaming too early in some cases? Why is that?
Confirm this is a Node library issue and not an underlying OpenAI API issue
**Describe the bug**

When the node-fetch response stream emits a "Premature close" error, the process exits with an uncaught error.

Where node-fetch throws the error:
https://github.com/node-fetch/node-fetch/blob/v2.6.9/src/index.js#L156

The stream parser should handle the stream's error event here:
https://github.com/openai/openai-node/blob/master/src/streaming.ts#L253
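A minimal sketch of the failure mode (the `body` variable is illustrative): a stream destroyed with an error and left without an "error" listener turns the event into an uncaught exception that terminates the process.

```ts
import { Readable } from "stream";

const body = new Readable({ read() {} });

// destroy(err) emits 'error'; with no listener attached and no consumer
// awaiting the stream, Node raises it as an uncaught exception and exits.
body.destroy(new Error("Premature close"));
```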
**To Reproduce**

When an OpenAI API call hits a stream error, the process exits because the stream's error event is not handled.
**Code snippets**

No response

**OS**

macOS

**Node version**

v19.2.0

**Library version**

openai v4.3.1