Premature close error thrown from fetch not handled #294

Closed · 1 task done
themez opened this issue Sep 7, 2023 · 6 comments
Labels
bug Something isn't working

Comments

themez commented Sep 7, 2023

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • This is an issue with the Node library

Describe the bug

When the node-fetch response stream emits a "Premature close" error, the error is not handled and the process exits with an uncaught error:

Error: Premature close
     at IncomingMessage.<anonymous> (/app/node_modules/.pnpm/node-fetch@2.6.9/node_modules/node-fetch/lib/index.js:1749:18)
     at Object.onceWrapper (node:events:627:28)
     at IncomingMessage.emit (node:events:513:28)
     at IncomingMessage.emit (node:domain:489:12)
     at emitCloseNT (node:internal/streams/destroy:132:10)
     at process.processTicksAndRejections (node:internal/process/task_queues:81:21)

Where node-fetch throws the error:
https://github.com/node-fetch/node-fetch/blob/v2.6.9/src/index.js#L156

The stream parser should handle the stream's error event here:
https://github.com/openai/openai-node/blob/master/src/streaming.ts#L253

To Reproduce

When a streaming OpenAI API call encounters an error, the process exits because the stream's error event is not handled.

Code snippets

No response

OS

macOS

Node version

v19.2.0

Library version

openai v4.3.1

themez added the bug (Something isn't working) label Sep 7, 2023
@rattrayalex (Collaborator)

Can you share code to reproduce this? Was that the full traceback, or was it truncated?

What happens if you wrap your for await in a try/catch block?

The code that you linked to seems to be for Node 14 & lower only; could it have been this block instead? Or could you be running in Node 14?

themez (Author) commented Sep 8, 2023

@rattrayalex you're right, I was referencing the wrong block location.

That was the full trace. Let me try to write a snippet that reproduces the error (it's not easy to reproduce against the official OpenAI API endpoint); I'll come back once it's done.

themez (Author) commented Sep 8, 2023

Here are simplified code snippets that reproduce this error. I copied some of the stream-to-async-iterator code from this library:

Server part (no Content-Length header):

import fastify from "fastify";
import { Readable } from "stream";
const server = fastify();

server.post("/v1", async (request, reply) => {
  const buffer = ["1", "2"];
  const response = new Readable({
    read() {
      const buf = buffer.shift();
      if (buf) {
        this.push(buf + "\n");
      } else {
        // Destroy instead of pushing null, so the connection is torn down
        // before the stream can end cleanly.
        this.destroy();
      }
    },
  });
  reply
    // .header("content-length", buffer.length)
    .send(response);
});

server.listen({ port: 3000 });

Client part:

import { Readable } from "stream";

async function readStream(body: Readable) {
  const iter = readableStreamAsyncIterable<any>(body);

  for await (const chunk of iter) {
    console.log(chunk);
  }
}

function readableStreamAsyncIterable<T>(stream: any): AsyncIterableIterator<T> {
  if (stream[Symbol.asyncIterator]) return stream;
  // omit code below because the stream has an async iterator API on recent Node versions
  // ...
}

async function main() {
  const result = await fetch("http://localhost:3000/v1", { method: "POST" });
  readStream(result.body as any); // not awaited and no try/catch: the stream error goes unhandled
}
main();

The root cause:

The server-side stream.destroy call causes an error event to be emitted on the client's stream object.

When reading a stream with async iteration, that error event needs to be handled, or it becomes an uncaught error.
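A minimal self-contained sketch of that behavior (names are illustrative): destroying a Readable with an error makes its async iterator reject, and without a try/catch that rejection goes unhandled and crashes the process:

import { Readable } from "stream";

async function demo() {
  const stream = new Readable({ read() {} });
  stream.push("chunk");
  // Destroying with an error emits an "error" event on the stream...
  setImmediate(() => stream.destroy(new Error("Premature close")));

  try {
    for await (const chunk of stream) {
      console.log(chunk.toString());
    }
  } catch (err) {
    // ...which async iteration surfaces as a rejection.
    console.error("caught:", (err as Error).message); // caught: Premature close
  }
}
demo();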

@rattrayalex
Copy link
Collaborator

rattrayalex commented Sep 9, 2023

Got it, thank you @themez. I believe this is expected behavior, then: you should wrap your for await in a try/catch to handle any errors, such as a stream being prematurely terminated:

const stream = await client.chat.completions.create(params); // params includes stream: true

try {
  for await (const chunk of stream) {
    console.log(chunk);
  }
  console.log('stream has successfully finished!'); // This should never print if the stream was destroyed prematurely.
} catch (err) {
  console.error('The stream did not finish successfully', err); // Could be due to a "Premature close" or other error event.
}
console.log("One way or another, we're done with the stream now.");

Note that the one exception to this is an abort triggered by the user; if you intentionally abort the stream, for example with a break statement, it won't raise an error but will merely end.
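For instance (a sketch along the same lines; params is again assumed to request a stream):

const stream = await client.chat.completions.create(params);

for await (const chunk of stream) {
  console.log(chunk);
  break; // user-initiated abort: the loop exits and the stream simply ends
}
console.log('No try/catch is needed for a deliberate break.');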

@themez
Copy link
Author

themez commented Sep 10, 2023

Got it. I thought the error would be caught inside the OpenAI SDK, since it loops over the iterator and converts the chunks to SSE objects; now I know that async iterator reads propagate the error. The same applies to pipe-based reads (see the pipeline sketch after the snippet below).

In my case, I turn the async generator into a Readable via Readable.from; here's how I finally caught this error:

import { Readable } from "stream";

const stream = await client.chat.completions.create(params);
const readable = Readable.from(stream);
readable.on("error", (e) => {
  console.log("read caught", e);
  readable.destroy();
});
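And the piping case mentioned above, as a self-contained sketch (the destination path is illustrative): an error on the source propagates through stream.pipeline to the awaiting caller instead of surfacing as an uncaught "error" event:

import { Readable } from "stream";
import { pipeline } from "stream/promises";
import { createWriteStream } from "fs";

async function pipeDemo() {
  const source = new Readable({ read() {} });
  source.push("partial data\n");
  // Simulate a premature close while the pipe is still running.
  setImmediate(() => source.destroy(new Error("Premature close")));

  try {
    await pipeline(source, createWriteStream("/tmp/out.txt"));
  } catch (err) {
    // The source error lands here rather than crashing the process.
    console.error("pipeline failed:", (err as Error).message);
  }
}
pipeDemo();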

@Cornelius000

Asking for clarification: I feel like this shouldn't happen, no? Does this mean the OpenAI API server ends the streaming too early in some cases? Why is that?
We get this error every 5 to 10 requests right now and don't really know why it happens.
