
stream.pipeline destroys writable stream when error is occurred #26311

Closed
yosuke-furukawa opened this issue Feb 26, 2019 · 24 comments
Labels
feature request · stale · stream

Comments

@yosuke-furukawa
Member

  • Version: v11.10.0
  • Platform: Mac OS Darwin 16.7.0
  • Subsystem: stream, http

stream.pipeline is helpful for handling errors and it interoperates with Promises.
However, I found a behavior that does not suit my use case.

I am creating a web server with stream.pipeline.
If my readable stream emits an error like "file not found", I would like to send an error response to my clients.

A code example follows:

const fs = require('fs')
const http = require('http')
const { pipeline } = require('stream')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  pipeline(r, res, (err) => {
    if (err) {
      console.log(err) // No such file
      return res.end("error!!!"); // write error response but this message is not shown.
    }
  })
})

server.listen(9000)

I have investigated Node.js core: stream.pipeline destroys the writable stream when an error occurs.
https://github.com/nodejs/node/blob/master/lib/internal/streams/pipeline.js#L42

so the above code cannot send an error response.

Question

Is this an expected behaviour?
In my understanding, the writable stream should be closed eventually, but in this case we would like to close the writable stream manually.

In this situation, do we need to create a custom writable stream?

I could send a PR that adds an option like { destroyOnError: false } to avoid destroying automatically.
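For illustration, a sketch of how such an option might be used (destroyOnError is hypothetical and does not exist in Node.js):

const fs = require('fs')
const http = require('http')
const { pipeline } = require('stream')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  // Hypothetical option from the proposal above; NOT a real Node.js API.
  pipeline(r, res, { destroyOnError: false }, (err) => {
    if (err) {
      console.log(err)    // No such file
      res.end('error!!!') // res would still be writable here
    }
  })
})

server.listen(9000)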

@mcollina
Member

The whole reason of pipeline is to call destroy(). If you don’t want this behavior you can use plain old source.pipe().
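For reference, a minimal sketch of the plain .pipe() alternative: unlike pipeline, .pipe() does not destroy the destination when the source errors, so the response stays writable.

const fs = require('fs')
const http = require('http')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  r.on('error', (err) => {
    console.log(err)    // No such file
    res.end('error!!!') // res was not destroyed by pipe(), so this is sent
  })
  r.pipe(res)
})

server.listen(9000)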

@yosuke-furukawa
Member Author

Hm. I would like to use Promises and streams seamlessly, and according to the API docs, stream.pipeline seems helpful for my situation:

const fs = require('fs')
const zlib = require('zlib')
const util = require('util')
const stream = require('stream')

const pipeline = util.promisify(stream.pipeline);

async function run() {
  await pipeline(
    fs.createReadStream('archive.tar'),
    zlib.createGzip(),
    fs.createWriteStream('archive.tar.gz')
  );
  console.log('Pipeline succeeded.');
}

Anyway, I understand that stream.pipeline should call destroy() automatically.

The destroy function does not receive the error object

I have tried creating a custom Writable stream so that I can handle destruction manually.

const fs = require('fs')
const http = require('http')
const { pipeline, Writable } = require('stream')

class ResponseWriter extends Writable {
  constructor(options) {
    super(options);
    this.res = options.res;
    this.data = '';
  }

  _write(chunk, encoding, callback) {
    this.res.write(chunk);
    this.data += chunk;
    callback();
  }

  _final(callback) {
    this.res.end();
    callback();
  }

  _destroy(err, callback) {
    console.log("destroy", err); // does not show the error: pipeline calls destroy() with no argument
    this.res.end("ERROR!! " + err);
    this.res.destroy(err);
    callback(err);
  }
}

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  const w = new ResponseWriter({
    res,
  });
  pipeline(r, w, (err) => {
    console.log(err) // No such file
  });
})

server.listen(9000)

But this solution does not work well, because _destroy does not receive the err object.

In Node.js core, stream.pipeline does not pass the err object to the destroy function:

if (typeof stream.destroy === 'function') return stream.destroy(); // destroy(err) ??

Is this expected behavior? If not, I will send a PR.

@mcollina
Member

mcollina commented Mar 2, 2019

This is expected behavior. .destroy(err) is supposed to be called with the error that caused the current stream to fail. If you always want to handle the "outer" logic, you should do that in the pipeline callback (or returned promise). Note that you don't want to call stream.destroy(err), because it will cause all streams to emit 'error' events, which can cause nasty side effects because of missing 'error' handlers. I don't think we can support that.
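A minimal sketch of handling the "outer" logic in the returned promise, using util.promisify as earlier in the thread (serveFile is a hypothetical helper name):

const util = require('util')
const stream = require('stream')
const pipeline = util.promisify(stream.pipeline)

async function serveFile(r, res) {
  try {
    await pipeline(r, res)
  } catch (err) {
    // Handle the "outer" logic here instead of in _destroy. Note that
    // pipeline has already destroyed res by this point, so we can log
    // and clean up, but not write an error body.
    console.log(err)
  }
}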

Note that from an HTTP standpoint, adding this.res.end("ERROR!! " + err); after a byte has been sent is not correct, as you'll be appending to data that is already flowing.

I think the source of these problems is not pipeline or pipe, but rather that the implementation of HttpResponse does not match what you'd expect. Essentially, if you are piping a stream into a response, you want to handle error conditions on a case-by-case basis, making pipeline useless for you.

@lpinca lpinca added the stream Issues and PRs related to the stream subsystem. label Mar 3, 2019
@yosuke-furukawa
Member Author

Essentially, if you are piping a stream into a response, you want to handle error conditions on a case-by-case basis, making pipeline useless for you.

I agree with that, but my opinion is different.

It would be better for pipeline to accept an option (pipeline(...streams, [options], callback)) so errors can be handled on a case-by-case basis, just as stream.finished does (finished(stream[, options], callback)).

https://nodejs.org/api/stream.html#stream_stream_finished_stream_options_callback

I know the HttpResponse use case is a little different from other stream use cases (http.request, fs read/write), but other TCP/UDP network servers built on streams, such as FTP servers, would also want to return an error response to their clients; this use case is not so rare.
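For comparison, stream.finished already accepts a per-call options object (the documented readable and writable flags); a brief sketch:

const fs = require('fs')
const { finished } = require('stream')

const rs = fs.createReadStream('archive.tar')

// Only wait for the readable side to finish; the suggestion above is that
// pipeline could accept a similar options argument.
finished(rs, { readable: true, writable: false }, (err) => {
  if (err) console.error('stream failed', err)
})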

@mcollina
Member

mcollina commented Mar 4, 2019

I'm willing to discuss adding new features/options to pipeline as long as the default behavior stays consistent. Send a PR!

@kharandziuk

kharandziuk commented Jul 3, 2019

@yosuke-furukawa
I believe you can solve your issue with a PassThrough stream. See the code below:

const express = require('express')
const sharp = require('sharp')
const stream = require('stream')
const { promisify } = require('util')

const app = express()

app.post('/resize/:height/:width', (req, res) => {
  const { height, width } = req.params
  const resizer = sharp()
    .resize({
      height: Number(height),
      width: Number(width)
    })
    .jpeg()
  const psToRes = new stream.PassThrough() // <-----------
  promisify(stream.pipeline)(
    req,
    resizer,
    psToRes // <-----------
  ).catch((e) => {
    console.log(e)
    res.sendStatus(400)
  })
  psToRes.pipe(res) // <-----------
})

@mcollina what do you think? Does it make sense to add this trick (with PassThrough) to the documentation? This problem comes up quite often.

@mcollina
Member

mcollina commented Jul 3, 2019

I don't understand how the above trick helps.

@kharandziuk

A changed code sample from @yosuke-furukawa:

const fs = require('fs')
const http = require('http')
const stream = require('stream')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  const ps = new stream.PassThrough()
  stream.pipeline(
    r,
    ps,
    (err) => {
      if (err) {
        console.log(err) // No such file
        return res.end("error!!!"); // res survives, so this message IS sent
      }
    }
  )
  ps.pipe(res)
})

server.listen(9000)

Then we run the server and call:

>curl localhost:9000
error!!!%

The server output:

node index.js
[Error: ENOENT: no such file or directory, open './foo2.txt'] {
  errno: -2,
  code: 'ENOENT',
  syscall: 'open',
  path: './foo2.txt'
}

So I achieve the expected behavior in a simple way, without changing the interface: pipeline only destroys the streams passed to it (r and ps), while res is attached through a separate pipe() call and therefore stays writable. Do you see my point now?

@mcollina
Member

mcollina commented Jul 4, 2019

I still don't understand why adding a PassThrough in the middle fixes this problem.
I think this is documented behavior to some extent: pipeline destroys all the streams when there is an error. Note that sending res.end() could create issues, because in the generic case we could be in the middle of a transfer, so we should not really send anything.

The correct fix for this issue is to wait until the file has been opened correctly, and then use pipeline. If the file cannot be opened, we should send that message to the user. That's what https://www.npmjs.com/package/send does.
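A minimal sketch of that approach, hand-rolled for illustration (this is not the actual implementation of the send package):

const fs = require('fs')
const http = require('http')
const { pipeline } = require('stream')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  const onOpenError = (err) => {
    // Opening failed (e.g. ENOENT): nothing has been written yet, so we
    // can still send a proper error response.
    console.log(err)
    res.statusCode = 500
    res.end('error!!!')
  }
  r.once('error', onOpenError)
  r.once('open', () => {
    // The file opened successfully; from here on, pipeline's
    // destroy-on-error behavior is appropriate.
    r.removeListener('error', onOpenError)
    pipeline(r, res, (err) => {
      if (err) console.log(err)
    })
  })
})

server.listen(9000)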

@kharandziuk

kharandziuk commented Jul 4, 2019

I think this is documented behavior to some extent: pipeline destroys all the streams when there is an error.

Yep, but in the case of an HTTP response you don't want to destroy it.

Note that sending res.end()

I just changed the code from @yosuke-furukawa.

The correct fix for this issue

It's not about this particular case; this case is just simple enough to show the concept. There are a lot of cases where you need to pipe some read stream through a transformation stream and into the response (e.g., see my image-resizing sample). And you need a way to handle ALL the errors that can appear.

That was probably the reason why the original pump from @mafintosh returned the last stream.
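For reference, a sketch of that shape with pump (assuming npm install pump); pump hands back the last stream so you can keep piping from it, just like the PassThrough trick above:

const fs = require('fs')
const http = require('http')
const { PassThrough } = require('stream')
const pump = require('pump')

const server = http.createServer((req, res) => {
  const r = fs.createReadStream('./foo2.txt')
  // pump returns the last stream in the chain; res is not part of the
  // chain, so it survives when pump destroys the failed streams.
  const last = pump(r, new PassThrough(), (err) => {
    if (err) {
      console.log(err)    // No such file
      res.end('error!!!')
    }
  })
  last.pipe(res)
})

server.listen(9000)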

@jasnell jasnell added the feature request Issues that request new features to be added to Node.js. label Jun 26, 2020
@rickyes
Contributor

rickyes commented Jun 28, 2020

I am in favour of adding options to pipeline and will send a PR if I can.

@mcollina
Member

We'll be happy to evaluate that change, send it over.

@ronag
Member

ronag commented Jul 1, 2020

What about:

const r = fs.createReadStream('./foo2.txt')
await finished(pipeline(r, transform, (err) => {})
  .on('error', err => {
    console.log(err) // No such file
    res.end("error!!!") // write error response
  })
  .pipe(res)
)

@bmeck
Member

bmeck commented Oct 6, 2021

Got bit by this today as it killed some connections eagerly rather than allowing a 500 to be returned.

@RafaelGSS
Member

Given the resolution of #34133, is this issue still a thing?

@mcollina
Member

mcollina commented Dec 4, 2021

No, this can be closed.

@mcollina mcollina closed this as completed Dec 4, 2021
@mcollina mcollina reopened this Dec 4, 2021
@mcollina
Member

mcollina commented Dec 4, 2021

Given the resolution of #34133, is this issue still a thing?

I'm sorry, but it does not seem that #34133 is resolved. Why do you think so?

@RafaelGSS
Member

There were some disagreements about this feature and no consensus, which looks like the reason #34133 was closed. Otherwise, I can re-create that PR with some adjustments.

@mcollina
Member

mcollina commented Dec 4, 2021

It seems that PR was closed due to:

Due to lack of sufficient impetus, I will close this PR.

Having a fresh PR would help, I think we can get this landed. Wdyt @ronag?

@ronag
Member

ronag commented Dec 4, 2021

I agree.

@rickyes
Contributor

rickyes commented Dec 5, 2021

Any good ideas? Maybe I can reopen this PR.

@targos targos moved this to Pending Triage in Node.js feature requests Apr 4, 2022
@github-actions
Contributor

github-actions bot commented Jun 3, 2022

There has been no activity on this feature request for 5 months and it is unlikely to be implemented. It will be closed 6 months after the last non-automated comment.

For more information on how the project manages feature requests, please consult the feature request management document.

@github-actions github-actions bot added the stale label Jun 3, 2022
@RafaelGSS
Member

Closed by #41796

@targos targos moved this from Pending Triage to Stale in Node.js feature requests Apr 4, 2023
@Lordfirespeed

Lordfirespeed commented Aug 10, 2024

For others like me looking for a workaround for this:

import { createReadStream } from 'node:fs'
import { Writable } from 'node:stream'
import { pipeline } from 'node:stream/promises'

// filePath and sink are placeholders for your own path and writable stream.
const makeIndestructible = (stream: NodeJS.WritableStream) => {
  return new Writable({ write: stream.write.bind(stream) })
}

const source = createReadStream(filePath)
await pipeline(source, makeIndestructible(sink))

Although the wrapping Writable is destroyed, the underlying sink stream remains open.
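A possible extension (an assumption, not part of the comment above): also forward final() so the underlying sink is ended when the pipeline completes successfully, while an error still only destroys the disposable wrapper:

const { Writable } = require('node:stream')

// Hypothetical variant of makeIndestructible: final() ends the underlying
// sink on normal completion; on error, only this wrapper is destroyed and
// the sink stays open.
const makeIndestructible = (sink) =>
  new Writable({
    write: (chunk, encoding, callback) => sink.write(chunk, encoding, callback),
    final: (callback) => sink.end(callback),
  })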

Lordfirespeed added a commit to OtterJS/otterhttp that referenced this issue Aug 10, 2024