stream.pipeline swallowing errors when read stream is empty #24517
@nodejs/streams
pipeline is behaving as expected. The whole pipeline is torn down before the file descriptor is actually opened, as the readable stream ends.
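A minimal sketch of that ordering, assembled from the snippets in this thread (the extra logging labels are illustrative additions): the empty readable ends at once, so the pipeline callback runs before the open of '/' has a chance to fail.

```js
const {Readable, pipeline} = require('stream');
const {createWriteStream} = require('fs');

const readStream = new Readable({
    read() {}
});
readStream.push(null); // empty readable: it ends immediately

const writeStream = createWriteStream('/'); // opening '/' for writing fails

writeStream.on('error', (e) => {
    // fires only after the pipeline callback has already run
    console.error('late write error:', e.message);
});

pipeline(readStream, writeStream, (e) => {
    // logs `undefined`: the pipeline is torn down before the
    // file descriptor open fails, so no error reaches it
    console.log('pipeline result:', e);
});
```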
The thing is, the callback is called once, when the stream is read, and the write error goes nowhere. In pipe-io everything works as expected, but I want to use pipeline.
It seems you are suggesting to call the callback more than once. The goal of that callback is to be definitive about when the actual pipeline is torn down. It is still not clear to me what the expected behavior is. Can you make an example?
The callback should definitely be called once, when all streams are closed or an error occurs in one of them, but not halfway through.

There is a code example above. The expected behavior is to surface the write error. I'm working on a file manager for the web, which uses restafary for file operations. As I said, in pipe-io everything works as expected.
It is not possible to ensure in a generic way that all streams have been torn down. Specifically, there is no guarantee that a 'close' event will be emitted, nor is there a way to pass a completion callback when a stream is destroyed. Looking at the pipe-io code, it detects certain special cases (including some built-in streams). I'm happy to review a PR that implements such a feature, but I do not see how it would be possible in a generic way.
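For instance, a hand-rolled streams1-style writable (a sketch; the names are illustrative) is perfectly usable with pipe, yet gives pipeline nothing generic to wait on:

```js
const {Stream} = require('stream');

// A streams1-style writable: .pipe() accepts it, but nothing here
// guarantees a 'close' event, and there is no destroy() that takes
// a completion callback.
const legacy = new Stream();
legacy.writable = true;
legacy.write = (chunk) => {
    process.stdout.write(chunk);
    return true;
};
legacy.end = () => {
    legacy.emit('finish'); // 'close' may simply never be emitted
};
```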
That snippet shows the current behavior and what happens with pipe. I would like to understand what change you need in pipeline.
I think the safest approach is to wait for an 'open' event.
I expect to receive the first error in the callback when one of the streams emits an error, or nothing when everything finishes without errors.
Is it better to use code like this in userland?

```js
const {Readable, pipeline} = require('stream');
const {createWriteStream} = require('fs');

const readStream = new Readable({
    read() {}
});
readStream.push(null);

const writeStream = createWriteStream('/');

// 'open' receives the file descriptor, not an error;
// failures are delivered via the 'error' event instead
writeStream.on('error', (e) => {
    console.error(e);
});

writeStream.on('open', () => {
    pipeline(readStream, writeStream, (e) => {
        console.log(e);
    });
});
```
That's what pipe-io does.
How could we detect in a generic way if a stream needs to wait for an 'open' event?
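One illustrative heuristic for that detection, covering fs streams only (an assumption, not something pipeline or pipe-io is known to ship): fs streams keep fd set to null until 'open' fires.

```js
const fs = require('fs');

// Illustrative heuristic only: fs streams keep `fd` at null until
// the 'open' event has fired, so a non-numeric fd marks a stream
// that still has to open its file descriptor.
function needsOpen(stream) {
    const isFsStream = stream instanceof fs.ReadStream ||
        stream instanceof fs.WriteStream;
    return isFsStream && typeof stream.fd !== 'number';
}
```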
Actually, there is a specific-case check for request in pipeline :). The thing is, the special cases are mostly built-in streams. If that is also a no-go solution, I see no reason why a built-in node.js method can't do the same. I agree with you that in the long run checking every case is a bad, non-generic solution. But asking a developer of userland modules to remember every strange case like this one:

```js
writeStream.on('error', (e) => {
    console.error(e);
});

writeStream.on('open', () => {
    pipeline(readStream, writeStream, (e) => {
        console.log(e);
    });
});
```

is not a solution at all either. For now pipe-io does its thing and doesn't swallow errors. Maybe in the future some better solution will come up and we will be able to just use pipeline.
@nodejs/stream ... it's not clear if this is actionable or not.
Overall I don't think this is fully fixable if we would like to retain compatibility with older streams. We need to wait for events, such as 'close', that are not guaranteed to be emitted. None of this is possible with older streams.
That’s my read as well, closing.
Version: 10.12.0, 11.2.0
Platform: Linux cloudcmd.io 4.4.0-122-generic #146-Ubuntu SMP Mon Apr 23 15:34:04 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
Subsystem: Stream

pipeline has inconsistent behavior with pipe: it swallows writable errors when the readable stream is empty. For example, such code will log undefined:
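A reconstruction of the snippet, based on the workaround code quoted earlier in the thread:

```js
const {Readable, pipeline} = require('stream');
const {createWriteStream} = require('fs');

const readStream = new Readable({
    read() {}
});
readStream.push(null);

const writeStream = createWriteStream('/');

pipeline(readStream, writeStream, (e) => {
    console.log(e); // undefined: the write error is swallowed
});
```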
But if we use pipe with equivalent code, we will get such an error:
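A sketch of the pipe variant under the same setup; EISDIR is what opening '/' for writing yields on Linux:

```js
readStream.pipe(writeStream);

writeStream.on('error', (e) => {
    console.error(e);
    // Error: EISDIR: illegal operation on a directory, open '/'
});
```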
Would be great if pipeline had the same behavior pipe has :).