stream.compose(...) doesn't destroy all active composed streams when it is destroyed #51987
Labels: stream (Issues and PRs related to the stream subsystem)
headlessme changed the title from “stream.compose(...) doesn't destroy all active streams when it is destroyed” to “stream.compose(...) doesn't destroy all active composed streams when it is destroyed” on Mar 6, 2024.
sophonieb pushed a commit to sophonieb/node that referenced this issue on Jun 20, 2024:
PR-URL: nodejs#53213 Fixes: nodejs#51987 Reviewed-By: Robert Nagy <ronagy@icloud.com> Reviewed-By: Matteo Collina <matteo.collina@gmail.com> Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
bmeck pushed a commit to bmeck/node that referenced this issue on Jun 22, 2024:
PR-URL: nodejs#53213 Fixes: nodejs#51987 Reviewed-By: Robert Nagy <ronagy@icloud.com> Reviewed-By: Matteo Collina <matteo.collina@gmail.com> Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Version
20.11.0
Platform
Darwin MacBook-Pro-4.local 23.1.0 Darwin Kernel Version 23.1.0: Mon Oct 9 21:27:27 PDT 2023; root:xnu-10002.41.9~6/RELEASE_X86_64 x86_64
Subsystem
stream
What steps will reproduce the bug?
When calling `destroy(err)` on a stream created using `stream.compose()`, some streams that may still be actively processing data do not have `_destroy` called and do not emit an `error` event. Illustrated by the code below.

If there is a stream that slowly processes data as the last entry in `compose(...)`, all of its input has been written (the `finish` event has fired), and `destroy()` is then called on the composed stream, I was expecting the slow processing stream to also be destroyed and emit the error.

Inserting a `PassThrough` stream as the last entry in the compose chain seems to fix the issue (uncomment the lines below in the sample code), but it's unclear to me why that is. Either I've misunderstood something about `compose()`ing streams or there's a bug.
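The original sample is not reproduced in this capture; the following is a minimal sketch of the setup described above. The `SlowProcessor` name comes from the issue text, but the structure, timings, and log messages are assumptions.

```js
const { compose, Transform, PassThrough } = require('node:stream');

// A Transform whose readable side lags behind its writable side:
// each chunk takes a second to process.
class SlowProcessor extends Transform {
  _transform(chunk, encoding, callback) {
    setTimeout(() => callback(null, chunk), 1000);
  }
}

const slow = new SlowProcessor();
slow.on('error', (err) => console.log('SlowProcessor error:', err.message));
slow.on('close', () => console.log('SlowProcessor closed'));

const composed = compose(
  new PassThrough(),
  slow,
  // Uncommenting the next line is the workaround described below:
  // new PassThrough(),
);
composed.on('error', (err) => console.log('composed error:', err.message));
composed.resume(); // keep data flowing out of the composed stream

composed.end('some input');

// Once the writable side has finished but SlowProcessor is still busy,
// destroy the composed stream with an error.
composed.once('finish', () => {
  console.log('composed finish');
  composed.destroy(new Error('boom'));
});
```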
How often does it reproduce? Is there a required condition?
Reliably reproduces using the sample above.
What is the expected behavior? Why is that the expected behavior?
I'd expect any stream that's part of a `compose(...)` chain that is still processing data to emit errors when the outer stream is destroyed. I'd expect output from the above script to look something like:
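The expected output listing from the original report is not included in this capture; with the sketch above, output roughly along these lines would match that expectation (the messages come from the sketch's own listeners, not from Node itself):

```
composed finish
composed error: boom
SlowProcessor error: boom
SlowProcessor closed
```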
What do you see instead?
The `SlowProcessor` stream does not get destroyed even though its `Readable` side has not `end`ed.

When there's an extra `PassThrough` inserted as the last entry in the compose chain, the results look as I'd expect:
Additional information
No response