Why does node/io.js hang when printing to the console inside a loop? #1741
Comments
I can think of a couple of reasons. The first snippet creates a million strings (i.e. does a million heap allocations) whereas the second one does not; you can verify that by watching the garbage collector. What's more, the first argument to `console.log()` is treated as a format string, so it goes through an extra formatting step.

Apropos memory usage: each write allocates a request object that keeps the written string alive until libuv completes the write. It's not until the next tick of the event loop that the memory of that request is freed; that includes the actual string that is written. In both snippets, one million requests are in flight, but in the first one they're much bigger.
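As a rough illustration of that last point (a sketch, not code from the thread; the string and iteration count are placeholders), you can watch resident memory balloon while a single tick queues a million write requests:

```js
// Minimal sketch: all of these writes are queued in one tick, so every string
// and write request is still alive when the loop ends.
function mib(bytes) {
  return (bytes / 1024 / 1024).toFixed(1) + ' MiB';
}

console.error('before loop:', mib(process.memoryUsage().rss));

for (var i = 0; i < 1000000; i++) {
  process.stdout.write('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n');
}

// Nothing has been flushed yet; the queued output dominates memory usage here.
console.error('after loop:', mib(process.memoryUsage().rss));
```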
Actually,

```js
var x = process.hrtime();
for (var i = 0; i < 1000000; i++) {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
}
console.log(process.hrtime(x));
```

is almost equally slow.
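A rough way to see the formatting overhead mentioned above (a sketch with assumed counts, not code from the thread) is to time `console.log()` against a plain `process.stdout.write()` of the same line:

```js
// console.log() runs its arguments through util.format() and appends '\n';
// process.stdout.write() sends the string as-is, skipping that extra work.
var line = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890';
var lineNl = line + '\n';

var t1 = process.hrtime();
for (var i = 0; i < 1000000; i++) console.log(line);
console.error('console.log:         ', process.hrtime(t1));

var t2 = process.hrtime();
for (var i = 0; i < 1000000; i++) process.stdout.write(lineNl);
console.error('process.stdout.write:', process.hrtime(t2));
```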
See #1749. It does not fix this issue, but it makes things slightly better.
Simple answer: the code in question blocks the process for a bit while the loop is running~~, and libuv doesn't print it until the next tick~~.
@Fishrock123 I would say that (almost) everything is printed in the same tick. The cause of the lag is gc going crazy at some point and hanging everything for a second or so. Part of this single gc loop:
@Fishrock123 FYI, I can reproduce this in async code:

```js
function f(max) {
  for (var i = 0; i < max; i++) {
    console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
  }
}

function g() {
  f(100);
  process.nextTick(g);
}
g();
```

Run with
Btw, I saw misconceptions about sync code.
Another example:

```js
function g() {
  process.stdout.write('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n');
  process.nextTick(g);
}
g();
```

It's far worse in this case. When gc starts, it never finishes (it runs over and over again with no actual stdout output), memory usage quickly goes to 1.4 GiB, and the process quits with an «Allocation failed - process out of memory» error. It looks as if the garbage collector blocks output from happening, but the js code still runs.
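One way to see the unbounded queueing directly (a sketch under assumptions, not code from the thread) is to check the return value of `process.stdout.write()`, which reports backpressure once the internal buffer passes its high-water mark:

```js
// Same nextTick loop, but note when stdout first signals backpressure.
// The loop ignores the signal, so the buffer just keeps growing.
var warned = false;
function g() {
  var ok = process.stdout.write('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n');
  if (!ok && !warned) {
    warned = true;
    console.error('stdout is over its high-water mark; writes are only being buffered now');
  }
  process.nextTick(g);
}
g();
```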
If you don't like recursive `process.nextTick`:

```js
var i = 0;
function g() {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
  i++;
  if (i % 1000000 === 0) {
    setTimeout(g, 1);
  } else {
    process.nextTick(g);
  }
}
g();
```

It takes more time, but still crashes. Another example (setTimeout for 0 ms every 1e5 iterations):

```js
var i = 0;
function g() {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
  i++;
  if (i % 100000 === 0) {
    setTimeout(g, 0);
  } else {
    process.nextTick(g);
  }
}
g();
```

Another one:

```js
var s = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890';
function g() {
  for (var i = 0; i < 1e5; i++) {
    console.log(s);
  }
  setTimeout(g, 0);
}
g();
```

The numbers might require some tuning if you have a system that is faster than mine.
Asked on IRC by @Fishrock123:

Doesn't change anything.
Btw, there is no issue when stdout is piped somewhere (to another process, for example).
One approach to «fixing» this would be to actually block output if the buffer gets too large, until it is flushed. Almost as it is done in the case when output is piped, but only when the buffer gets overfilled and only until it drains back to a reasonable size. Btw, that could actually make output faster, because with the current implementation the garbage collector can go crazy (have you noticed the gc runs that took 254 ms above?) when the output buffer grows too big. A side question: why is output blocking when it is piped? Could it be for the same reason (everything going boom when it is processed too slowly)? Or are there other reasons?
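A userland version of that idea, as a sketch (the names and count are made up, and this is not the fix the thread settled on): pause writing whenever `process.stdout.write()` returns `false` and continue on `'drain'`, so the buffer stays bounded:

```js
var line = 'abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890\n';
var remaining = 1000000; // illustrative count

function writeSome() {
  while (remaining > 0) {
    remaining--;
    if (!process.stdout.write(line)) {
      // Buffer is over the high-water mark: stop until it has been flushed.
      process.stdout.once('drain', writeSome);
      return;
    }
  }
}
writeSome();
```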
Not unless you're on win32. See #1771; could you try that patch? (It might fix this?)
@Fishrock123, I was talking about this:
https://github.com/nodejs/io.js/blob/v2.1.0/lib/fs.js#L1865
`console.log` does some extra work to construct the output, and because it's already a string, we can save some memory by using `process.stdout.write`. See this issue for more information: nodejs/node#1741 (comment) This is an attempt to remedy the OOM issues when rendering all the issues. Increasing the batch size will also help by reducing the number of GCs.
Btw, placing `process.stdout._handle.setBlocking(true);` before the loop completely fixes the problem.
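For reference, a guarded version of that workaround (a sketch; `_handle` and `setBlocking` are undocumented internals, so they may be absent depending on how stdout is set up or which Node version you run):

```js
// Force synchronous stdout writes before entering the hot loop, if the
// internal handle exposes setBlocking on this platform/version.
if (process.stdout._handle &&
    typeof process.stdout._handle.setBlocking === 'function') {
  process.stdout._handle.setBlocking(true);
}

for (var i = 0; i < 1000000; i++) {
  console.log('abcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabcabc1234567890');
}
```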
@ChALkeR @Fishrock123 ... does this one need to stay open?
@jasnell To me, this still looks like an issue that has to be solved. Very low priority, though.
console.log's first argument is a format string according to nodejs/node#1741 (comment)
see nodejs/node#1741 (comment) for reasoning
It shouldn't be? @ChALkeR Is this a bug in libuv or is it a v8 memory issue?
Should this remain open? I am not sure whether the reported problem is still relevant.
@gireeshpunathil This and #6379 are basically the same issue. Perhaps we can keep only one of those open; I would prefer the latter.
Okay, let's close this one.
This is not reliable anymore after AWS changes. See 'nodejs/node#1741 (comment)'
1. Compute deathSignal correctly 2. Redirect output to tmp file. Very long piped strings sometimes get mishandled.
- remove usage of s3 bucket
- update required file list, we don't build for arm, mas & linux 32 bit
- always download file in quiet mode: node crashes with out of heap memory error if we log really fast to stdout nodejs/node#1741
For anyone running into this issue, and having trouble getting the magic function (`process.stdout._handle.setBlocking`) to work:
Sorry if it is a dumb question/issue, but why does this code:

hang node for a few seconds (at around i = 1500) before resuming again? During the hang the amount of memory used by the node process increases very fast, and the CPU usage is very high during that time.

On the other hand, if I write:

then the node process does not hang at all.

The reason I am asking is that I have a piece of code which involves printing inside a loop, and it also hangs, but for much longer (like 30-60 seconds) before resuming again. The code prints the paths of all files in a directory (walking recursively): https://gist.github.com/pgkos/f0a650daf56aa49899e9

The issue occurs on both io.js v2.0.2 and node.js v0.12.2, on Ubuntu 64-bit.