fix(core): collect all logs from forked processes (#27778)

## Current Behavior

We have a monorepo with ~450 packages and run tasks with 16x parallelism
in CI. We've recently noticed that running tasks with `static` logging
can lead to random tasks losing logs. Even if the task is a simple `echo
SOMETHING`, logs get lost sporadically.

After digging into the task runner, I found that the forked task runner
currently collects logs from `stdout` and `stderr`, then joins those
when the process `exit` event is emitted. The problem appears to stem
from the fact that the `exit` event is _occasionally_ emitted _before_
the `stdout` and `stderr` streams are closed, leading to lost logs.
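
To illustrate the race, here is a minimal sketch (not the Nx runner code); `./some-task.js` stands in for a hypothetical script that prints a burst of output and exits immediately. Node's `close` event exists precisely because `exit` alone does not guarantee the stdio pipes are drained:

```ts
import { fork } from 'child_process';

// './some-task.js' is a hypothetical script that writes output and exits right away
const p = fork('./some-task.js', [], { silent: true });
const chunks: string[] = [];

p.stdout?.on('data', (chunk) => chunks.push(chunk.toString()));
p.stderr?.on('data', (chunk) => chunks.push(chunk.toString()));

p.on('exit', () => {
  // Racy: 'exit' only means the process terminated; its stdio pipes can still
  // deliver buffered 'data' after this callback runs, so this join may be short.
  console.log('on exit:', chunks.join(''));
});

p.on('close', () => {
  // 'close' fires only after the process has exited AND its stdio streams have
  // ended, so by this point every chunk has been observed.
  console.log('on close:', chunks.join(''));
});
```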

## Expected Behavior

_All_ logs should be collected, even if they are emitted after the
process `exit` event.

I've updated the task runner to wait for the `stdout` and `stderr`
streams to end before collecting logs.
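
Conceptually, the runner should only resolve once all three signals have fired: process exit, `stdout` end, and `stderr` end. The diff below does this with explicit flags; the following is an equivalent minimal sketch using `events.once` (the `runAndCollect` helper and its argument are made up for illustration, not part of this PR):

```ts
import { fork } from 'child_process';
import { once } from 'events';

// Sketch only: resolve with the exit code and the full output, but only after
// the process has exited AND both stdio streams have ended.
async function runAndCollect(
  script: string
): Promise<{ code: number; output: string }> {
  const p = fork(script, [], { silent: true });
  const chunks: string[] = [];

  p.stdout?.on('data', (chunk) => chunks.push(chunk.toString()));
  p.stderr?.on('data', (chunk) => chunks.push(chunk.toString()));

  // Waiting on 'exit' alone would race against the still-open pipes.
  const [[code]] = await Promise.all([
    once(p, 'exit'),
    once(p.stdout!, 'end'),
    once(p.stderr!, 'end'),
  ]);

  // For simplicity this ignores signal-based exits, which the real runner maps
  // via signalToCode.
  return { code: code ?? 0, output: chunks.join('') };
}
```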

Note: I didn't see any tests specifically for this file, but the new code should still be exercised by any existing happy-path test that runs forked tasks with static logging.
dgreif authored Dec 16, 2024
1 parent 99a0e7c commit c3709b2
Showing 1 changed file with 37 additions and 13 deletions.
packages/nx/src/tasks-runner/forked-process-task-runner.ts

```diff
@@ -338,29 +338,53 @@ export class ForkedProcessTaskRunner {
         }
 
         let outWithErr = [];
+        let exitCode;
+        let stdoutHasEnded = false;
+        let stderrHasEnded = false;
+        let processHasExited = false;
+
+        const handleProcessEnd = () => {
+          // ensure process has exited and both stdout and stderr have ended before we pass along the logs
+          // if we only wait for the process to exit, we might miss some logs as stdout and stderr might still be streaming
+          if (stdoutHasEnded && stderrHasEnded && processHasExited) {
+            // we didn't print any output as we were running the command
+            // print all the collected output
+            const terminalOutput = outWithErr.join('');
+            const code = exitCode;
+
+            if (!streamOutput) {
+              this.options.lifeCycle.printTaskTerminalOutput(
+                task,
+                code === 0 ? 'success' : 'failure',
+                terminalOutput
+              );
+            }
+            this.writeTerminalOutput(temporaryOutputPath, terminalOutput);
+            res({ code, terminalOutput });
+          }
+        };
+
         p.stdout.on('data', (chunk) => {
           outWithErr.push(chunk.toString());
         });
+        p.stdout.on('end', () => {
+          stdoutHasEnded = true;
+          handleProcessEnd();
+        });
         p.stderr.on('data', (chunk) => {
           outWithErr.push(chunk.toString());
         });
+        p.stderr.on('end', () => {
+          stderrHasEnded = true;
+          handleProcessEnd();
+        });
 
         p.on('exit', (code, signal) => {
           this.processes.delete(p);
           if (code === null) code = signalToCode(signal);
-          // we didn't print any output as we were running the command
-          // print all the collected output
-          const terminalOutput = outWithErr.join('');
-
-          if (!streamOutput) {
-            this.options.lifeCycle.printTaskTerminalOutput(
-              task,
-              code === 0 ? 'success' : 'failure',
-              terminalOutput
-            );
-          }
-          this.writeTerminalOutput(temporaryOutputPath, terminalOutput);
-          res({ code, terminalOutput });
+          exitCode = code;
+          processHasExited = true;
+          handleProcessEnd();
         });
       } catch (e) {
         console.error(e);
```
