When having a single transport in pipelining, the worker exits in the middle #1774

Closed
rluvaton opened this issue Aug 7, 2023 · 17 comments

@rluvaton (Contributor) commented Aug 7, 2023

I know pipelining is intended to be used with multiple transports, but in our tests we sometimes don't have a real destination transport and only have our transformer as the single transport.

Reproduction:

Given this code:

const pino = require("pino");
const { setTimeout } = require("timers/promises");

const log = pino({
  level: "info",
  transport: {
    pipeline: [
      {
        target: "./empty",
      },
    ],
  },
});

async function run() {
  while (true) {
    log.info("Hello World!");
    await setTimeout(1);
  }
}

run();

and this transport:

// empty.js
const build = require("pino-abstract-transport");

module.exports = {
    default: async () =>
        build(async function* (source) {
                for await (const logEntry of source) {
                    yield ''
                }
            },
            {
                // This is needed to be able to pipeline transports.
                enablePipelining: true,
            }),
};

I get:

node:events:491
      throw er; // Unhandled 'error' event
      ^

Error: the worker has exited
    at ThreadStream.write (~/dev/tmp/pino-bug/node_modules/thread-stream/index.js:229:19)
    at Pino.write (~/dev/tmp/pino-bug/node_modules/pino/lib/proto.js:217:10)
    at Pino.LOG [as info] (~/dev/tmp/pino-bug/node_modules/pino/lib/tools.js:71:21)
    at run (~/dev/tmp/pino-bug/index.js:20:9)
Emitted 'error' event on ThreadStream instance at:
    at Immediate.<anonymous> (~/dev/tmp/pino-bug/node_modules/thread-stream/index.js:360:12)
    at process.processImmediate (node:internal/timers:476:21)

Node.js v18.16.0

The workaround is to add another proxy transport...
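
For reference, a minimal sketch of that workaround, assuming the built-in pino/file target is an acceptable final destination (destination: 1 means stdout):

const pino = require("pino");

const log = pino({
  level: "info",
  transport: {
    pipeline: [
      // the transformer transport from the reproduction above
      { target: "./empty" },
      // extra terminal transport so the pipeline has a real destination
      { target: "pino/file", options: { destination: 1 } },
    ],
  },
});

log.info("Hello World!");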

@mcollina (Member) commented Aug 8, 2023

Why are you using a transport in tests to do nothing? You can just set the log level extremely high and nothing will be logged.
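
One way to do that, as a minimal sketch ("silent" is pino's built-in level that disables all output):

const pino = require("pino");

// no transport at all in tests; the "silent" level disables every log call
const log = pino({ level: "silent" });

log.info("this is never written anywhere");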

@rluvaton (Contributor, Author) commented Aug 8, 2023

In our code we have a transport that modifies the payload (in this example I made it do nothing for the sake of the reproduction), and depending on configuration we add more transports (console or UDP).

In our tests we disable both console and UDP, so we are left with only that first transport.

@mcollina (Member) commented Aug 8, 2023

I'm not sure what the fix for this would be. You are creating a transport that yields data with no actual destination; what do you think pino should do in this case?

@rluvaton (Contributor, Author) commented Aug 8, 2023

I think it should either warn about this or drop the data. Why does it work with 2 targets that are the same?

FYI, even when we don't yield anything, the result is the same.

@mcollina (Member) commented Aug 8, 2023

I think the fix here is a better error message. Pipelining should require at least 2 streams. Take a look at this:

pipeline(stream, ...streams, function (err) {

With a single target, the pipeline function would be called with only one stream, erroring.
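
A minimal sketch of what such a guard could look like (the function and variable names here are illustrative, not pino's actual internals):

const { pipeline, PassThrough } = require('stream')

// hypothetical validation before wiring up the pipeline: with fewer than
// two streams there is nothing to pipe into, so fail early with a clear
// message instead of letting the worker crash later
function buildPipeline (streams) {
  if (streams.length < 2) {
    throw new Error('a pipeline requires at least two transports')
  }
  const head = new PassThrough()
  pipeline(head, ...streams, function (err) {
    if (err) {
      head.emit('error', err)
    }
  })
  return head
}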

Would you like to send a PR?

@rluvaton (Contributor, Author) commented Aug 8, 2023

Should it also log/throw an error when the only stream is a Writable (i.e. it doesn't emit data to the next stream)?

@mcollina (Member) commented Aug 8, 2023

Yes, it should throw an error in this case.

@rluvaton (Contributor, Author) commented Aug 8, 2023

Wouldn't this be a breaking change?

@mcollina (Member) commented Aug 8, 2023

It does not seem like the code is working at all right now, so what would it break?
What we need is a better error message for this situation.

@rluvaton (Contributor, Author) commented Aug 8, 2023

I just checked, and this would be a breaking change: when the transport consumes the data, everything works.

Example:

const pino = require('pino');
const fs = require('fs');
fs.writeFileSync('./to-file-transport.js', `
'use strict'

const fs = require('fs')
const { once } = require('events')

async function run (opts) {
  if (!opts.destination) throw new Error('kaboom')
  const stream = fs.createWriteStream(opts.destination)
  await once(stream, 'open')
  return stream
}

module.exports = run
`);

let outputPath = './file-output.js'
fs.writeFileSync(outputPath, '')

const transport = pino.transport({
  pipeline: [{
    target: './to-file-transport.js',
    options: { destination: outputPath }
  }]
})
const instance = pino(transport)

setInterval(() => {
  const output = (fs.readFileSync(outputPath)).toString();
  console.log(output)
}, 1000);

async function run() {
  while(true) {
    instance.info({ hello: 'world' }, 'hello')
    await new Promise(resolve => setTimeout(resolve, 1))
  }
}
run();

Also, when not using pino-abstract-transport, it doesn't happen if the transport doesn't yield data.

@mcollina (Member) commented Aug 8, 2023

Sure, but this transport results in a Writable, not a Duplex: https://github.com/pinojs/pino-abstract-transport/blob/b6d7973a80b115b9bad3098efb669279ff9667c9/index.js#L69-L7. If the last stream is a Readable, the pipeline will not flow, likely causing a crash somewhere. I think just detecting that the last stream is a Readable and throwing an easy-to-understand error is fine.

The question is finding out exactly why the worker is crashing and then intercepting this condition early.
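
A rough sketch of that check, assuming access to the array of built transport streams (the function name is illustrative):

// hypothetical guard: if the final stream in the pipeline is still readable,
// nothing will ever consume its output, so refuse to start instead of
// letting the worker crash later
function assertLastStreamIsDestination (streams) {
  const last = streams[streams.length - 1]
  if (last.readable) {
    throw new Error(
      'the last stream of a pipeline must be a Writable destination, ' +
      'but a Readable/Transform was provided'
    )
  }
}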

@rluvaton (Contributor, Author) commented Aug 8, 2023

This transport would now fail where it didn't before (as it returns a Transform and not a Writable):
https://github.com/pinojs/pino/blob/8dceecdeb6eb4e96b348277bf908496b7880917a/test/fixtures/to-file-transport-with-transform.js

'use strict'

const fs = require('fs')
const { once } = require('events')
const { Transform } = require('stream')

async function run (opts) {
  if (!opts.destination) throw new Error('kaboom')
  const stream = fs.createWriteStream(opts.destination)
  await once(stream, 'open')
  const t = new Transform({
    transform (chunk, enc, cb) {
      setImmediate(cb, null, chunk.toString().toUpperCase())
    }
  })
  t.pipe(stream)
  return t
}

module.exports = run

@rluvaton (Contributor, Author) commented Aug 8, 2023

And it would break pino-pretty, as it returns a Transform:

(use this with the PR and see that it fails)

const pino = require("pino");
const { setTimeout } = require("timers/promises");

const log = pino({
  level: "info",
  transport: {
    pipeline: [
      {
        target: "pino-pretty",
        options: { destination: 1 } // use 2 for stderr
      },
    ],
  },
});

async function run() {
  let i = 0;
  while (true) {
    log.info(`Hello World! ${i++}`);
    await setTimeout(1);
  }
}

run();

@mcollina (Member) commented Aug 8, 2023

Then I don't understand why your example at the top fails while pino-pretty does not have this problem.

@rluvaton (Contributor, Author) commented Aug 8, 2023

That's the question. Yesterday I debugged it for hours, and maybe it's related to the fact that needDrain is true and 'drain' is never emitted (I checked by adding an interval to keep the process alive, and it was never called).

@rluvaton (Contributor, Author)

Fixed by emitting a warning in thread-stream
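
The shape of that resolution, roughly (illustrative only, not the actual thread-stream patch, and the message text is invented): instead of throwing an unhandled 'error', surface the condition through Node's standard warning mechanism.

// hypothetical: warn rather than crash when the pipelined worker
// cannot make progress (message text is made up for illustration)
process.emitWarning(
  'pipelining a single transport may lose data; add a destination transport',
  'ThreadStreamWarning'
)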

@github-actions (bot)

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

github-actions bot locked as resolved and limited conversation to collaborators on Sep 11, 2023