This repository has been archived by the owner on Jun 9, 2024. It is now read-only.

feat!: migrate to fastify org #1

Merged
merged 23 commits into from
Sep 29, 2022

Conversation

Eomm
Member

@Eomm Eomm commented Sep 25, 2022

Still need tests to improve coverage.

feel free to commit on this branch 👍🏽

@Eomm Eomm requested review from Uzlopak and zekth September 25, 2022 10:17
@Uzlopak
Contributor

Uzlopak commented Sep 26, 2022

@Eomm
Can you add this package to the fastify/plugins team? I have no write permissions.

@Eomm
Member Author

Eomm commented Sep 26, 2022

fixed 👍🏽

@Uzlopak
Contributor

Uzlopak commented Sep 26, 2022

I thought about the plugin.

First of all, it is currently called reply.eh for "early hints". We should probably rename that to make the attribute self-explanatory.

Second, I think the implementation should be configured through routeOptions instead of setting the earlyHints in the handler. Maybe add an onRouteHandler which makes it possible to set earlyHints per route.

This brings us to the third issue: formatEntry is currently called on every request, so if we pass an EarlyHintItem it gets serialized every time. If we serialized only once, we could avoid a lot of redundant computation. Setting earlyHints as a routeOption would make it possible to serialize the earlyHints once.

@zekth
@Eomm
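The serialize-once idea can be sketched roughly like this. Everything below is illustrative, not the plugin's actual API: formatEntry is a simplified stand-in, and precomputeEarlyHints is a hypothetical name.

```javascript
// Sketch of the serialize-once idea. formatEntry below is a simplified
// stand-in for the plugin's formatter; all names are illustrative only.
const CRLF = '\r\n'

function formatEntry (entry) {
  // accept either a preformatted header line or a { href, rel } object
  return typeof entry === 'string' ? entry : `Link: <${entry.href}>; rel=${entry.rel}`
}

// run once, e.g. at route registration time, instead of on every request
function precomputeEarlyHints (hints) {
  let message = `HTTP/1.1 103 Early Hints${CRLF}`
  for (const entry of hints) {
    message += `${formatEntry(entry)}${CRLF}`
  }
  return `${message}${CRLF}`
}

const serialized = precomputeEarlyHints([{ href: '/style.css', rel: 'preload' }])
// per request, only the cheap socket write would remain, e.g.:
// reply.raw.socket.write(serialized, done)
```

With the string precomputed per route, the hot path no longer pays the formatEntry cost on every request.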

@climba03003
Member

climba03003 commented Sep 27, 2022

  1. We should update the implementation to support headers.
    See http: add writeEarlyHints function to ServerResponse nodejs/node#44180 (comment)

  2. Use decorateReply and use the this pointer to reach the reply. There is no need for a wrapper function and an onRequest hook.

Second, I think the implementation should be configured through routeOptions instead of setting the earlyHints in the handler.

The current implementation is perfectly fine. Route options would mean losing the flexibility to customize the hints and to call early hints multiple times.
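The decorateReply-plus-this idea can be sketched without Fastify at all: decorateReply essentially puts the function on the Reply prototype, so `this` is the reply instance and no per-request wrapper or onRequest hook is needed. Everything below is a simplified stand-in, not the plugin's code.

```javascript
// Simplified stand-in for fastify.decorateReply: attach the method to the
// Reply prototype so `this` is the reply instance on every request.
const CRLF = '\r\n'

class Reply {
  constructor (raw) {
    this.raw = raw
  }
}

function decorateReply (name, fn) {
  Reply.prototype[name] = fn
}

decorateReply('writeEarlyHints', function (hints) {
  // `this` points at the Reply, so no closure over a `reply` variable is needed
  const message = `HTTP/1.1 103 Early Hints${CRLF}` +
    hints.map(h => `Link: ${h}${CRLF}`).join('') + CRLF
  this.raw.written.push(message) // stand-in for writing to the raw socket
  return message
})

const reply = new Reply({ written: [] })
const out = reply.writeEarlyHints(['</style.css>; rel=preload; as=style'])
```

The prototype decoration happens once at plugin registration, whereas the onRequest-hook approach builds a fresh closure per request.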

@Eomm Eomm marked this pull request as ready for review September 27, 2022 06:32
@Eomm
Member Author

Eomm commented Sep 27, 2022

I would track all the new features (such as headers) in dedicated issues:

This PR accomplishes its target, and we can focus on which issues must be done before releasing this version.

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

Well, I think I did my part then :). I trust my own code changes, so maybe somebody else should review it?!

If we use the writeEarlyHints function, we need combined test coverage across different test runs, as we support Node 14 but early hints will most likely only be backported to the latest Node 16.

The default workflow does not support combined test coverage.

@@ -76,11 +29,7 @@ function fastifyEH (fastify, opts, next) {
},
add: function (content) {
const p = new Promise((resolve) => {
if (reply.raw.socket) {
Member

Help me remember: this is a Fastify 3 way of doing it, right?

Contributor

I reverted this. Description below.

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

Added a warn option; the warning is disabled by default. This significantly improves the performance of formatEntry.

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

I reverted the part regarding reply.raw.socket.write.

Unit tests are green if we only use reply.raw.socket.write. But if I use the example (which I provided) and autocannon it, it gets internal server errors: under pressure the socket is sometimes null.

So I reverted that change, as it seems to fix those 500 errors.

But it still smells.

uzlopak@Battlestation:~/workspace/fastify-early-hints$ autocannon http://localhost:3000 --renderStatusCodes
Running 10s test @ http://localhost:3000
10 connections


┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬───────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max   │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼───────┤
│ Latency │ 0 ms │ 0 ms │ 0 ms  │ 0 ms │ 0.21 ms │ 2.98 ms │ 49 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴───────┘
┌───────────┬─────────┬─────────┬───────┬─────────┬─────────┬─────────┬─────────┐
│ Stat      │ 1%      │ 2.5%    │ 50%   │ 97.5%   │ Avg     │ Stdev   │ Min     │
├───────────┼─────────┼─────────┼───────┼─────────┼─────────┼─────────┼─────────┤
│ Req/Sec   │ 5243    │ 5243    │ 32559 │ 44063   │ 31407.4 │ 10341.5 │ 5242    │
├───────────┼─────────┼─────────┼───────┼─────────┼─────────┼─────────┼─────────┤
│ Bytes/Sec │ 2.04 MB │ 2.04 MB │ 13 MB │ 17.7 MB │ 12.6 MB │ 4.18 MB │ 2.04 MB │
└───────────┴─────────┴─────────┴───────┴─────────┴─────────┴─────────┴─────────┘
┌──────┬────────┐
│ Code │ Count  │
├──────┼────────┤
│ 103  │ 95958  │
├──────┼────────┤
│ 200  │ 218097 │
└──────┴────────┘

Req/Bytes counts sampled once per second.
# of samples: 10

218097 2xx responses, 95958 non 2xx responses
314k requests in 10.01s, 126 MB read

Imho: the number of 103 codes should be the same as the number of 200 codes. But they aren't.

@zekth
Member

zekth commented Sep 27, 2022

Are we sure autocannon is handling the 100–199 codes properly? Can you write a log line for the case where the socket cannot be written to, to confirm the issue is with autocannon?

})
fastify.listen({ port: 3001 }, (err) => {
t.error(err)
exec('curl -D - http://localhost:3001').then(({ stdout, stderr }) => {

Member

I don't think client-request can handle 103.
103 is a special case: it involves at least two status codes for a single request.

Member

Look at the test closely: it adds them into an info array.

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

I think we should change the logic to

  const serialize = function (message) {
    return `HTTP/1.1 103 Early Hints${CRLF}${message}${CRLF}`
  }

  function earlyHints (reply) {
    let message = ''

    return {
      close: function (done) {
        if (message) {
          if (reply.raw.socket) {
            reply.raw.socket.write(serialize(message), 'utf-8', done)
          } else {
            reply.raw.write(serialize(message), 'utf-8', done)
          }
        } else {
          done()
        }
      },
      add: function (content) {
        for (let i = 0; i < content.length; ++i) {
          message += `${formatEntry(content[i], formatEntryOpts)}${CRLF}`
        }
      }
    }
  }

So we would send the 103 only once, and not on every .add call.

See https://chromium.googlesource.com/chromium/src/+/master/docs/early-hints.md#what_s-not-supported

Chrome ignores the second and following Early Hints responses. Chrome only handles the first Early Hints response so that Chrome doesn't apply inconsistent security policies (e.g. Content-Security-Policy).

@climba03003
Member

climba03003 commented Sep 27, 2022

we would send only once 103 and not every time we do a .add-call.

We should follow the RFC instead of the browser behavior.
It is up to the client to decide how to handle early hints, but as a server we should not limit how they are sent.
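What the RFC permits on the wire can be illustrated with raw bytes (the resources below are made up for illustration): any number of interim 103 responses may precede the final response on the same connection.

```javascript
// Illustrative RFC 8297 exchange: two interim 103 responses followed by the
// final 200 on one connection. A client may use all, some, or none of the
// hints (Chrome, per the note quoted above, only uses the first 103).
const CRLF = '\r\n'

const firstHints = `HTTP/1.1 103 Early Hints${CRLF}` +
  `Link: </style.css>; rel=preload; as=style${CRLF}${CRLF}`

const secondHints = `HTTP/1.1 103 Early Hints${CRLF}` +
  `Link: </app.js>; rel=preload; as=script${CRLF}${CRLF}`

const finalResponse = `HTTP/1.1 200 OK${CRLF}` +
  `Content-Length: 2${CRLF}${CRLF}OK`

// everything before the final status line is advisory for the client
const wire = firstHints + secondHints + finalResponse
```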

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

I patched it to give me some statistics, and the numbers do not make sense. Maybe autocannon sends more requests and kills them after the time is over, but I would still expect some consistency: there should be at least as many 103 status codes as counterSocket.

┌──────┬───────┐
│ Code │ Count │
├──────┼───────┤
│ 103  │ 12992 │
├──────┼───────┤
│ 200  │ 30316 │
└──────┴───────┘

{ counterReply: 26123, counterSocket: 17195 }

To reproduce:

autocannon http://localhost:3000 --renderStatusCodes

the patched code:

'use strict'
const fp = require('fastify-plugin')
const formatEntry = require('./lib/formatEntry')
const CRLF = '\r\n'

function fastifyEarlyHints (fastify, opts, next) {
  if (fastify.initialConfig.http2 === true) {
    return next(new Error('Early Hints cannot be used with a HTTP2 server.'))
  }

  const formatEntryOpts = {
    warn: opts.warn
  }
  // initialize counters
  let counterSocket = 0
  let counterReply = 0
  function earlyHints (reply) {
    const promiseBuffer = []
    const serialize = function (c) {
      let message = `HTTP/1.1 103 Early Hints${CRLF}`
      for (let i = 0; i < c.length; i++) {
        message += `${formatEntry(c[i], formatEntryOpts)}${CRLF}`
      }
      return `${message}${CRLF}`
    }

    return {
      close: async function () {
        if (promiseBuffer.length) {
          await Promise.all(promiseBuffer)
        }
      },
      add: function (content) {
        const p = new Promise(resolve => {
          if (reply.raw.socket) {
            counterSocket++ // increment counter
            reply.raw.socket.write(serialize(content), 'utf-8', resolve)
          } else {
            counterReply++ // increment counter
            reply.raw.write(serialize(content), 'utf-8', resolve)
          }
        })
        promiseBuffer.push(p)
        return p
      }
    }
  }

  function onRequest (request, reply, done) {
    reply.eh = earlyHints(reply)
    done()
  }

  async function onSend (request, reply, payload) {
    await reply.eh.close()
  }

  fastify.addHook('onRequest', onRequest)
  fastify.addHook('onSend', onSend)

  next()
  process.on('SIGINT', () => {
    console.log({ counterReply, counterSocket })
    process.exit(0)
  })
}

module.exports = fp(fastifyEarlyHints, {
  fastify: '4.x',
  name: '@fastify/early-hints'
})

So: run the example, run autocannon, press Ctrl+C in the example, and check the console.log.

@Uzlopak
Contributor

Uzlopak commented Sep 27, 2022

@mcollina
Do you have any idea what the issue could be?

@mcollina
Member

mcollina commented Sep 28, 2022

I patched it to give me some statistics. And it does not make sense. Maybe autocannon sends more requests and kills them after the time is over. But I would expect some consistency.

If you want to send a fixed number of requests, use the -a flag in autocannon; otherwise the number of requests is inconsistent, as it simply stops after T seconds.

@Uzlopak
Contributor

Uzlopak commented Sep 28, 2022

I added a stress test with autocannon. So now the unit tests are green, because we cover everything.

@mcollina
It seems that autocannon counts the 103 status codes in the calculation of the total amount of requests.

So for 10000 requests with autocannon:

┌──────┬───────┐
│ Code │ Count │
├──────┼───────┤
│ 103  │ 3051  │
├──────┼───────┤
│ 200  │ 6949  │
└──────┴───────┘

3051+5949 = 10000

@Uzlopak
Contributor

Uzlopak commented Sep 28, 2022

I personally would expect 5000 of the 103s and 5000 of the 200s.

@Uzlopak
Contributor

Uzlopak commented Sep 28, 2022

So I would actually recommend modifying the current functionality.

Instead of having add and close, we should decorateReply with 'writeEarlyHints':

'use strict'

const fp = require('fastify-plugin')
const formatEntry = require('./lib/formatEntry')
const CRLF = '\r\n'
const earlyHintsHeader = `HTTP/1.1 103 Early Hints${CRLF}`

function fastifyEarlyHints (fastify, opts, next) {
  if (fastify.initialConfig.http2 === true) {
    return next(new Error('Early Hints cannot be used with a HTTP2 server.'))
  }

  const formatEntryOpts = {
    warn: opts.warn
  }

  const serialize = function (c) {
    let message = ''
    for (let i = 0; i < c.length; i++) {
      message += `${formatEntry(c[i], formatEntryOpts)}${CRLF}`
    }
    return `${earlyHintsHeader}${message}${CRLF}`
  }

  fastify.decorateReply('writeEarlyHints', function (earlyHints) {
    const raw = this.raw
    const serialized = serialize(earlyHints)
    return new Promise(resolve => {
      if (raw.socket) {
        raw.socket.write(serialized, 'utf-8', resolve)
      } else {
        raw.write(serialized, 'utf-8', resolve)
      }
    })
  })

  next()
}

module.exports = fp(fastifyEarlyHints, {
  fastify: '4.x',
  name: '@fastify/early-hints'
})

So what happens then is that we have to await writeEarlyHints in the handler, as there is no close function anymore that would wait for the early hints to be written.

Also, I think the Node implementation differs from ours: we have cors, but Node also has some other fields like title or anchor.
https://github.com/wingleung/node/blob/c204024fcaa97ac74270f4270ce69c9dccd9a1dd/lib/internal/validators.js#L262

@Uzlopak
Contributor

Uzlopak commented Sep 28, 2022

@zekth

Node.js is using ascii instead of utf-8 to write to the socket. Should we also switch to ascii?

@zekth
Member

zekth commented Sep 28, 2022

nodejs is using ascii instead of utf-8 to write to socket. Should we also switch to ascii?

I guess so. Good catch.
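One way to sanity-check the switch (an illustrative check, not plugin code): Link header lines are ASCII by construction, so 'ascii' and 'utf-8' encode them to identical bytes, and 'ascii' simply skips the multi-byte handling on the hot write path.

```javascript
// Link header lines are pure ASCII, so both encodings yield the same bytes;
// 'ascii' just avoids UTF-8's multi-byte code path.
const hint = 'HTTP/1.1 103 Early Hints\r\n' +
  'Link: </style.css>; rel=preload; as=style\r\n\r\n'

const asAscii = Buffer.from(hint, 'ascii')
const asUtf8 = Buffer.from(hint, 'utf-8')

const identical = asAscii.equals(asUtf8) // true for ASCII-only input
```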

@Eomm
Member Author

Eomm commented Sep 28, 2022

3051+5949 = 10000

The count shows 6949, 11k 😖

@Uzlopak
Contributor

Uzlopak commented Sep 28, 2022

@Eomm
@climba03003
@zekth

What should we do now?

The codebase has 100% code coverage and no breaking changes (except the Fastify 4.0 requirement).

Should we merge this and then have some more follow-up PRs? Or should we do everything in this PR?

I would prefer the first. We can merge this and then have follow-up PRs regarding:

  • using ascii instead of utf-8, and whether we need some more checks
  • using the native writeEarlyHints
  • using a reply decorator instead of the two hooks

@Uzlopak
Contributor

Uzlopak commented Sep 29, 2022

I think autocannon does not work correctly, as it sometimes mixes 103 with 200. If I put example.js under a stress test with autocannon -f and then run a curl request, the order is always correct.

@Uzlopak
Contributor

Uzlopak commented Sep 29, 2022

Creating follow-up PRs.

@Uzlopak Uzlopak merged commit ceae16c into master Sep 29, 2022
@Uzlopak Uzlopak deleted the f-org branch September 29, 2022 08:26