Add _document and _app pre-import #23261
Conversation
Hi, I'm going to close this, as it seems further discussion and benchmarking are needed to see if this has any impact. Please open a discussion to continue the investigation!
I don't think we need further discussion on this issue. There is no slowdown, because _app and _document are loaded for every page.
There is the case of API routes, which have no need for
I think the bigger question here is: should I get billed on Vercel for Next.js wasting 10 seconds to additionally compile
Oh, sorry @abriginets for the long time without an answer. An opt-in configuration option has been added in this PR to let developers decide whether to pre-import or not. And it's not enough to minimize TTFB, because there are only
But such a workaround slows down the whole server start. I see only one way to get both a fast start for statically optimized pages and a slow start in exchange for low TTFB with SSR: use two different servers, one for statically optimized pages and another for SSR. But keep in mind that all my decisions and suggestions are about standalone deployment, not Vercel. I can contribute only to the free open source part, not to a third-party cloud platform.
Just to be clear: Next.js does not ship the compilation to production. There is no bundling or anything like what you see in dev to compile code on the fly. The only thing that happens is that
`pre-require`ing the `_app` and `_document` is probably fine for the standalone mode @ijjk. I think this PR is fine to land without the experimental opt-in if it checks for `minimalMode`, especially now that the `target` option is deprecated and we can make improvements like these.
Just to be clear to future readers: "precompiling" here refers to requiring the code early which runs the top-level module code for _app and _document. It has nothing to do with compiling/bundling the files (e.g. through webpack).
@timneutkens for some reason, Next.js deployed to Vercel was rendering every SSR page in my project for 10-15 seconds each time the lambda went cold. I decided to move my project to Heroku (a non-serverless environment) and all those long cold starts stopped. If I redeploy my app to Heroku right now (which means purging all Next.js cache) and hit any SSR page, it loads for 2-3 seconds the first time, and then there are no delays whatsoever for any subsequent request. I figured out that serverless container cold starts were not the issue (I've been monitoring logs for quite some time), as they always took no more than 400ms. To me it looks like Next.js is missing something in a serverless environment (something that is persisted on the filesystem on Heroku or any other traditional host), and it needs to generate some data, some cache, etc. before it can execute backend logic and serve HTML to users. I've been trying to dig up at least something, but I couldn't find anything, so I assumed it might be dynamic imports, requires, etc. This leads me to think that Vercel is not an option for hosting SSR/ISR-based Next.js apps, at least for now.
Without a reproducible URL it's hard to validate or verify this, but 10-15s is definitely not a lambda cold start IMO.
This sounds more in line with cold start times I've seen (400ms).
One of the difficult parts about being a hosting platform is that there's only so much optimization you can do for user code. In some instances, if code is written that takes 10s to fetch from an external API, there's not much you can do about it from the Vercel side. There are many, many SSR / ISR Next.js applications on Vercel. Almost all of these: https://vercel.com/customers
Do you mean that we should check only
I see two options. We could also make the configuration more complex and flexible.
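For readers wondering what such an opt-in would look like, here is a hypothetical `next.config.js` fragment. The key name `experimental.preloadEntriesOnStart` is invented for illustration; the actual option name added by this PR is in the diff, not reproduced here.

```javascript
// next.config.js — hypothetical shape of the opt-in discussed above.
// The option name below is illustrative only, not a real Next.js key.
module.exports = {
  experimental: {
    // When true, require() _app and _document at server start instead
    // of on the first request, trading startup time for lower TTFB.
    preloadEntriesOnStart: true,
  },
};
```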
@leerob is there a chance that those customers host their backends somewhere else instead of keeping the whole codebase in a Next.js monorepo? Maybe my issue lies in an extensive backend codebase that Next.js struggles with at runtime?
# Conflicts:
#   packages/next/server/config-shared.ts
#   packages/next/server/next-server.ts
Stats from current PR

Default Build (Increase detected ⚠️)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| buildDuration | 19.7s | 19.4s | -286ms |
| buildDurationCached | 7.4s | 7.5s | |
| nodeModulesSize | 359 MB | 359 MB | |
Page Load Tests (Overall increase ✓)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| / failed reqs | 0 | 0 | ✓ |
| / total time (seconds) | 4.235 | 4.177 | -0.06 |
| / avg req/sec | 590.33 | 598.45 | +8.12 |
| /error-in-render failed reqs | 0 | 0 | ✓ |
| /error-in-render total time (seconds) | 2.173 | 2.116 | -0.06 |
| /error-in-render avg req/sec | 1150.49 | 1181.39 | +30.9 |
Client Bundles (main, webpack, commons)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| 450.HASH.js gzip | 179 B | 179 B | ✓ |
| framework-HASH.js gzip | 42 kB | 42 kB | ✓ |
| main-HASH.js gzip | 27.9 kB | 27.9 kB | ✓ |
| webpack-HASH.js gzip | 1.44 kB | 1.44 kB | ✓ |
| Overall change | 71.5 kB | 71.5 kB | ✓ |
Legacy Client Bundles (polyfills)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| polyfills-HASH.js gzip | 31 kB | 31 kB | ✓ |
| Overall change | 31 kB | 31 kB | ✓ |
Client Pages

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| _app-HASH.js gzip | 1.36 kB | 1.36 kB | ✓ |
| _error-HASH.js gzip | 194 B | 194 B | ✓ |
| amp-HASH.js gzip | 312 B | 312 B | ✓ |
| css-HASH.js gzip | 326 B | 326 B | ✓ |
| dynamic-HASH.js gzip | 2.57 kB | 2.57 kB | ✓ |
| head-HASH.js gzip | 350 B | 350 B | ✓ |
| hooks-HASH.js gzip | 919 B | 919 B | ✓ |
| image-HASH.js gzip | 5.01 kB | 5.01 kB | ✓ |
| index-HASH.js gzip | 263 B | 263 B | ✓ |
| link-HASH.js gzip | 2.26 kB | 2.26 kB | ✓ |
| routerDirect..HASH.js gzip | 321 B | 321 B | ✓ |
| script-HASH.js gzip | 383 B | 383 B | ✓ |
| withRouter-HASH.js gzip | 318 B | 318 B | ✓ |
| 85e02e95b279..7e3.css gzip | 107 B | 107 B | ✓ |
| Overall change | 14.7 kB | 14.7 kB | ✓ |
Client Build Manifests

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| _buildManifest.js gzip | 459 B | 459 B | ✓ |
| Overall change | 459 B | 459 B | ✓ |
Rendered Page Sizes

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| index.html gzip | 531 B | 531 B | ✓ |
| link.html gzip | 544 B | 544 B | ✓ |
| withRouter.html gzip | 526 B | 526 B | ✓ |
| Overall change | 1.6 kB | 1.6 kB | ✓ |
Default Build with SWC (Decrease detected ✓)

General (Overall increase ⚠️)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| buildDuration | 23.7s | 23.9s | |
| buildDurationCached | 7.4s | 7.6s | |
| nodeModulesSize | 359 MB | 359 MB | |
Page Load Tests (Overall decrease ⚠️)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| / failed reqs | 0 | 0 | ✓ |
| / total time (seconds) | 4.391 | 4.232 | -0.16 |
| / avg req/sec | 569.36 | 590.72 | +21.36 |
| /error-in-render failed reqs | 0 | 0 | ✓ |
| /error-in-render total time (seconds) | 2.079 | 2.172 | |
| /error-in-render avg req/sec | 1202.22 | 1151.24 | |
Client Bundles (main, webpack, commons)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| 450.HASH.js gzip | 179 B | 179 B | ✓ |
| framework-HASH.js gzip | 42.1 kB | 42.1 kB | ✓ |
| main-HASH.js gzip | 27.9 kB | 27.9 kB | ✓ |
| webpack-HASH.js gzip | 1.44 kB | 1.44 kB | ✓ |
| Overall change | 71.6 kB | 71.6 kB | ✓ |
Legacy Client Bundles (polyfills)

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| polyfills-HASH.js gzip | 31 kB | 31 kB | ✓ |
| Overall change | 31 kB | 31 kB | ✓ |
Client Pages

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| _app-HASH.js gzip | 1.35 kB | 1.35 kB | ✓ |
| _error-HASH.js gzip | 180 B | 180 B | ✓ |
| amp-HASH.js gzip | 305 B | 305 B | ✓ |
| css-HASH.js gzip | 321 B | 321 B | ✓ |
| dynamic-HASH.js gzip | 2.56 kB | 2.56 kB | ✓ |
| head-HASH.js gzip | 342 B | 342 B | ✓ |
| hooks-HASH.js gzip | 911 B | 911 B | ✓ |
| image-HASH.js gzip | 5.05 kB | 5.05 kB | ✓ |
| index-HASH.js gzip | 256 B | 256 B | ✓ |
| link-HASH.js gzip | 2.28 kB | 2.28 kB | ✓ |
| routerDirect..HASH.js gzip | 314 B | 314 B | ✓ |
| script-HASH.js gzip | 375 B | 375 B | ✓ |
| withRouter-HASH.js gzip | 309 B | 309 B | ✓ |
| 85e02e95b279..7e3.css gzip | 107 B | 107 B | ✓ |
| Overall change | 14.7 kB | 14.7 kB | ✓ |
Client Build Manifests

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| _buildManifest.js gzip | 458 B | 458 B | ✓ |
| Overall change | 458 B | 458 B | ✓ |
Rendered Page Sizes

| | vercel/next.js canary | gwer/next.js app_document_pre_warm | Change |
| --- | --- | --- | --- |
| index.html gzip | 533 B | 533 B | ✓ |
| link.html gzip | 547 B | 547 B | ✓ |
| withRouter.html gzip | 527 B | 527 B | ✓ |
| Overall change | 1.61 kB | 1.61 kB | ✓ |
In our integration tests for Next.js, we use `nock` to intercept outgoing HTTP requests, which lets us both examine a request's payload and mock its response.

As part of its initialization, `nock` monkeypatches the `get` and `request` methods in both the `http` and `https` Node built-in modules. Our `Http` integration also monkeypatches those methods.

The order in which the two monkeypatching operations happen matters. If `nock` monkeypatches before Sentry does, we end up with `sentryWrapper(nockWrapper(originalGet))`, which means that no matter what `nock` does or doesn't do with `originalGet`, our wrapper code will always run. But if `nock` monkeypatches after we do, we end up with `nockWrapper(sentryWrapper(originalGet))`, meaning that if `nock` chooses not to call the function it's wrapping, our code never runs.

Next 12.1 introduced a change in [this PR](vercel/next.js#23261) which causes Next to load the `_app` and `_document` pages as soon as the server starts, in the interest of serving the first requested page more quickly. This changes the order of the monkeypatching, which causes the HTTP tests to fail for Next.js.

This patch solves the problem by forcing `get` and `request` to be wrapped again by Sentry after they are wrapped by `nock`. There are some TODOs that still need to be addressed, but we're merging this patch to unblock CI.
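The ordering issue above can be reduced to a minimal sketch. This is not Sentry's or nock's actual code; `traceWrap` and `interceptWrap` are invented stand-ins that only demonstrate why the outermost wrapper always runs while inner wrappers run only if the outer one calls through.

```javascript
const calls = [];

function original() { calls.push('original'); }

// A tracing wrapper (like Sentry's) that always calls through.
const traceWrap = (fn) =>
  function () { calls.push('trace'); return fn.apply(this, arguments); };

// An intercepting wrapper (like nock's) that may swallow the call.
const interceptWrap = (fn) =>
  function () { calls.push('intercept'); /* does NOT call fn */ };

// Order 1: intercept first, then trace => trace(intercept(original)).
// The tracer is outermost, so it always runs.
calls.length = 0;
traceWrap(interceptWrap(original))();
console.log(calls); // [ 'trace', 'intercept' ]

// Order 2: trace first, then intercept => intercept(trace(original)).
// The interceptor swallows the call; the tracer never runs at all.
calls.length = 0;
interceptWrap(traceWrap(original))();
console.log(calls); // [ 'intercept' ]
```

Re-wrapping after `nock` initializes is equivalent to forcing Order 1, which is why the patch restores the tracing behavior.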
This is an alternative to #23196 without pages pre-warm.
In #23187, @timneutkens disagreed with the need to warm up the pages. I think that issue requires additional discussion, but _app and _document can be warmed up right now.