Implement Streaming Static/Proxy Rendering #4329

Merged: @jridgewell merged 7 commits into main from jrl-rsc-streaming on Mar 29, 2023
Conversation

@jridgewell (Contributor) commented Mar 24, 2023

Description

Implements streaming rendering for static/proxy renders (I don't know why they're called static or proxy…) using an initial `Headers` message, N `BodyChunk` messages, and a final `BodyEnd` message.

Updating proxy rendering was very easy. It already implemented separate messages for the header and body, but it buffered the full body before sending the contents. Static rendering was a little more difficult: it has multiple paths that are immediately ready with a full body, and I didn't want to break those apart. So I kept the `Response` message for a full response, which transforms into a `ContentSourceContent::Static`, and implemented a new enum type for streaming responses, which transforms into a `ContentSourceContent::HttpProxy`.

There is some weirdness with the different data types on the `headers` field of a `ContentSourceContent::Static`'s `StaticContent` and a `ContentSourceContent::HttpProxy`'s `ProxyResultVc`: one is a `HeaderListVc` and the other is a `Vec<(String, String)>` (and really should be a `HeadersList`), but I'm leaving this for a follow-up.
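
For illustration, here is a minimal sketch of what the three streaming messages could look like as a Rust enum. The variant names match the messages named above, but the field layout and wire format are assumptions, not the PR's actual types.

```rust
use serde::{Deserialize, Serialize};

// Hypothetical sketch of the streaming protocol described above; the real
// types live in turbopack-node and may be shaped differently.
#[derive(Serialize, Deserialize)]
enum RenderItem {
    // Sent once, first: status code plus the header list.
    Headers {
        status: u16,
        headers: Vec<(String, String)>,
    },
    // Sent zero or more times, each carrying raw body bytes.
    BodyChunk { data: Vec<u8> },
    // Sent once to signal that the body is complete.
    BodyEnd,
}
```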

Testing Instructions

Paired Next.js PR: vercel/next.js#47476

Re: WEB-27

@vercel bot commented Mar 24, 2023

The latest updates on your projects (updated Mar 29, 2023 at 4:54PM UTC):

examples-native-web: 🔄 Building
examples-tailwind-web: 🔄 Building

9 ignored deployments: examples-basic-web, examples-cra-web, examples-designsystem-docs, examples-gatsby-web, examples-kitchensink-blog, examples-nonmonorepo, examples-svelte-web, examples-vite-web, turbo-site

github-actions bot (Contributor) commented Mar 24, 2023

🟢 CI successful 🟢

Thanks

crates/turbopack-node/src/render/mod.rs: two outdated review threads, resolved
github-actions bot (Contributor) commented

✅ This change can build next-swc

@jridgewell (Contributor, Author) left a comment

Ok, this is now fully updated. I really dislike the amount of duplication that this requires for anything that streams. Because of the different response types (proxy always streams a response, but static can return a rewrite, full response, or stream response) I can't share code between the two.

Streaming itself is cumbersome. The incoming stream needs defined types, and the outgoing stream needs a new set of defined types with "prepared" values. This all happens in a pulling function, which is 2 steps removed from the actual place I want to use the stream in the main function. In the main function, I can only read the prepared values and return them. This might be simpler if we can macro over the boilerplate.

Oh, and the `generator! {}` block doesn't format, so that's ugly.
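
As a rough illustration of the shape described above (not the PR's actual code), here is a pulling function written with the async-stream crate's `stream!` macro standing in for the `generator!` block; `RenderItem` and `RenderBodyChunk` are hypothetical names.

```rust
use anyhow::anyhow;
use futures::Stream;

// Hypothetical incoming message type, mirroring the protocol sketch in the
// PR description above.
enum RenderItem {
    Headers { status: u16, headers: Vec<(String, String)> },
    BodyChunk { data: Vec<u8> },
    BodyEnd,
}

// The "prepared" value that the main function only reads and returns.
type RenderBodyChunk = anyhow::Result<Vec<u8>>;

// The pulling function: drains the incoming message stream and yields
// prepared body chunks, so the caller never touches the raw protocol.
fn pull_body(
    incoming: impl Stream<Item = RenderItem>,
) -> impl Stream<Item = RenderBodyChunk> {
    async_stream::stream! {
        for await item in incoming {
            match item {
                RenderItem::BodyChunk { data } => yield Ok(data),
                RenderItem::BodyEnd => break,
                // A second Headers message after the body has started would
                // be a protocol error.
                RenderItem::Headers { .. } => {
                    yield Err(anyhow!("unexpected render item"));
                    break;
                }
            }
        }
    }
}
```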

@jridgewell requested a review from sokra on March 29, 2023 03:58
@sokra (Member) left a comment

Looks good in general. Reviewed from mobile. Looking forward to seeing this in action.

crates/turbopack-node/src/render/render_proxy.rs: outdated review thread, resolved
@@ -246,29 +266,29 @@ pub struct ContentSourceData {
     pub cache_buster: u64,
 }

-type Chunk = Result<Bytes, BodyError>;
+pub type BodyChunk = Result<Bytes, BodyError>;

@sokra (Member) commented on the diff above:

Suggested change:
-pub type BodyChunk = Result<Bytes, BodyError>;
+pub type BodyChunk = Result<Bytes, SharedError>;

BodyError converts everything to a string, losing all the structure. We want to avoid that.
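
For context (an assumption about what `SharedError` refers to, not a definition from this PR): a shared error is essentially a cloneable, reference-counted wrapper around `anyhow::Error`, so the error chain survives instead of being flattened to a `String`. A minimal sketch:

```rust
use std::{fmt, sync::Arc};

// Illustrative only: clones cheaply and keeps the full anyhow error chain,
// unlike a stringified BodyError.
#[derive(Clone, Debug)]
pub struct SharedError(Arc<anyhow::Error>);

impl SharedError {
    pub fn new(err: anyhow::Error) -> Self {
        Self(Arc::new(err))
    }
}

impl fmt::Display for SharedError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Delegate to the wrapped error's Display, preserving its message.
        write!(f, "{}", self.0)
    }
}
```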

@jridgewell (Contributor, Author) commented:

This requires a slightly larger change, and it's complicated by SharedError not being serializable. That would prevent any body response from being cacheable, which I think is bad for our end goal? Anyway, I'd like to defer this to a follow-up PR for the time being.

@sokra (Member) commented:

> That would prevent any body response from being cacheable, which I think is bad for our end goal?

It would still be cacheable in memory.

While I'd like static rendering to be cacheable, next.js doesn't cache rendering at all in dev and always server-renders a fresh page.

Ok(v) => Err(BodyError::new(format!("unexpected render item: {:#?}", v))),
Err(e) => Err(BodyError::new(format!(
    "error streaming proxied contents: {}",
    e
))),

@sokra (Member) commented:

Use `.context()` instead of `format!`. We want to avoid converting to a string.
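
As a hedged illustration of this point (not the PR's code): with anyhow, `.context()` wraps the underlying error and keeps it in the chain for callers to inspect, whereas `format!` collapses it into a plain string.

```rust
use anyhow::{Context, Result};

// The original error (here a FromUtf8Error) stays in the error chain and can
// still be downcast by callers, unlike a formatted String.
fn decode_chunk(raw: &[u8]) -> Result<String> {
    String::from_utf8(raw.to_vec()).context("error streaming proxied contents")
}
```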

@jridgewell (Contributor, Author) commented:

I agree, but I'm deferring this to a follow-up PR.


// We create a new cell in this task, which will be updated from the
// [render_stream_internal] task.
let cell = turbo_tasks::macro_helpers::find_cell_by_type(*RENDERSTREAM_VALUE_TYPE_ID);
@sokra (Member) commented:

We should probably move these hacks into turbo-tasks-bytes.

@jridgewell (Contributor, Author) commented:

I actually think we should move `Stream<T>` into turbo-tasks proper, and design a macro interface as you suggested.

crates/turbopack-node/src/render/render_static.rs: two outdated review threads, resolved
@jridgewell requested a review from sokra on March 29, 2023 21:50
@sokra (Member) left a comment

Approving, but I'd like some cleanup in follow-up PRs. Also make sure to document public methods.

@jridgewell merged commit ffb7e70 into main on Mar 29, 2023
@jridgewell deleted the jrl-rsc-streaming branch on March 29, 2023 22:24
sokra added a commit to vercel/next.js that referenced this pull request Mar 31, 2023
Paired with vercel/turborepo#4329, this implements
streaming responses for App and API renders. This is accomplished by
sending an initial `headers` message (carrying the status code and
headers list), N `body-chunk` messages of bytes, and a final `body-end`
message to signal completion.

Once sent to Turbopack, these chunk messages will be streamed out of the
node rendering process directly into the HTTP server's response.

Closes WEB-27

---------

Co-authored-by: Tobias Koppers <tobias.koppers@googlemail.com>
Co-authored-by: kodiakhq[bot] <49736102+kodiakhq[bot]@users.noreply.github.com>
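
The commit message above describes chunk messages being streamed straight into the HTTP server's response. A hedged sketch of that idea using hyper 0.14's `Body::wrap_stream` (requires hyper's `stream` feature; an assumption for illustration, not the actual turbopack code):

```rust
use anyhow::Error;
use bytes::Bytes;
use futures::Stream;
use hyper::{Body, Response};

// Hypothetical helper: forward an in-flight stream of body chunks to the
// client without buffering the full body first.
fn stream_response(
    status: u16,
    chunks: impl Stream<Item = Result<Bytes, Error>> + Send + 'static,
) -> Response<Body> {
    Response::builder()
        .status(status)
        // Each chunk is written to the response as soon as it arrives.
        .body(Body::wrap_stream(chunks))
        .expect("building a streaming response should not fail")
}
```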
jridgewell added a commit that referenced this pull request Mar 31, 2023
Follow up to #4329, this removes
`BodyError` (which was just a simple `String` and not an `Error`) and
switches to `SharedError`.

I've implemented a basic `PartialEq` and `Serialization` for
`SharedError` so that it doesn't infect everything that uses `Body`.
ForsakenHarmony pushed commits to vercel/next.js that referenced this pull request on Jul 25, Jul 29, and Aug 1, 2024, with the same commit messages as above.