Server-Sent Events don't work in Next API routes #9965
Comments
I forgot to mention that this works in Micro routes, as well. I'm trying to eliminate the need for my Micro API by moving everything into Next, but this is a blocker for me. |
You can use a custom server.js to work around this for now:
require('dotenv').config();
const app = require('express')();
const server = require('http').Server(app);
const next = require('next');
const DSN = process.env.DSN || 'postgres://postgres:postgres@localhost/db';
const dev = process.env.NODE_ENV !== 'production';
const nextApp = next({ dev });
const nextHandler = nextApp.getRequestHandler();
nextApp.prepare().then(() => {
app.get('*', (req, res) => {
if (req.url === '/stream') {
res.writeHead(200, {
Connection: 'keep-alive',
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
});
res.write('data: Processing...\n\n');
setTimeout(() => {
res.write('data: Processing2...\n\n');
}, 10000);
} else {
return nextHandler(req, res);
}
});
require('../websocket/initWebSocketServer')(server, DSN);
const port = 8080;
server.listen(port, err => {
if (err) throw err;
console.log('> Ready on http://localhost:' + port);
});
});
And on the client side:
componentDidMount() {
this.source = new EventSource('/stream')
this.source.onmessage = function(e) {
console.log(e)
}
} |
I would still recommend keeping any server-sent event and websocket handlers in separate processes in production. It's very likely that the frequency of updates to those parts of the business logic is quite different. Your front-end most likely changes more often than the types of events you handle / need to push to the clients from the servers. If you only make changes to one, you probably don't want to restart the processes responsible for the other(s). Better to keep the connections alive than to cause a flood of reconnections / server restarts for changes which have no effect. |
@msand The main reason I'm trying to avoid using a custom server is that I'm deploying to Now. Using a custom server would break all of the wonderful serverless functionality I get there. Your second point is fair. What I'm trying to do is create an SSE stream for data that would otherwise be handled with basic polling. The server is already dealing with constant reconnections in that case, so an SSE stream actually results in fewer reconnections. I suppose I could set up a small webserver in the same repo that just uses a separate Now builder. That would allow the processes to remain separate, though it'd still cause all of the SSE connections to abort and reconnect when there are any changes to the project. Even with those points, I can see plenty of scenarios in which it makes sense to be able to run an SSE endpoint from one of Next's API routes. Additionally, in the docs it's specifically stated that...
Since it's specifically stated that |
@trezy It seems the issue is that the middleware adds gzip encoding, which the browser has negotiated using the Accept-Encoding header.
If you add a 'Content-Encoding': 'none' header, then it seems to work:
res.writeHead(200, {
Connection: 'keep-alive',
'Content-Encoding': 'none',
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
}); |
Alternatively, gzip your content |
Oh, that's super interesting! I'll give that a shot and report back. In the meantime, it'd still be nice for this quirk (and any similar ones) to be noted somewhere in the docs. |
Yeah, it's more a consequence of having some helpers. It would be nice to have a mode which turns all of them off and leaves only a plain req/res pair. |
Actually, this seems to be documented here: https://github.com/expressjs/compression#server-sent-events You have to call res.flush() when you think there's enough data for the compression to work efficiently:
export default (req, res) => {
res.writeHead(200, {
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
});
res.write('data: Processing...');
/* https://github.com/expressjs/compression#server-sent-events
Because of the nature of compression this module does not work out of the box with
server-sent events. To compress content, a window of the output needs to be
buffered up in order to get good compression. Typically when using server-sent
events, there are certain block of data that need to reach the client.
You can achieve this by calling res.flush() when you need the data written to
actually make it to the client.
*/
res.flush();
setTimeout(() => {
res.write('data: Processing2...');
res.flush();
}, 1000);
}; |
It then applies gzip compression for you |
I have switched to using a custom Express server. That's the only way I could get it to work. I guess that's cool since I can do more with Express. Before deciding to integrate Express, I had tried the things mentioned above; none worked.
1. Turned off gzip compression by setting the option in next.config.js (see the sketch after this comment). The behavior remained the same. I inspected the headers on the client (using Postman) and confirmed the gzip encoding was removed, but that didn't seem to fix the problem.
2. Calling res.flush had no effect either. Instead I get a warning in the console that flush is deprecated and to use flushHeaders instead. But that's not what I want.
This is a rather strange bug.. 😔
|
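For reference, the next.config.js option mentioned in point 1 above is Next's built-in compress flag; a minimal sketch, assuming the built-in Next server is the one serving responses (it does not affect compression middleware added to a custom Express server):
// next.config.js — disables the gzip compression Next.js applies by default
module.exports = {
  compress: false,
};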
I have been trying to get SSE to work in Next.js, but could not get it working. With a custom server and the native Node http server req/res it works, but with the Next.js res, no messages are sent to the client. |
Hey @kavuri. It is possible to integrate a custom Node.js server (e.g. using Express) with your Next.js app. That way, you can still get Server-Side Rendering without these Next.js limitations. See this page of the official documentation for details: https://nextjs.org/docs/advanced-features/custom-server Also, check out how I implemented this in my own app, which I mentioned in the comment above yours: https://github.com/uxFeranmi/react-woocommerce/blob/master/server.js |
@uxFeranmi I could use the custom server method as mentioned here https://nextjs.org/docs/advanced-features/custom-server to write messages with res.write(...). But in the Next app, I do not see any messages in my page.
index.js:
My custom server.js:
I am not getting any messages on the index page. But if I open the url |
I have created a small test to trigger the event. Basically, just create a file in the project root directory (say just |
I don't think you want to set up the EventSource in the constructor; I think you should put it in componentDidMount. You cannot start the EventSource on the server side, because then subsequent messages will be sent to the server, not the browser/client. So you have to initialize the EventSource after the component has been rendered in the browser. PS: You probably don't need to import 'eventsource'. |
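A minimal client-side sketch of that advice, assuming a function component with the useEffect hook and the browser's built-in EventSource (the /api/stream path is only a placeholder):
import { useEffect, useState } from 'react';

export default function StreamViewer() {
  const [messages, setMessages] = useState([]);

  useEffect(() => {
    // Runs only in the browser, after the first render.
    const source = new EventSource('/api/stream'); // placeholder endpoint path
    source.onmessage = (e) => setMessages((prev) => [...prev, e.data]);
    // Close the connection when the component unmounts.
    return () => source.close();
  }, []);

  return <pre>{messages.join('\n')}</pre>;
}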
@uxFeranmi I have moved the event-opening code and the corresponding functions as you suggested. I am on the verge of giving up on Next and moving back to Express and a standalone web UI server. |
In my case, the SSE was triggered by clicking a button on the page. The click event called this function:
const authenticate = (email, callback)=> {
const sse = new EventSource(`/api/auth/sign-in?email=${email}`);
sse.addEventListener("message", (e)=> {
console.log('Default message event\n', e);
});
sse.addEventListener("received", (e)=> {
const {type: event, data} = e;
callback({event, data});
console.log(`${event}: ${data}`);
});
sse.addEventListener("mailsent", (e)=> {
const {type: event, data} = e;
callback({event, data});
console.log(`${event}: ${data}`);
});
sse.addEventListener("authenticated", (e)=> {
const {type: event, data} = e;
callback({event, data});
console.log(`${event}: ${data}`);
sse.close();
});
sse.addEventListener("timeout", (e)=> {
const {type: event, data} = e;
callback({event, data});
console.log(`${event}: ${data}`);
sse.close();
});
sse.addEventListener("error", (e)=> {
const {type: event, data} = e;
let customData = '';
// If connection is closed.
// 0 — connecting, 1 — open, 2 — closed
if (sse.readyState === 2) {
console.log('SSE closed', e);
customData = "Connection to server was lost and couldn't be re-established.";
}
// If still connected & it's an unknown error, attempt reconnection.
else if (!data) return console.log('Reconnecting SSE...');
sse.close();
console.log('Closed SSE...');
console.log(`${event}: ${customData || data}`);
callback({event, data: customData || data});
});
};
export default authenticate;
This function simply takes in a value to use in the URL query parameter (you'll need to remove this since yours is a fixed URL), and a callback function. I'm sorry if this isn't very helpful. I can't figure out much just from the code snippet you shared, so I'm showing you my own code hoping it'll work for you. |
I can see from the Network tab in the Chrome inspector that my API endpoint is getting triggered by the UI, but no updates are seen in the browser. |
There's a chance this is related to the gzip compression. Try sending a final message with res.end().
|
Gzip compression can cause all events/messages to be queued up until you call res.end(), at which point all messages are sent at once. Check devtools to see whether the content encoding in your response headers is gzip.
|
Hi, server-sent events from an API endpoint appear to be working correctly in Next.js itself. Compression shouldn't be affecting the stream as long as you set the Cache-Control header to no-cache, no-transform (as in the snippet below). Note: they will not work in a serverless environment, since those environments are typically buffered and don't allow streaming the response from the lambda. Related AWS Lambda docs here. If you want to create a pub-sub system, services like pusher.com are better suited for this and complement deploying your applications on ZEIT or other serverless environments very well. Here's a gif of it working locally without any custom server: |
Code from the above comment, if you want to try:
import {NextApiRequest, NextApiResponse} from 'next'
export const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms));
// curl -Nv localhost:3000/api/see
const handler = async (req: NextApiRequest, res: NextApiResponse) => {
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('Content-Type', 'text/event-stream;charset=utf-8');
res.setHeader('Cache-Control', 'no-cache, no-transform');
res.setHeader('X-Accel-Buffering', 'no');
for (let i = 0; i < 5; i++) {
res.write(`data: Hello seq ${i}\n\n`);
await sleep(1000);
}
res.end('done\n');
};
export default handler; |
Thanks for all the effort on this ticket, folx! I wanted to pop in and say I think we can mark it as resolved. Here's a quick TL;DR:
|
Closing per #9965 (comment) |
Some proxies buffer SSE events to compress the stream. The 'no-transform' directive forbids this bad behavior. More info at: facebook/create-react-app#1633 vercel/next.js#9965 (comment)
Some proxies buffer SSE events to compress the stream, delaying event reception in the browser. The `no-transform` `Cache-Control` directive forbids this bad behavior. More info at: facebook/create-react-app#1633 vercel/next.js#9965 (comment)
Some proxies buffer SSE events to compress the stream, delaying event reception in the browser. The `no-transform` `Cache-Control` directive forbids this bad behavior. More info at: facebook/create-react-app#1633 vercel/next.js#9965 (comment) (cherry picked from commit 69fb11f)
Thanks! This was causing issues when deployed in OpenShift. |
For those wondering… for Vercel/Now you can find an explanation of why streams are not supported here: https://vercel.com/docs/platform/limits#streaming-responses |
This issue has been automatically locked due to no recent activity. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you. |
For those stumbling onto this through Google, this is working as of Next.js 13 + Route Handlers:
// app/api/route.ts
import { Configuration, OpenAIApi } from 'openai';
export const runtime = 'nodejs';
// This is required to enable streaming
export const dynamic = 'force-dynamic';
export async function GET() {
const configuration = new Configuration({
apiKey: process.env.OPENAI_API_KEY,
});
const openai = new OpenAIApi(configuration);
let responseStream = new TransformStream();
const writer = responseStream.writable.getWriter();
const encoder = new TextEncoder();
writer.write(encoder.encode('Vercel is a platform for....'));
try {
const openaiRes = await openai.createCompletion(
{
model: 'text-davinci-002',
prompt: 'Vercel is a platform for',
max_tokens: 100,
temperature: 0,
stream: true,
},
{ responseType: 'stream' }
);
// @ts-ignore
openaiRes.data.on('data', async (data: Buffer) => {
const lines = data
.toString()
.split('\n')
.filter((line: string) => line.trim() !== '');
for (const line of lines) {
const message = line.replace(/^data: /, '');
if (message === '[DONE]') {
console.log('Stream completed');
writer.close();
return;
}
try {
const parsed = JSON.parse(message);
await writer.write(encoder.encode(`${parsed.choices[0].text}`));
} catch (error) {
console.error('Could not JSON parse stream message', message, error);
}
}
});
} catch (error) {
console.error('An error occurred during OpenAI request', error);
writer.write(encoder.encode('An error occurred during OpenAI request'));
writer.close();
}
return new Response(responseStream.readable, {
headers: {
'Content-Type': 'text/event-stream',
Connection: 'keep-alive',
'Cache-Control': 'no-cache, no-transform',
},
});
} |
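If you only need the SSE plumbing and not the OpenAI call, a stripped-down sketch of the same Route Handler pattern (the app/api/stream/route.ts path is only illustrative) could look like this:
// app/api/stream/route.ts — illustrative path
export const dynamic = 'force-dynamic';

export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      // Emit one SSE message per second, then end the stream.
      for (let i = 0; i < 5; i++) {
        controller.enqueue(encoder.encode(`data: tick ${i}\n\n`));
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}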
I'm going to unlock this because I've been sent it a handful of times, so it must be coming up in Google searches more often. Will transfer to a discussion instead of an issue 👍 |
Bug report
Describe the bug
When using Next's API routes, chunks that are written with res.write aren't sent until after res.end() is called.
To Reproduce
Steps to reproduce the behavior, please provide code snippets or a repository:
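The original reproduction snippet isn't preserved here; a minimal pages/api route consistent with the description would look roughly like this (the pages/api/stream.js file name is only illustrative):
// pages/api/stream.js — illustrative reconstruction, not the original snippet
export default (req, res) => {
  res.writeHead(200, {
    Connection: 'keep-alive',
    'Cache-Control': 'no-cache',
    'Content-Type': 'text/event-stream',
  });
  let count = 0;
  // Each chunk should reach the client immediately, but is buffered until res.end() is called.
  const interval = setInterval(() => {
    res.write(`data: tick ${count++}\n\n`);
  }, 1000);
  req.on('close', () => clearInterval(interval));
};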
Expected behavior
The route sends a new event to the connection every second.
Actual behavior
The route doesn't send any data to the connection unless a call to res.end() is added to the route.
System information
Additional context
When using other HTTP frameworks (Express, Koa, http, etc.) this method works as expected. It's explicitly supported by Node's http.IncomingMessage and http.ServerResponse classes which, from what I understand, Next uses as a base for the req and res that are passed into Next API routes. I'd hazard a guess that #5855 was caused by the same issue, but was considered unrelated because the issue was obscured by the express-sse library. There are also two Spectrum topics about this (here and here) that haven't garnered much attention yet.
Supporting WebSockets and SSE in Next API routes may be related, but fixing support for SSE should be a lower barrier than adding support for WebSockets. All of the inner workings are there; we just need to get the plumbing repaired.