If our Node service restarts while users are connected, it appears that our service cannot handle the load of all clients' sockets reconnecting when the service comes back online. We have a throttler in place, but it doesn't appear to come into play for WebSockets; it does throttle REST and asset requests fine.
I've done a ton of searching but cannot find where others have experienced this. I discovered this morning that socket.io will buffer events on the client while disconnected and send them when the connection is re-established. Clearing the sendBuffer will help with the burst of traffic sent to the Node service, but I don't think that alone will fix the issue. Others must have experienced this and have a means of preventing/controlling it.
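For reference, this is roughly what clearing the buffer looks like on the client, assuming socket.io-client v4 (`socket.sendBuffer` is the array of packets queued while disconnected; the URL is just a placeholder):

```ts
import { io } from "socket.io-client";

// Hypothetical endpoint; replace with your service URL.
const socket = io("https://example.com");

socket.on("disconnect", () => {
  // Drop anything queued while offline so the reconnect handshake
  // is not immediately followed by a burst of buffered emits.
  socket.sendBuffer = [];
});
```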
Not sure if you resolved this or not, but one thing you could do is limit requests to /socket.io and return a 503 Service Unavailable if that URL gets slammed (we use haproxy to do this). The socket.io client will keep trying to reconnect indefinitely at a certain back-off rate (unless you've changed that, of course; see the reconnection settings), and it will back off automatically if it gets a 503 or some other error.
You should then see an initial burst of clients reconnecting, then a trickle as the request rate falls back under the limit. If you have both the polling and websocket transports enabled, this should work without having to limit websocket reconnects in any way: socket.io will start the reconnect with polling and then try to upgrade to websocket after polling succeeds.
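If you'd rather do it in the Node process instead of haproxy, a rough sketch of the same idea, assuming socket.io v4 (which exposes `io.engine.use()` for Express-style middleware on the handshake/polling requests); the window size and limit are made-up numbers:

```ts
import { createServer } from "http";
import { Server } from "socket.io";

const httpServer = createServer();
const io = new Server(httpServer);

// Hypothetical limits: at most 200 handshakes per 10-second window.
const WINDOW_MS = 10_000;
const MAX_HANDSHAKES = 200;
let windowStart = Date.now();
let handshakes = 0;

io.engine.use((req: any, res: any, next: () => void) => {
  // Only new handshakes carry no session id; let polling for
  // already-connected clients through untouched.
  const isHandshake = req._query.sid === undefined;
  if (!isHandshake) return next();

  const now = Date.now();
  if (now - windowStart > WINDOW_MS) {
    windowStart = now;
    handshakes = 0;
  }

  if (++handshakes > MAX_HANDSHAKES) {
    // Tell the client to back off; it will retry later per its
    // reconnection settings.
    res.writeHead(503);
    res.end();
    return;
  }
  next();
});

httpServer.listen(3000);
```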
As pointed out by @brenc, I think your best bet is to tune the reconnectionDelay / reconnectionDelayMax options so that the clients do not all try to reconnect at the same time.
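Something along these lines on the client (socket.io-client Manager options; the exact values are just placeholders to spread the retries out):

```ts
import { io } from "socket.io-client";

// Widen and randomize the reconnect window so a service restart
// doesn't trigger a synchronized thundering herd.
const socket = io("https://example.com", {
  reconnection: true,
  reconnectionDelay: 5_000,      // first retry after ~5s instead of the default 1s
  reconnectionDelayMax: 60_000,  // cap the exponential back-off at 60s
  randomizationFactor: 0.5,      // +/- 50% jitter on each delay
});
```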