
Is there any alternative to run a Deno HTTP server in cluster mode? #5965

Closed
devalexqt opened this issue May 30, 2020 · 14 comments
Comments

@devalexqt

Is it possible to run a Deno HTTP server on all available CPU cores?

@ry
Member

ry commented May 30, 2020

No, we don't have that functionality, yet, unfortunately.

@alxtsg

alxtsg commented May 30, 2020

Does using Worker count? Here is a prototype:

app.ts:

const START_PORT = 8000;
// Adjust to the number of CPU cores.
const WORKER_COUNT = 4;
for (let i = 0; i < WORKER_COUNT; i += 1) {
  const worker = new Worker(
    "./server_worker.ts",
    {
      type: "module",
      deno: true,
    },
  );
  worker.postMessage({
    port: START_PORT + i,
  });
}

server_worker.ts:

import { serve } from "https://deno.land/std@0.54.0/http/server.ts";

self.onmessage = async (e) => {
  const { port } = e.data;
  const server = serve({
    hostname: "127.0.0.1",
    port,
  });
  console.log(`Started an HTTP server listening on port ${port}.`);
  for await (const req of server) {
    req.respond({ body: "Hello World\n" });
  }
};

Start the server:

# Running Deno in worker is an unstable feature, so we need to add the
# --unstable flag.
deno run --allow-read --allow-net --unstable app.ts

In the example above, 4 HTTP servers will be started.

Started an HTTP server listening on port 8001.
Started an HTTP server listening on port 8000.
Started an HTTP server listening on port 8002.
Started an HTTP server listening on port 8003.

You can put a load balancer in front of the application and distribute the requests to each of the HTTP servers.

This is not the same as using cluster in Node.js to start multiple HTTP servers, because the HTTP servers in the Workers cannot listen on the same port.
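The load-balancing side in front can be as simple as cycling through the worker ports. A minimal sketch of that round-robin selection logic (`makeRoundRobin` is a hypothetical helper written for this example, not part of Deno or its std library):

```typescript
// Round-robin port selection for a load balancer in front of the workers.
// makeRoundRobin is a hypothetical helper, not a Deno API.
function makeRoundRobin(ports: number[]): () => number {
  let next = 0;
  return (): number => {
    const port = ports[next];
    next = (next + 1) % ports.length; // wrap around to the first port
    return port;
  };
}

// Cycle over the four ports started by app.ts above.
const pickPort = makeRoundRobin([8000, 8001, 8002, 8003]);
console.log(pickPort()); // 8000
console.log(pickPort()); // 8001
console.log(pickPort()); // 8002
```

Each incoming request would then be proxied to `pickPort()`; real load balancers add health checks and connection reuse on top of this.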

@devalexqt
Author

Thanks for the reply, but are you guys planning to add this feature, or is it out of focus for now?

@devalexqt
Author

"Same port" is the key.

@alxtsg

alxtsg commented May 30, 2020

See #3403 and #5377; it looks like there hasn't been much progress in that area yet.

@ry
Member

ry commented May 30, 2020

@devalexqt We don’t have plans to add that feature - but I’d be open to discussing how it could be done with web workers. Basically we need the ability to send resources to other workers - then you would share the server resource...

@devalexqt
Author

Yeah, resource sharing with workers would be an elegant solution!

@ngot
Contributor

ngot commented May 31, 2020

I have done a similar cluster module using the worker feature in another JS runtime called fibjs. Transferring an object's ownership to another isolate (or call it a worker) is the key point. (fibjs only supports transferring network sockets and Buffers; tbh, sockets are enough.)

This is the module I wrote https://github.com/fibjs-modules/cluster-server#usage, the key implementation is https://github.com/fibjs-modules/cluster-server/blob/master/lib/cluster.js#L92

Anyway, if it becomes possible to transfer a Deno.Conn (or something similar) between workers, it will be feasible to implement cluster.

@stale

stale bot commented Jan 6, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Jan 6, 2021
@stale stale bot closed this as completed Jan 13, 2021
@kedicesur

kedicesur commented Feb 14, 2021

@ry What @alxtsg suggested may work with a load balancer in front, such as gobetween, which is very simple. I just tried it with 4 sessions of denotrain running simultaneously on localhost at 4 different ports, and throughput increased from ~17,000 req/s to ~55,000 req/s when tested with wrk.

@abaldawa

@alxtsg What's the point of starting HTTP servers within workers if you are going to put a manual load balancer in front? Just start 4 separate Deno HTTP server processes and put a load balancer in front of those, for even better performance. The whole point of the cluster module was that it does the port sharing and load balancing for you. If Deno does not have that, then there is no point in using workers to listen on separate ports, since workers are not as efficient at I/O as the main thread.

@alxtsg

alxtsg commented Jan 31, 2024

@abaldawa It depends on what you want to do and on the requirements. In some cases you may not be able to put a load balancer in front of the application, for whatever reason.

@matthewharwood

matthewharwood commented Feb 15, 2024

Cluster lets you scale compute vertically, e.g. 1 instance with many cores. We've been doing significant work on Node and found that vertically scaling instances with 6 cores each (say, 4 horizontal instances behind a load balancer) gives significantly better RPS and latency than 24 single-core instances behind a load balancer. The trade-off is increased memory, but it's worth it.

@devalexqt
Author

I agree that in production we need a load balancer in front of the actual service (for our projects we use nginx), and then we just spawn local JS servers on different ports, as many as we need. Also, all static assets, uploads, and downloads are served by the nginx side, because it already has good built-in workers and we don't hit our JS server just to send static files... As a result, the JS server only responds to API requests.
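That setup can be sketched as an nginx config along these lines (the upstream ports, document root, and location prefixes are hypothetical; adjust them to your deployment):

```nginx
# Hypothetical upstream: local JS servers spawned on ports 8000-8003.
upstream js_api {
    server 127.0.0.1:8000;
    server 127.0.0.1:8001;
    server 127.0.0.1:8002;
    server 127.0.0.1:8003;
}

server {
    listen 80;

    # Static assets, uploads, and downloads served directly by nginx.
    location /static/ {
        root /var/www/app;
    }

    # Only API requests reach the JS servers; nginx round-robins
    # over the upstream servers by default.
    location /api/ {
        proxy_pass http://js_api;
    }
}
```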


7 participants