Pooling support for high request rates? #49
Good question, there's technically nothing wrong with multiple Node.js processes. A pool of INodeJSService instances would work fine. My concern was more with users repeatedly creating new instances; the ReadMe could be better worded. What are your thoughts on Node.js workers for parallelization of CPU-intensive tasks? It's not out-of-the-box, but it would have a much smaller memory footprint than multi-processing.
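For anyone who wants to experiment in the meantime, a pool of instances can be built by hand. The following is a minimal sketch, assuming the library's documented AddNodeJS() dependency-injection registration; the NodeJSServicePool type and its round-robin strategy are illustrative, not part of the library:

using System;
using System.Threading;
using Jering.Javascript.NodeJS;
using Microsoft.Extensions.DependencyInjection;

// Hypothetical pool: each entry is an INodeJSService backed by its own Node.js process.
public sealed class NodeJSServicePool : IDisposable
{
    private readonly ServiceProvider[] _providers;
    private readonly INodeJSService[] _services;
    private int _next = -1;

    public NodeJSServicePool(int size)
    {
        _providers = new ServiceProvider[size];
        _services = new INodeJSService[size];

        for (int i = 0; i < size; i++)
        {
            var services = new ServiceCollection();
            services.AddNodeJS(); // Registers INodeJSService and its dependencies
            _providers[i] = services.BuildServiceProvider();
            _services[i] = _providers[i].GetRequiredService<INodeJSService>();
        }
    }

    // Round-robin selection; each returned instance can be invoked concurrently.
    public INodeJSService Next()
    {
        int index = (int)((uint)Interlocked.Increment(ref _next) % (uint)_services.Length);
        return _services[index];
    }

    public void Dispose()
    {
        foreach (ServiceProvider provider in _providers)
        {
            provider.Dispose(); // Disposes the INodeJSService and kills its Node.js process
        }
    }
}

Request handlers would call pool.Next() and invoke as usual. Each instance owns its own Node.js process, so this trades memory for isolation.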
The static API is thread safe except for
Basically, calling
Thanks, I'll investigate Node.js workers and let you know what I find out. I was going off research done by other projects, which seem to prefer separate processes (for isolation and redundancy), e.g. https://medium.com/airbnb-engineering/operationalizing-node-js-for-server-side-rendering-c5ba718acfc9. ReactJS.NET also uses a similar model, pooling JS engines (non-Node.js, unfortunately): https://github.com/reactjs/React.NET/blob/3a8573a439be231cdd4cd702a6f4f425791b6e25/src/React.Core/JavaScriptEngineFactory.cs#L84
I've decided to add out-of-the-box parallelization in #52 (work in progress). When it's done you'll just need to set a concurrency option. I think you may be right about separate processes being preferable - it seems there can be significant overhead passing data between threads when using workers. We'll see once both have been implemented. Initial benchmarks for multi-processing are excellent:

[Benchmark]
public async Task<DummyResult[]> INodeJSService_InvokeCpuBound_Multithreaded()
{
    const string dummyModule = @"module.exports = (callback, resultString) => {
        // Block the CPU for ~1 second
        var end = new Date().getTime() + 1000;
        while(new Date().getTime() < end){ /* do nothing */ }
        callback(null, {result: resultString});
    }";
    const string dummyIdentifier = "dummyIdentifier";
    const string dummyResultString = "success";

    // Act
    const int numTasks = 5;
    var results = new Task<DummyResult>[numTasks];
    for (int i = 0; i < numTasks; i++)
    {
        results[i] = _nodeJSService.InvokeFromStringAsync<DummyResult>(dummyModule, dummyIdentifier, args: new[] { dummyResultString });
    }
    return await Task.WhenAll(results);
}
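For reference, once #52 ships, opting in would presumably look something like the sketch below. The Concurrency and ConcurrencyDegree option names are assumptions about the work-in-progress API, not its confirmed final shape:

using System;
using Jering.Javascript.NodeJS;
using Microsoft.Extensions.DependencyInjection;

ServiceCollection services = new ServiceCollection();
services.AddNodeJS();

// Assumed option names - the final API in #52 may differ.
services.Configure<OutOfProcessNodeJSServiceOptions>(options =>
{
    options.Concurrency = Concurrency.MultiProcess;        // One Node.js process per degree of concurrency
    options.ConcurrencyDegree = Environment.ProcessorCount;
});

ServiceProvider serviceProvider = services.BuildServiceProvider();
INodeJSService nodeJSService = serviceProvider.GetRequiredService<INodeJSService>();
// Invocations would then be load balanced across the pool of Node.js processes.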
@JeremyTCD I tried concurrency mode and TryInvokeFromCacheAsync always returns false - I called that method about 1000 times and it returned false every time :(
Ok, I see the problem )
@DaniilSokolyuk Oh yeah, good catch - at present you end up caching in the "next" Node.js process in the pool. I like the factory suggestion, it would mean users don't have to check for a cache miss and retry themselves. If the module isn't cached in the current Node.js process, the factory is invoked and the result is cached:
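A rough sketch of how that might look from the caller's side. The Func&lt;string&gt; moduleFactory overload is the proposal under discussion rather than a shipped API, and SsrRenderer, LoadModuleSource and the bundle file name are placeholders:

using System.Threading.Tasks;
using Jering.Javascript.NodeJS;

public class SsrRenderer
{
    private const string CacheIdentifier = "ssrModule";
    private readonly INodeJSService _nodeJSService;

    public SsrRenderer(INodeJSService nodeJSService)
    {
        _nodeJSService = nodeJSService;
    }

    // Today: callers must handle the cache miss themselves, which breaks down
    // when consecutive calls land on different Node.js processes in the pool.
    public async Task<string> RenderManuallyAsync(string props)
    {
        (bool success, string result) = await _nodeJSService.TryInvokeFromCacheAsync<string>(CacheIdentifier, args: new[] { props });
        if (success)
        {
            return result;
        }

        string moduleSource = LoadModuleSource();
        return await _nodeJSService.InvokeFromStringAsync<string>(moduleSource, CacheIdentifier, args: new[] { props });
    }

    // Proposed: pass a factory; the module source is only loaded when the
    // current Node.js process doesn't have the module cached.
    public Task<string> RenderWithFactoryAsync(string props)
    {
        return _nodeJSService.InvokeFromStringAsync<string>(LoadModuleSource, CacheIdentifier, args: new[] { props });
    }

    private static string LoadModuleSource()
    {
        return System.IO.File.ReadAllText("Server.bundle.js"); // e.g. a webpack SSR bundle
    }
}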
At the same time I'm also thinking of adding overloads so args can be a
Thoughts on these? Edit: removed the redundant overloads I suggested.
@DaniilSokolyuk Fix underway here: #57. Let me know if you have any input on the expanded API. |
Would there be problems with having multiple instances? Example use: a high-traffic site that needs to do lots of Vue/React SSR - a single Node.js process could become CPU-limited.
The readme says to avoid it, but are there any other consequences?
For the static API, does that mean we need to implement our own concurrency control on top of it?
There's an older issue in the aspnet repo where someone put up some sample pooling code, for reference:
aspnet/JavaScriptServices#1442 (comment)
I think it would be really valuable to the .NET community to have an out-of-the-box pooled Node.js service.