benchmark: support for multiple http benchmarkers #8140
File: benchmark/README.md

@@ -14,9 +14,21 @@ This folder contains benchmarks to measure the performance of the Node.js APIs.
 ## Prerequisites

-Most of the http benchmarks require [`wrk`][wrk] to be installed. It may be
-available through your preferred package manager. If not, `wrk` can be built
-[from source][wrk] via `make`.
+Most of the HTTP benchmarks require a benchmarker to be installed; this can be
+either [`wrk`][wrk] or [`autocannon`][autocannon].
+
+`Autocannon` is a Node script that can be installed using
+`npm install -g autocannon`. It will use the Node executable that is in the
+path; hence, if you want to compare two HTTP benchmark runs, make sure that
+the Node version in the path is not altered.
+
+`wrk` may be available through your preferred package manager. If not, you can
+easily build it [from source][wrk] via `make`.
+
+By default, the first benchmarker that is found will be used to run the HTTP
+benchmarks. You can override this by setting the `NODE_HTTP_BENCHMARKER`
+environment variable to the desired benchmarker name. When creating an HTTP
+benchmark, you can also specify which benchmarker should be used.

> **Review comment:** I can see its usefulness, but do we really need it? I would like to avoid adding unnecessary environment flags. This could also be accomplished by using […]. I would appreciate other opinions.

> **Review comment:** I agree on the […]. I don't expect this to be changed that much anyway, so the env variable is probably ok as well.

 To analyze the results `R` should be installed. Check your package manager or
 download it from https://www.r-project.org/.
@@ -287,5 +299,59 @@ function main(conf) {
 }
 ```

+## Creating an HTTP benchmark
+
+The `bench` object returned by `createBenchmark` implements an
+`http(options, callback)` method. It can be used to run an external tool
+that benchmarks HTTP servers. The following example benchmarks a simple
+HTTP server with all installed benchmarking tools.
+
+```js
+'use strict';
+
+var common = require('../common.js');

> **Review comment:** should be […]

+
+var bench = common.createBenchmark(main, {
+  kb: [64, 128, 256, 1024],
+  connections: [100, 500],
+  benchmarker: common.installed_http_benchmarkers
+});
+
+function main(conf) {
+  const http = require('http');
+  const len = conf.kb * 1024;
+  const chunk = Buffer.alloc(len, 'x');
+  var server = http.createServer(function(req, res) {
+    res.end(chunk);
+  });
+
+  server.listen(common.PORT, function() {
+    bench.http({
+      connections: conf.connections,
+      benchmarker: conf.benchmarker
+    }, function() {
+      server.close();
+    });
+  });
+}
+```
+
+Supported options keys are:
+* `port` - defaults to `common.PORT`
+* `path` - defaults to `/`
+* `connections` - number of concurrent connections to use, defaults to 100
+* `duration` - duration of the benchmark in seconds, defaults to 10
+* `benchmarker` - benchmarker to use, defaults to
+  `common.default_http_benchmarker`
+
+The `common.js` module defines three handy constants:
+* `supported_http_benchmarkers` - array with the names of all supported
+  benchmarkers
+* `installed_http_benchmarkers` - array with the names of all supported
+  benchmarkers that are currently installed on this machine
+* `default_http_benchmarker` - the first element of
+  `installed_http_benchmarkers`, or the value of
+  `process.env.NODE_HTTP_BENCHMARKER` if it is set
+
+[autocannon]: https://github.com/mcollina/autocannon
 [wrk]: https://github.com/wg/wrk
 [t-test]: https://en.wikipedia.org/wiki/Student%27s_t-test#Equal_or_unequal_sample_sizes.2C_unequal_variances
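The default-selection rule described above (first installed benchmarker, overridable via `NODE_HTTP_BENCHMARKER`) can be sketched as a small standalone function. `pickBenchmarker` is a hypothetical helper for illustration only, not part of the PR:

```javascript
// Hypothetical sketch of the selection rule: the NODE_HTTP_BENCHMARKER
// environment variable wins; otherwise the first installed tool is used.
function pickBenchmarker(env, installed) {
  if (env.NODE_HTTP_BENCHMARKER) {
    return env.NODE_HTTP_BENCHMARKER;
  }
  return installed[0]; // undefined when nothing is installed
}

console.log(pickBenchmarker({}, ['autocannon', 'wrk']));            // autocannon
console.log(pickBenchmarker({ NODE_HTTP_BENCHMARKER: 'wrk' }, [])); // wrk
```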
File: benchmark/common.js

@@ -1,6 +1,7 @@
 'use strict';

 const child_process = require('child_process');
+const http_benchmarkers = require('./http-benchmarkers.js');

> **Review comment:** perhaps it should be called […]

 // The port used by servers and wrk
 exports.PORT = process.env.PORT || 12346;

> **Review comment:** Perhaps this should be moved to […]
@@ -88,51 +89,28 @@ Benchmark.prototype._queue = function(options) {
   return queue;
 };

-function hasWrk() {
-  const result = child_process.spawnSync('wrk', ['-h']);
-  if (result.error && result.error.code === 'ENOENT') {
-    console.error('Couldn\'t locate `wrk` which is needed for running ' +
-                  'benchmarks. Check benchmark/README.md for further instructions.');
-    process.exit(1);
-  }
-}
+// Benchmark an http server.
+exports.default_http_benchmarker =
+  http_benchmarkers.default_http_benchmarker;
+exports.supported_http_benchmarkers =
+  http_benchmarkers.supported_http_benchmarkers;
+exports.installed_http_benchmarkers =
+  http_benchmarkers.installed_http_benchmarkers;

-// benchmark an http server.
-const WRK_REGEXP = /Requests\/sec:[ \t]+([0-9\.]+)/;
-Benchmark.prototype.http = function(urlPath, args, cb) {
-  hasWrk();
+Benchmark.prototype.http = function(options, cb) {
   const self = this;
+  if (!options.port) {
+    options.port = exports.PORT;
+  }

-  const urlFull = 'http://127.0.0.1:' + exports.PORT + urlPath;
-  args = args.concat(urlFull);
-
-  const childStart = process.hrtime();
-  const child = child_process.spawn('wrk', args);
-  child.stderr.pipe(process.stderr);
-
-  // Collect stdout
-  let stdout = '';
-  child.stdout.on('data', (chunk) => stdout += chunk.toString());
-
-  child.once('close', function(code) {
-    const elapsed = process.hrtime(childStart);
-    if (cb) cb(code);
-
-    if (code) {
-      console.error('wrk failed with ' + code);
-      process.exit(code);
-    }
-
-    // Extract requests per second and check for odd results
-    const match = stdout.match(WRK_REGEXP);
-    if (!match || match.length <= 1) {
-      console.error('wrk produced strange output:');
-      console.error(stdout);
-      process.exit(1);
-    }
-
-    // Report rate
-    self.report(+match[1], elapsed);
+  http_benchmarkers.run(options, function(benchmarker_name, result, elapsed) {
+    if (!self.config.benchmarker) {
+      self.config.benchmarker = benchmarker_name;
+    }
+
+    self.report(result, elapsed);
+    if (cb) {
+      cb(0);
+    }
   });
 };

> **Review comment:** This is a little odd. I think you should make the […] — that way you can check the error and call the callback appropriately. Actually I don't care so much if the behaviour is the same. It is just that we should avoid calling […].
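The review comment above asks for the error to be surfaced through the callback rather than calling `process.exit` from library code. A minimal sketch of that error-first pattern, with hypothetical names not taken from the PR:

```javascript
// Hypothetical error-first close handler: a non-zero exit code becomes an
// Error passed to the callback instead of a process.exit call.
function onBenchmarkerClose(code, stdout, cb) {
  if (code !== 0) {
    cb(new Error(`benchmarker failed with ${code}`));
    return;
  }
  cb(null, stdout);
}

onBenchmarkerClose(0, 'Requests/sec: 100', (err, out) => {
  console.log(err, out); // null 'Requests/sec: 100'
});
onBenchmarkerClose(1, '', (err) => {
  console.log(err.message); // benchmarker failed with 1
});
```

This lets the caller decide whether a failed run is fatal, which is usually preferable for reusable benchmark helpers.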
File: benchmark/http-benchmarkers.js (new file)

@@ -0,0 +1,168 @@
+'use strict';
+
+const child_process = require('child_process');
+
+function AutocannonBenchmarker() {
+  this.name = 'autocannon';
+
+  const autocannon_exe = process.platform === 'win32'
+                         ? 'autocannon.cmd'
+                         : 'autocannon';
+
+  this.present = function() {
+    var result = child_process.spawnSync(autocannon_exe, ['-h']);
+    return !(result.error && result.error.code === 'ENOENT');
+  };

> **Review comment:** you should move these to the prototype, that is how all the other classes work.

+
+  this.create = function(port, path, duration, connections) {
+    const args = ['-d', duration, '-c', connections, '-j', '-n',
+                  `http://127.0.0.1:${port}${path}` ];
+    var child = child_process.spawn(autocannon_exe, args);
+    child.stdout.setEncoding('utf8');
+    return child;
+  };
+
+  this.processResults = function(output) {
+    let result;
+    try {
+      result = JSON.parse(output);
+    } catch (err) {
+      // Do nothing, let the check below handle it
+    }
+    if (!result || !result.requests || !result.requests.average) {
+      return undefined;
+    } else {
+      return result.requests.average;
+    }
+  };
+}
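Extracted as a standalone function, the `processResults` logic above parses autocannon's `-j` JSON output and returns the average requests per second. The output shape is taken from the PR; the sample strings below are made up:

```javascript
// Parse autocannon JSON output; return requests.average or undefined.
function parseAutocannonOutput(output) {
  let result;
  try {
    result = JSON.parse(output);
  } catch (err) {
    // Malformed JSON falls through to the undefined check below.
  }
  if (!result || !result.requests || !result.requests.average) {
    return undefined;
  }
  return result.requests.average;
}

console.log(parseAutocannonOutput('{"requests":{"average":5000.5}}')); // 5000.5
console.log(parseAutocannonOutput('not json'));                        // undefined
```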
+
+function WrkBenchmarker() {
+  this.name = 'wrk';
+
+  this.present = function() {
+    var result = child_process.spawnSync('wrk', ['-h']);
+    return !(result.error && result.error.code === 'ENOENT');
+  };
+
+  this.create = function(port, path, duration, connections) {
+    const args = ['-d', duration, '-c', connections, '-t', 8,
+                  `http://127.0.0.1:${port}${path}` ];
+    var child = child_process.spawn('wrk', args);
+    child.stdout.setEncoding('utf8');
+    child.stderr.pipe(process.stderr);
+    return child;
+  };
+
+  const regexp = /Requests\/sec:[ \t]+([0-9\.]+)/;
+  this.processResults = function(output) {
+    const match = output.match(regexp);
+    const result = match && +match[1];
+    if (!result) {
+      return undefined;
+    } else {
+      return result;
+    }
+  };
+}
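The wrk parser above can likewise be exercised on its own. The regular expression is the one from the PR; the sample output below is fabricated but follows the format of wrk's `Requests/sec:` summary line:

```javascript
// Extract the requests-per-second figure from wrk's text output.
const WRK_REGEXP = /Requests\/sec:[ \t]+([0-9.]+)/;

function parseWrkOutput(output) {
  const match = output.match(WRK_REGEXP);
  const result = match && +match[1];
  return result || undefined;
}

const sample = 'Running 10s test @ http://127.0.0.1:12346/\n' +
               'Requests/sec:  12345.67\n' +
               'Transfer/sec:  10.5MB\n';
console.log(parseWrkOutput(sample));      // 12345.67
console.log(parseWrkOutput('no match')); // undefined
```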
+
+const http_benchmarkers = [ new AutocannonBenchmarker(),
+                            new WrkBenchmarker() ];

> **Review comment:** I would like to see […]

+
+var default_http_benchmarker;
+var supported_http_benchmarkers = [];
+var installed_http_benchmarkers = [];
+var benchmarkers = {};

> **Review comment:** Use […]

+
+http_benchmarkers.forEach((benchmarker) => {
+  const name = benchmarker.name;
+  const present = benchmarker.present();
+  benchmarkers[name] = {
+    benchmarker: benchmarker,
+    present: present,
+    default: false
+  };
+
+  supported_http_benchmarkers.push(name);
+  if (present) {
+    if (!default_http_benchmarker) {
+      default_http_benchmarker = name;
+      benchmarkers[name].default = true;
+    }
+    installed_http_benchmarkers.push(name);
+  }
+});

> **Review comment:** I think this is easier to read if moved out of the […]:
>
> ```js
> if (process.env.NODE_HTTP_BENCHMARKER) {
>   default_http_benchmarker = installed_http_benchmarkers[
>     process.env.NODE_HTTP_BENCHMARKER
>   ];
> } else {
>   default_http_benchmarker = installed_http_benchmarkers[
>     Object.keys(installed_http_benchmarkers)[0]
>   ];
> }
> if (default_http_benchmarker) {
>   default_http_benchmarker.default = true;
> }
> ```
+
+function getBenchmarker(name) {
+  const benchmarker = benchmarkers[name];
+  if (!benchmarker) {
+    throw new Error(`benchmarker '${name}' is not supported`);
+  }
+  if (!benchmarker.present) {
+    throw new Error(`benchmarker '${name}' is not installed`);
+  }
+  return benchmarker.benchmarker;
+}
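The lookup above distinguishes an unknown tool from a known-but-uninstalled one. A self-contained sketch of that two-step validation, using an invented registry (the real one is built by the `forEach` loop above):

```javascript
// Hypothetical registry: wrk is installed, autocannon is known but absent.
const registry = {
  wrk: { present: true, benchmarker: { name: 'wrk' } },
  autocannon: { present: false, benchmarker: { name: 'autocannon' } }
};

function getBenchmarker(name) {
  const entry = registry[name];
  if (!entry) {
    throw new Error(`benchmarker '${name}' is not supported`);
  }
  if (!entry.present) {
    throw new Error(`benchmarker '${name}' is not installed`);
  }
  return entry.benchmarker;
}

console.log(getBenchmarker('wrk').name); // wrk
```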
+
+if (process.env.NODE_HTTP_BENCHMARKER) {
+  const requested = process.env.NODE_HTTP_BENCHMARKER;
+  try {
+    default_http_benchmarker = requested;
+    getBenchmarker(requested);
+  } catch (err) {
+    console.error('Error when overriding default http benchmarker: ' +
+                  err.message);
+    process.exit(1);
+  }
+}
+
+exports.run = function(options, callback) {
+  options = Object.assign({
+    port: 1234,
+    path: '/',
+    connections: 100,
+    duration: 10,
+    benchmarker: default_http_benchmarker
+  }, options);
+  if (!options.benchmarker) {
+    console.error('Could not locate any of the required http benchmarkers. ' +
+                  'Check benchmark/README.md for further instructions.');
+    process.exit(1);
+  }
+  const benchmarker = getBenchmarker(options.benchmarker);
+
+  const benchmarker_start = process.hrtime();
+
+  var child = benchmarker.create(options.port, options.path, options.duration,
+                                 options.connections);
+
+  let stdout = '';
+  child.stdout.on('data', (chunk) => stdout += chunk.toString());
+
+  child.once('close', function(code) {
+    const elapsed = process.hrtime(benchmarker_start);
+    if (code) {
+      if (stdout === '') {
+        console.error(`${options.benchmarker} failed with ${code}`);
+      } else {
+        console.error(`${options.benchmarker} failed with ${code}. Output:`);
+        console.error(stdout);
+      }
+      process.exit(1);
+    }
+
+    var result = benchmarker.processResults(stdout);
+    if (!result) {
+      console.error(`${options.benchmarker} produced strange output`);
+      console.error(stdout);
+      process.exit(1);
+    }
+
+    callback(options.benchmarker, result, elapsed);
+  });
+};
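`exports.run` fills in defaults with `Object.assign`: the defaults object is the assign target, so caller-supplied keys win because the caller's object is applied last. A small isolated demonstration, with default values matching those in the PR:

```javascript
// Defaults object is the assign target; explicit options override it.
function applyDefaults(options) {
  return Object.assign({
    port: 1234,
    path: '/',
    connections: 100,
    duration: 10
  }, options);
}

console.log(applyDefaults({ connections: 500 }));
// → { port: 1234, path: '/', connections: 500, duration: 10 }
```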
+
+exports.default_http_benchmarker = default_http_benchmarker;
+exports.supported_http_benchmarkers = supported_http_benchmarkers;
+exports.installed_http_benchmarkers = installed_http_benchmarkers;

> **Review comment:** You should clarify and simply say that `wrk` will be used if it exists.