# benchmark: support for multiple http benchmarkers #8140

Closed
Changes from 3 commits
9 changes: 1 addition & 8 deletions Makefile
@@ -620,13 +620,6 @@ ifeq ($(XZ), 0)
ssh $(STAGINGSERVER) "touch nodejs/$(DISTTYPEDIR)/$(FULLVERSION)/node-$(FULLVERSION)-$(OSTYPE)-$(ARCH).tar.xz.done"
endif

haswrk=$(shell which wrk > /dev/null 2>&1; echo $$?)
wrk:
ifneq ($(haswrk), 0)
@echo "please install wrk before proceeding. More information can be found in benchmark/README.md." >&2
@exit 1
endif

bench-net: all
@$(NODE) benchmark/run.js net

@@ -636,7 +629,7 @@ bench-crypto: all
bench-tls: all
@$(NODE) benchmark/run.js tls

bench-http: wrk all
bench-http: all
@$(NODE) benchmark/run.js http

bench-fs: all
72 changes: 69 additions & 3 deletions benchmark/README.md
@@ -14,9 +14,21 @@ This folder contains benchmarks to measure the performance of the Node.js APIs.

## Prerequisites

Most of the http benchmarks require [`wrk`][wrk] to be installed. It may be
available through your preferred package manager. If not, `wrk` can be built
[from source][wrk] via `make`.
Most of the HTTP benchmarks require a benchmarker to be installed; this can be
either [`wrk`][wrk] or [`autocannon`][autocannon].

`autocannon` is a Node.js script that can be installed using
`npm install -g autocannon`. It will use the Node.js executable that is in the
path, so if you want to compare two HTTP benchmark runs, make sure that the
Node.js version in the path is not altered.

`wrk` may be available through your preferred package manager. If not, you can
easily build it [from source][wrk] via `make`.

By default, the first benchmark tool that is found will be used to run the HTTP benchmarks. You
> **Review comment (Member):** You should clarify and simply say that `wrk` will be used if it exists.

can override this by setting the `NODE_HTTP_BENCHMARKER` environment variable to
> **Review comment (Member):** I can see its usefulness, but do we really need it? I would like to avoid adding unnecessary environment flags. This could also be accomplished by using `--set benchmarker=autocannon`, which wouldn't require any extra code. I would appreciate other opinions.

> **Review comment (Member):** I agree on the `--set` behavior, as it is already there. I don't expect this to be changed that much anyway, so the env variable is probably OK as well.

the desired benchmarker name. When creating an HTTP benchmark you can also
specify which benchmarker should be used.

To analyze the results, `R` should be installed. Check your package manager or
download it from https://www.r-project.org/.
@@ -287,5 +299,59 @@ function main(conf) {
}
```

## Creating an HTTP benchmark

The `bench` object returned by `createBenchmark` implements an
`http(options, callback)` method, which can be used to run an external tool
that benchmarks an HTTP server. The following example benchmarks a simple
HTTP server with every installed benchmarking tool.

```js
'use strict';

var common = require('../common.js');
> **Review comment (Member):** should be `const`.

var bench = common.createBenchmark(main, {
kb: [64, 128, 256, 1024],
connections: [100, 500],
benchmarker: common.installed_http_benchmarkers
});

function main(conf) {
const http = require('http');
const len = conf.kb * 1024;
const chunk = Buffer.alloc(len, 'x');
var server = http.createServer(function(req, res) {
res.end(chunk);
});

server.listen(common.PORT, function() {
bench.http({
connections: conf.connections,
benchmarker: conf.benchmarker
}, function() {
server.close();
});
});
}
```

The supported option keys are:
* `port` - defaults to `common.PORT`
* `path` - defaults to `/`
* `connections` - number of concurrent connections to use, defaults to 100
* `duration` - duration of the benchmark in seconds, defaults to 10
* `benchmarker` - benchmarker to use, defaults to
`common.default_http_benchmarker`
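
The defaulting behaviour for these keys can be pictured as a plain
`Object.assign` merge. This is a standalone sketch of the rule, not the PR's
actual code (the real defaults live in `benchmark/http-benchmarkers.js`):

```javascript
'use strict';

// Standalone sketch: unset option keys fall back to the defaults
// listed above (port shown here as the value of common.PORT).
function applyDefaults(options) {
  return Object.assign({
    port: 12346,       // common.PORT
    path: '/',
    connections: 100,
    duration: 10
  }, options);
}

const merged = applyDefaults({ connections: 500 });
console.log(merged.connections); // 500 (caller-supplied value wins)
console.log(merged.duration);    // 10  (default)
```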

The `common.js` module defines 3 handy constants:
* `supported_http_benchmarkers` - array with names of all supported
benchmarkers
* `installed_http_benchmarkers` - array with names of all supported
benchmarkers that are currently installed on this machine
* `default_http_benchmarker` - first element from `installed_http_benchmarkers`
or value of `process.env.NODE_HTTP_BENCHMARKER` if it is set
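
The selection rule for `default_http_benchmarker` can be summarised in a few
lines. This is an illustrative sketch of the rule just described, not the code
in `benchmark/http-benchmarkers.js`:

```javascript
'use strict';

// Illustrative sketch: the environment variable wins; otherwise the
// first installed benchmarker becomes the default.
function pickDefaultBenchmarker(installed, env) {
  return env.NODE_HTTP_BENCHMARKER || installed[0];
}

console.log(pickDefaultBenchmarker(['autocannon', 'wrk'], {}));
// -> 'autocannon'
console.log(pickDefaultBenchmarker(['wrk'],
                                   { NODE_HTTP_BENCHMARKER: 'autocannon' }));
// -> 'autocannon'
```

Note that if `NODE_HTTP_BENCHMARKER` names a benchmarker that is not
installed, the real implementation reports an error instead of silently
falling back.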

[autocannon]: https://github.com/mcollina/autocannon
[wrk]: https://github.com/wg/wrk
[t-test]: https://en.wikipedia.org/wiki/Student%27s_t-test#Equal_or_unequal_sample_sizes.2C_unequal_variances
58 changes: 18 additions & 40 deletions benchmark/common.js
@@ -1,6 +1,7 @@
'use strict';

const child_process = require('child_process');
const http_benchmarkers = require('./http-benchmarkers.js');
> **Review comment (Member):** perhaps it should be called `_http-benchmarkers.js`, that is how the other utility files are named.

// The port used by servers and wrk
exports.PORT = process.env.PORT || 12346;
> **Review comment (@AndreasMadsen, Member, Aug 23, 2016):** Perhaps this should be moved to `_http-benchmarkers.js`.

@@ -88,51 +89,28 @@ Benchmark.prototype._queue = function(options) {
return queue;
};

function hasWrk() {
const result = child_process.spawnSync('wrk', ['-h']);
if (result.error && result.error.code === 'ENOENT') {
console.error('Couldn\'t locate `wrk` which is needed for running ' +
'benchmarks. Check benchmark/README.md for further instructions.');
process.exit(1);
}
}
// Benchmark an http server.
exports.default_http_benchmarker =
http_benchmarkers.default_http_benchmarker;
exports.supported_http_benchmarkers =
http_benchmarkers.supported_http_benchmarkers;
exports.installed_http_benchmarkers =
http_benchmarkers.installed_http_benchmarkers;

// benchmark an http server.
const WRK_REGEXP = /Requests\/sec:[ \t]+([0-9\.]+)/;
Benchmark.prototype.http = function(urlPath, args, cb) {
hasWrk();
Benchmark.prototype.http = function(options, cb) {
const self = this;
if (!options.port) {
options.port = exports.PORT;
}

const urlFull = 'http://127.0.0.1:' + exports.PORT + urlPath;
args = args.concat(urlFull);

const childStart = process.hrtime();
const child = child_process.spawn('wrk', args);
child.stderr.pipe(process.stderr);

// Collect stdout
let stdout = '';
child.stdout.on('data', (chunk) => stdout += chunk.toString());

child.once('close', function(code) {
const elapsed = process.hrtime(childStart);
if (cb) cb(code);

if (code) {
console.error('wrk failed with ' + code);
process.exit(code);
http_benchmarkers.run(options, function(benchmarker_name, result, elapsed) {
if (!self.config.benchmarker) {
self.config.benchmarker = benchmarker_name;
}

// Extract requests pr second and check for odd results
const match = stdout.match(WRK_REGEXP);
if (!match || match.length <= 1) {
console.error('wrk produced strange output:');
console.error(stdout);
process.exit(1);
self.report(result, elapsed);
if (cb) {
cb(0);
> **Review comment (@AndreasMadsen, Member, Aug 23, 2016):** This is a little odd. I think you should make the `.run` signature
> `http_benchmarkers.run(options, function(error, results) { ... })`
> that way you can check the error and call the callback appropriately. Actually I don't care so much if the behaviour is the same. It is just that we should avoid calling `process.exit()` from more than one file, as that makes the program difficult to reason about. This way the `process.exit()` logic can be in common.js.
}

// Report rate
self.report(+match[1], elapsed);
});
};

168 changes: 168 additions & 0 deletions benchmark/http-benchmarkers.js
@@ -0,0 +1,168 @@
'use strict';

const child_process = require('child_process');

function AutocannonBenchmarker() {
this.name = 'autocannon';

const autocannon_exe = process.platform === 'win32'
? 'autocannon.cmd'
: 'autocannon';
this.present = function() {
> **Review comment (Member):** you should move these to the prototype, that is how all the other classes work.

var result = child_process.spawnSync(autocannon_exe, ['-h']);
return !(result.error && result.error.code === 'ENOENT');
};

this.create = function(port, path, duration, connections) {
const args = ['-d', duration, '-c', connections, '-j', '-n',
`http://127.0.0.1:${port}${path}` ];
var child = child_process.spawn(autocannon_exe, args);
child.stdout.setEncoding('utf8');
return child;
};

this.processResults = function(output) {
let result;
try {
result = JSON.parse(output);
} catch (err) {
// Do nothing, let next line handle this
}
if (!result || !result.requests || !result.requests.average) {
return undefined;
} else {
return result.requests.average;
}
};
}

function WrkBenchmarker() {
this.name = 'wrk';

this.present = function() {
var result = child_process.spawnSync('wrk', ['-h']);
return !(result.error && result.error.code === 'ENOENT');
};

this.create = function(port, path, duration, connections) {
const args = ['-d', duration, '-c', connections, '-t', 8,
`http://127.0.0.1:${port}${path}` ];
var child = child_process.spawn('wrk', args);
child.stdout.setEncoding('utf8');
child.stderr.pipe(process.stderr);
return child;
};

const regexp = /Requests\/sec:[ \t]+([0-9\.]+)/;
this.processResults = function(output) {
const match = output.match(regexp);
const result = match && +match[1];
if (!result) {
return undefined;
} else {
return result;
}
};
}

const http_benchmarkers = [ new AutocannonBenchmarker(),
> **Review comment (@AndreasMadsen, Member, Aug 23, 2016):** I would like to see `wrk` be the default benchmarker, since that doesn't depend on node in any way.
new WrkBenchmarker() ];

var default_http_benchmarker;
var supported_http_benchmarkers = [];
> **Review comment (Member):** Use `const`.
var installed_http_benchmarkers = [];
var benchmarkers = {};

http_benchmarkers.forEach((benchmarker) => {
const name = benchmarker.name;
const present = benchmarker.present();
benchmarkers[name] = {
benchmarker: benchmarker,
present: present,
default: false
};

supported_http_benchmarkers.push(name);
if (present) {
if (!default_http_benchmarker) {
> **Review comment (@AndreasMadsen, Member, Aug 23, 2016):** I think this is easier to read if moved out of the `forEach`, such that it is:
>
> ```js
> if (process.env.NODE_HTTP_BENCHMARKER) {
>   default_http_benchmarker = installed_http_benchmarkers[
>     process.env.NODE_HTTP_BENCHMARKER
>   ];
> } else {
>   default_http_benchmarker = installed_http_benchmarkers[
>     Object.keys(installed_http_benchmarkers)[0]
>   ];
> }
> if (default_http_benchmarker) {
>   default_http_benchmarker.default = true;
> }
> ```

default_http_benchmarker = name;
benchmarkers[name].default = true;
}
installed_http_benchmarkers.push(name);
}
});

function getBenchmarker(name) {
const benchmarker = benchmarkers[name];
if (!benchmarker) {
throw new Error(`benchmarker '${name}' is not supported`);
}
if (!benchmarker.present) {
throw new Error(`benchmarker '${name}' is not installed`);
}
return benchmarker.benchmarker;
}

if (process.env.NODE_HTTP_BENCHMARKER) {
const requested = process.env.NODE_HTTP_BENCHMARKER;
try {
default_http_benchmarker = requested;
getBenchmarker(requested);
} catch (err) {
console.error('Error when overriding default http benchmarker: ' +
err.message);
process.exit(1);
}
}

exports.run = function(options, callback) {
options = Object.assign({
port: 1234,
path: '/',
connections: 100,
duration: 10,
benchmarker: default_http_benchmarker
}, options);
if (!options.benchmarker) {
console.error('Could not locate any of the required http benchmarkers. ' +
'Check benchmark/README.md for further instructions.');
process.exit(1);
}
const benchmarker = getBenchmarker(options.benchmarker);

const benchmarker_start = process.hrtime();

var child = benchmarker.create(options.port, options.path, options.duration,
options.connections);

let stdout = '';
child.stdout.on('data', (chunk) => stdout += chunk.toString());

child.once('close', function(code) {
const elapsed = process.hrtime(benchmarker_start);
if (code) {
if (stdout === '') {
console.error(`${options.benchmarker} failed with ${code}`);
} else {
console.error(`${options.benchmarker} failed with ${code}. Output:`);
console.error(stdout);
}
process.exit(1);
}

var result = benchmarker.processResults(stdout);
if (!result) {
console.error(`${options.benchmarker} produced strange output`);
console.error(stdout);
process.exit(1);
}

callback(options.benchmarker, result, elapsed);
});

};

exports.default_http_benchmarker = default_http_benchmarker;
exports.supported_http_benchmarkers = supported_http_benchmarkers;
exports.installed_http_benchmarkers = installed_http_benchmarkers;

6 changes: 3 additions & 3 deletions benchmark/http/chunked.js
@@ -20,8 +20,6 @@ function main(conf) {
const http = require('http');
var chunk = Buffer.alloc(conf.size, '8');

var args = ['-d', '10s', '-t', 8, '-c', conf.c];

var server = http.createServer(function(req, res) {
function send(left) {
if (left === 0) return res.end();
@@ -34,7 +32,9 @@ function main(conf) {
});

server.listen(common.PORT, function() {
bench.http('/', args, function() {
bench.http({
connections: conf.c
}, function() {
server.close();
});
});
6 changes: 4 additions & 2 deletions benchmark/http/cluster.js
@@ -27,9 +27,11 @@ function main(conf) {

setTimeout(function() {
var path = '/' + conf.type + '/' + conf.length;
var args = ['-d', '10s', '-t', 8, '-c', conf.c];

bench.http(path, args, function() {
bench.http({
path: path,
connections: conf.c
}, function() {
w1.destroy();
w2.destroy();
});