
This repository has been archived!

This IPFS-related repository has been archived, and all issues are therefore frozen. If you want to ask a question or open/continue a discussion related to this repo, please visit the official IPFS forums.

We archive repos for one or more of the following reasons:

  • Code or content is unmaintained, and therefore might be broken
  • Content is outdated, and therefore may mislead readers
  • Code or content evolved into something else and/or has lived on in a different place
  • The repository or project is not active in general

Please note that in order to keep the primary IPFS GitHub org tidy, most archived repos are moved into the ipfs-inactive org.

If you feel this repo should not be archived (or portions of it should be moved to a non-archived repo), please reach out and let us know. Archiving can always be reversed if needed.


ipfs-performance-profiling


Benchmarking tests for js-ipfs, using go-ipfs as a baseline.

Install

$ git clone https://github.com/ipfs/ipfs-performance-profiling.git
$ cd ipfs-performance-profiling
$ npm install

Run

Run all benchmarks on all environments:

$ npm run benchmarks

Run all benchmarks on the go and js-core environments:

$ npm run benchmarks -- --envs=go,js-core

Available environments are:

  • go
  • js-core
  • js-http

Run named benchmark on the js-http environment:

$ npm run benchmarks -- files-add-1MB-file --envs=js-http

JSON output

You can output a JSON report using the --json option:

$ npm run benchmarks -- files-add-1MB-file --json
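Because the runner writes benchmark results to stdout (see Logging below), you can, for example, redirect the JSON report straight to a file:

$ npm run benchmarks -- files-add-1MB-file --json > results.json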

Reports

You can run the benchmarks and produce an HTML report using:

$ npm run benchmarks:report

Report with profiling data

You can run the benchmarks and produce an HTML report containing links to profiling data using:

$ npm run benchmarks:report:profile -- [<suite>] --envs=[env1,env2]
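
For example, to profile just the files-add-1MB-file suite on the js-core environment:

$ npm run benchmarks:report:profile -- files-add-1MB-file --envs=js-core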

Creating a benchmark suite

A benchmark suite is simply a function that takes two arguments: an IPFS client object and a Node-style callback:

// A suite receives an IPFS client and a callback to invoke when it is done.
module.exports = function (ipfs, callback) {
  // Add a single one-byte file and signal completion via the callback
  ipfs.files.add([{
    path: 'a.txt',
    content: Buffer.from('a') // Buffer.from replaces the deprecated new Buffer()
  }], callback)
}
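
As a slightly larger, purely illustrative sketch (the file name and buffer size below are assumptions, not taken from the repo's actual files-add-1MB-file suite), a suite that adds a 1 MB buffer could look like this:

module.exports = function (ipfs, callback) {
  // Hypothetical sketch: add a single 1 MB file of zero bytes
  ipfs.files.add([{
    path: '1MB.bin',
    content: Buffer.alloc(1024 * 1024)
  }], callback)
}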

Add it in its own directory under src/suites. Also, don't forget to add an entry to src/suites/index.js so that the runner can find it (see the sketch below).
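
The exact contents of src/suites/index.js aren't reproduced here, but a registry of this kind is typically a plain name-to-module map; a hypothetical sketch:

// Hypothetical sketch of src/suites/index.js: map suite names to suite modules
module.exports = {
  'files-add-1MB-file': require('./files-add-1MB-file'),
  'my-new-suite': require('./my-new-suite')
}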

Logging

The suite runner uses the stdout channel for the benchmark results. If you want to log to the console, use console.error (which writes to stderr) instead of console.log.
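
For example, inside a suite:

module.exports = function (ipfs, callback) {
  // stdout is reserved for the runner's results, so log diagnostics to stderr
  console.error('adding file...')
  ipfs.files.add([{ path: 'a.txt', content: Buffer.from('a') }], callback)
}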

Contribute

Feel free to join in. All welcome. Open an issue!

This repository falls under the IPFS Code of Conduct.

License

MIT