IOHprofiler: IOHexperimenter

This is the benchmarking platform for Iterative Optimization Heuristics (IOHs).

IOHexperimenter provides:

  • A framework for straightforward benchmarking of any iterative optimization heuristic
  • A suite consisting of 23 pre-made Pseudo-Boolean benchmarking functions, with easily accessible methods for adding custom functions and suites
  • Logging methods to effortlessly store benchmarking data in a format compatible with IOHanalyzer, with future support for additional data logging options
  • (Soon to come:) A framework which significantly simplifies algorithm design

IOHexperimenter is built on:

  • C++ (tested on gcc 5.4.0)
  • The boost.filesystem library, used for writing log files.

IOHexperimenter is currently available for C++ and R (see the sections below).

Using IOHexperimenter

Running Experiments

IOHexperimenter is written in C++ and has been tested with the gcc 5.4.0 compiler. To enable logging of csv output files, the boost.filesystem library is required (to install boost, please visit https://www.boost.org).

Using IOHexperimenter in C++

If you are using the tool for the first time, please download or clone this branch and run make in the root directory of the project. After compilation,

  • object files will be generated in build/c/obj
  • three executable files will be generated in build/c/bin

Afterwards, you can work within the build/c folder and use the Makefile therein for your experiments. For more details on how to use the C++ version, please visit this page.
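
To make the workflow concrete, here is a minimal, self-contained sketch of the kind of experiment the generated executables run: a heuristic (plain random search here) is given a fixed evaluation budget on a problem (a OneMax-style bit-counting function here). This sketch does not use the IOHexperimenter API; all function and variable names are illustrative assumptions only.

#include <algorithm>
#include <cstddef>
#include <iostream>
#include <numeric>
#include <random>
#include <vector>

// Hypothetical OneMax-style objective: count the ones in a bit string.
// It stands in for any problem a benchmarking suite would provide.
double one_max(const std::vector<int> &x) {
  return static_cast<double>(std::accumulate(x.begin(), x.end(), 0));
}

int main() {
  const std::size_t dimension = 100;
  const std::size_t budget = 1000;  // number of allowed function evaluations

  std::mt19937 rng(42);
  std::bernoulli_distribution coin(0.5);

  double best = -1.0;
  for (std::size_t eval = 0; eval < budget; ++eval) {
    std::vector<int> x(dimension);
    for (auto &bit : x) bit = coin(rng) ? 1 : 0;  // sample a random bit string
    best = std::max(best, one_max(x));  // a real run would also log this evaluation
  }
  std::cout << "best value found: " << best << '\n';
  return 0;
}

In an actual experiment, the problem and its instances come from a suite, and every evaluation is recorded by a logger so that the results can be analyzed with IOHanalyzer.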

Using IOHexperimenter in R

To use the IOHexperimenter within R, please visit the R branch of this repository.

Creating test problems

It is easy to create your own benchmarking problems in IOHexperimenter, which supports any input type and any number of real-valued objectives. For a more detailed guideline on how to define a benchmarking problem within IOHexperimenter, please visit this page.
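
As a rough illustration of this idea (not the IOHexperimenter API; the class and method names below are hypothetical), a benchmark problem can be thought of as a class that maps a candidate solution of some input type to one or more real-valued objectives:

#include <iostream>
#include <vector>

// Hypothetical base class: a problem maps a candidate solution of some
// InputType to one or more real-valued objectives. These names are
// illustrative only; the actual base class and its registration methods
// are documented on the page linked above.
template <typename InputType>
class BenchmarkProblem {
public:
  virtual ~BenchmarkProblem() = default;
  // Returns one value per objective (single-objective problems return one).
  virtual std::vector<double> evaluate(const std::vector<InputType> &x) = 0;
};

// Example: LeadingOnes over bit strings, with a single real-valued objective.
class LeadingOnes : public BenchmarkProblem<int> {
public:
  std::vector<double> evaluate(const std::vector<int> &x) override {
    double count = 0.0;
    for (int bit : x) {
      if (bit != 1) break;  // stop at the first zero
      count += 1.0;
    }
    return {count};
  }
};

int main() {
  LeadingOnes problem;
  std::vector<int> x{1, 1, 1, 0, 1};
  std::cout << "LeadingOnes value: " << problem.evaluate(x)[0] << '\n';  // prints 3
  return 0;
}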

Configuring test suites

Suites are collections of benchmarking problems. Grouping problems into a suite makes it easier to maintain experiments: if you create a set of similar problems, it is recommended to collect them in a suite, which can be done effortlessly within IOHexperimenter. For detailed steps for creating and using suites, please visit this page.
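
Conceptually, a suite is simply an ordered collection of problems that an experiment loops over. The sketch below illustrates this with a stand-in Problem interface; it is not the IOHexperimenter suite API, and all names are hypothetical.

#include <iostream>
#include <memory>
#include <vector>

// Stand-in problem interface, used only to illustrate the idea of a suite.
struct Problem {
  virtual ~Problem() = default;
  virtual double evaluate(const std::vector<int> &x) = 0;
  virtual const char *name() const = 0;
};

struct OneMax : Problem {
  double evaluate(const std::vector<int> &x) override {
    double s = 0.0;
    for (int bit : x) s += bit;
    return s;
  }
  const char *name() const override { return "OneMax"; }
};

int main() {
  // A "suite" here is just a vector of problems; the real implementation also
  // tracks instances and dimensions for each problem.
  std::vector<std::unique_ptr<Problem>> suite;
  suite.push_back(std::make_unique<OneMax>());

  std::vector<int> x(10, 1);  // a fixed candidate solution, for illustration
  for (const auto &problem : suite) {
    std::cout << problem->name() << ": " << problem->evaluate(x) << '\n';
  }
  return 0;
}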

Contact

If you have any questions, comments, or suggestions, please don't hesitate to contact us at IOHprofiler@liacs.leidenuniv.nl!

Our team

  • Furong Ye, Leiden Institute of Advanced Computer Science
  • Diederick Vermetten, Leiden Institute of Advanced Computer Science
  • Hao Wang, Leiden Institute of Advanced Computer Science
  • Carola Doerr, CNRS and Sorbonne University
  • Thomas Bäck, Leiden Institute of Advanced Computer Science

When using IOHprofiler and parts thereof, please kindly cite this work as

Carola Doerr, Hao Wang, Furong Ye, Sander van Rijn, Thomas Bäck: IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics, arXiv e-prints:1810.05281, 2018.

@ARTICLE{IOHprofiler,
  author = {Carola Doerr and Hao Wang and Furong Ye and Sander van Rijn and Thomas B{\"a}ck},
  title = {{IOHprofiler: A Benchmarking and Profiling Tool for Iterative Optimization Heuristics}},
  journal = {arXiv e-prints:1810.05281},
  archivePrefix = "arXiv",
  eprint = {1810.05281},
  year = 2018,
  month = oct,
  keywords = {Computer Science - Neural and Evolutionary Computing},
  url = {https://arxiv.org/abs/1810.05281}
}
