A powerful Python framework for writing and running portable regression tests and benchmarks for HPC systems.

ReFrame in a Nutshell

ReFrame is a powerful framework for writing system regression tests and benchmarks, specifically targeted at HPC systems. The goal of the framework is to abstract away the complexity of the interactions with the system, separating the logic of a test from the low-level details that pertain to system configuration and setup. This allows users to write portable tests in a declarative way that describes only the test's functionality.

Tests in ReFrame are simple Python classes that specify the basic variables and parameters of the test. ReFrame offers an intuitive and very powerful syntax that allows users to create test libraries, test factories, as well as complete test workflows that use other tests as fixtures. ReFrame loads the tests and sends them down a well-defined pipeline that executes them in parallel. The stages of this pipeline take care of all the system-interaction details, such as programming environment switching, compilation, job submission, job status queries, sanity checking and performance assessment.
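
To give a feel for the syntax, here is a minimal sketch of a "hello world" run-only test in the spirit of the tutorial examples; the class name, file name and echoed message are illustrative only:

import reframe as rfm
import reframe.utility.sanity as sn


@rfm.simple_test
class HelloTest(rfm.RunOnlyRegressionTest):
    valid_systems = ['*']
    valid_prog_environs = ['*']
    executable = 'echo'
    executable_opts = ['Hello, ReFrame!']

    @sanity_function
    def assert_greeting(self):
        # The test passes only if the greeting shows up in the standard output
        return sn.assert_found(r'Hello, ReFrame!', self.stdout)

Assuming this were saved as, say, hello_test.py, it could be run with ./bin/reframe -c hello_test.py -r.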

Please visit the project's documentation page for all the details!

Installation

ReFrame is fairly easy to install. All you need is Python 3.6 or above; then simply run its bootstrap script:

git clone https://github.com/reframe-hpc/reframe.git
cd reframe
./bootstrap.sh
./bin/reframe -V

If you want a specific release, please refer to the documentation page.
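
ReFrame is also published on PyPI, so if you prefer pip over the bootstrap script, an installation into a virtual environment might look like the following (assuming the PyPI package name reframe-hpc):

# create and activate an isolated environment, then install a released version
python3 -m venv venv
source venv/bin/activate
pip install reframe-hpc
reframe -V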

Running the unit tests

You can optionally run the framework's unit tests with the following command:

./test_reframe.py -v

NOTE: Unit tests require a POSIX-compliant C compiler (available through the cc command), as well as the make utility.
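
If you are unsure whether these prerequisites are available, a quick POSIX-shell check such as the following will tell you before you kick off the test suite:

# report any missing build tools
for tool in cc make; do
    command -v "$tool" >/dev/null || echo "$tool not found in PATH"
done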

Building the documentation locally

You may build the documentation of the master branch manually as follows:

./bootstrap.sh +docs

To view it, you can do the following:

cd docs/html
python3 -m http.server

The documentation is now served at localhost:8000, where you can navigate it with your browser.
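
If port 8000 is already in use, Python's built-in server accepts an alternative port and, optionally, a bind address:

python3 -m http.server 8080 --bind 127.0.0.1

in which case the documentation is served at localhost:8080 instead.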

Test library

The framework comes with an experimental library of tests that users can either run directly from the command line or extend and fine-tune for their systems. See here for more details.
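
As a quick way to explore the library, you could list the tests it provides from your repository checkout; this is only a sketch, assuming the library sources live under hpctestlib/:

./bin/reframe -c hpctestlib -R -l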

Public test repositories

The ReFrame HPC community GitHub page provides mirror forks of interesting ReFrame test repositories maintained by various sites or projects. You can use those tests as additional guidance for implementing your own tests.

If you maintain a public test repository and would like it to be listed on the community page, feel free to open an issue or contact us through Slack.

Contact

You can get in contact with the ReFrame community in the following ways:

Slack

Please join the community's Slack channel to keep up with the latest news about ReFrame, post questions and, more generally, get in touch with other users and the developers.

Contributing back

ReFrame is an open-source project and we welcome and encourage contributions! Check out our Contribution Guide here.

Citing ReFrame

You can cite ReFrame in publications as follows:

Vasileios Karakasis et al. "Enabling Continuous Testing of HPC Systems Using ReFrame". In: Tools and Techniques for High Performance Computing. HUST - Annual Workshop on HPC User Support Tools (Denver, Colorado, USA, Nov. 17–18, 2019). Ed. by Guido Juckeland and Sunita Chandrasekaran. Vol. 1190. Communications in Computer and Information Science. Cham, Switzerland: Springer International Publishing, Mar. 2020, pp. 49–68. ISBN: 978-3-030-44728-1. DOI: 10.1007/978-3-030-44728-1_3.

The corresponding BibTeX entry is the following:

@InProceedings{karakasis20a,
  author     = {Karakasis, Vasileios and Manitaras, Theofilos and Rusu, Victor Holanda and
                Sarmiento-P{\'e}rez, Rafael and Bignamini, Christopher and Kraushaar, Matthias and
                Jocksch, Andreas and Omlin, Samuel and Peretti-Pezzi, Guilherme and
                Augusto, Jo{\~a}o P. S. C. and Friesen, Brian and He, Yun and Gerhardt, Lisa and
                Cook, Brandon and You, Zhi-Qiang and Khuvis, Samuel and Tomko, Karen},
  title      = {Enabling Continuous Testing of {HPC} Systems Using {ReFrame}},
  booktitle  = {Tools and Techniques for High Performance Computing},
  editor     = {Juckeland, Guido and Chandrasekaran, Sunita},
  year       = {2020},
  month      = mar,
  series     = {Communications in Computer and Information Science},
  volume     = {1190},
  pages      = {49--68},
  address    = {Cham, Switzerland},
  publisher  = {Springer International Publishing},
  doi        = {10.1007/978-3-030-44728-1_3},
  venue      = {Denver, Colorado, USA},
  eventdate  = {2019-11-17/2019-11-18},
  eventtitle = {{HUST} - Annual Workshop on {HPC} User Support Tools},
  isbn       = {978-3-030-44728-1},
  issn       = {1865-0937},
}
