Releases: gptune/GPTune
v4.0.0 release
- Improve and rename the APIs for SLA, MLA, TLA, SA, and their crowd-tuning interfaces.
- Update example driver scripts for SuperLU_DIST[_RCI], NIMROD, IMPACT-Z, ButterflyPACK, heFFTe, Hypre, etc.
- Improve the setting of random seeds.
- Improve/add TLA algorithms.
- Add support for recording and loading surrogate model info for Model_GPy_LCM.
- Add/update build/run scripts for Perlmutter, Crusher, and Summit.
- Update FAQ.
- Add the user-guide for v4.0.0.
- Fix bugs in GPTuneBand, batched function evaluation, duplicated-sample handling, and MLA.
- Add an example demonstrating the use of subprocess for launching MPI applications in lite mode.
- Add an example combining GPTune with downhill simplex (Nelder–Mead) search for improved tuning quality.
- Integrate gptunehybrid (Hengrui Luo's hybridMinimization) and add support for handling black-box input constraints.
- Improve the sample/model/search algorithm options when lite mode is used.
- Add support for Nix installers.
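The subprocess-based launch mentioned in the notes above can be sketched generically. GPTune's actual driver code is not shown here; the helper below (the `run_mpi_app` name and its `launcher`/`nprocs` parameters are hypothetical, for illustration only) shows how Python's standard `subprocess` module can start an MPI executable as a separate OS process and capture its output, which is the kind of launch that avoids an in-process MPI dependency in lite mode.

```python
import subprocess

def run_mpi_app(executable, args=(), nprocs=4, launcher="mpirun"):
    """Launch an MPI application in a child OS process (no in-process
    MPI bindings needed) and return (exit code, captured stdout)."""
    cmd = [launcher, "-n", str(nprocs), executable, *map(str, args)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode, result.stdout

# Hypothetical usage with a SuperLU_DIST-style driver binary:
# rc, out = run_mpi_app("./pddrive", args=["matrix.rua"], nprocs=16)
```

Capturing stdout lets the tuner parse the application's reported runtime or objective value after the child process exits.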
v3.0.0 release
- Improved CMake files for Spack builds
- Improved TLA algorithms for tuning across machines and/or node counts
- Updated FAQ
- Improved cGP interfaces
- Improved historyDB functions for shared-memory parallel sampling and RCI
- Added programmable APIs and examples; improved sensitivity analysis
- Added support for unequal sample sizes when Model_GPy_LCM is used, improving the TLA and GPTuneBand algorithms
- Tested RCI on Perlmutter; added a few more GPU tuning examples
- Added an environment variable, GPTUNE_LITE_MODE, for installing and running GPTune in lite mode, which removes the dependencies on openturns, pygmo, and openmpi>4.0
- Added SearchSciPy and SearchPyMoo searcher classes
- Added kernel type options when Model_GPy_LCM is used
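As a minimal sketch of enabling lite mode: only the variable name GPTUNE_LITE_MODE comes from these release notes; the value "1" is an assumption, so check the GPTune user guide for the exact setting it expects.

```shell
# Enable GPTune's lite mode before installing or running; lite mode drops
# the dependencies on openturns, pygmo, and openmpi>4.0.
# NOTE: the value "1" is assumed here, not confirmed by the release notes.
export GPTUNE_LITE_MODE=1
```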
v2.1.0 release
- Added more tuning examples
- Improved multi-objective tuning
- Improved transfer learning
- Improved the history database
- Added the cGP interface
- Added sensitivity analysis
v2.0.0 release
- Added a smoothing-splines tuning example for household power consumption data