Releases: IBM/mi-prometheus

0.3.1 release

29 Nov 00:40

What's new:

  • Added a Reproducible Research section to the documentation, whose purpose is to gather information about our published work and how to reproduce the associated experiments. To open this new section, a full-fledged, self-contained section about the experiments of our ViGIL paper is available. See #105
    This section was made possible by the following changes & additions:
    - Support for multiple tests in the Tester (cf. #98),
    - Keys can now be deleted in the ParamRegistry (#100)

  • Documentation restructuring (#93), to better reflect the organization of the code repository,

  • Added a new helper, ProblemInitializer, which will eventually handle checking for the existence of the data files of a particular Problem, and download them if missing.
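The intended check-then-download flow can be sketched as follows. This is a minimal illustration of the idea, not the actual ProblemInitializer API; the function and parameter names are assumptions.

```python
import os
import urllib.request

def ensure_data_files(data_dir, filenames, url_root):
    """Illustrative sketch: verify that every expected data file exists
    under data_dir, and download any missing file from url_root.
    Returns the list of files that had to be fetched."""
    os.makedirs(data_dir, exist_ok=True)
    missing = [name for name in filenames
               if not os.path.isfile(os.path.join(data_dir, name))]
    for name in missing:
        # Fetch only the files that are not already on disk.
        urllib.request.urlretrieve(url_root + name,
                                   os.path.join(data_dir, name))
    return missing
```

When all files are already present, the function is a cheap no-op, so a worker can call it unconditionally before training.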

0.3.0 release

13 Nov 01:30
f0be173

This release addresses several bugs and brings new features.

What's new?

  • The entire pipeline of the Grid Workers has been reviewed and should now work properly on Ubuntu (CPU & GPU) and macOS. Ref #81

  • The documentation build is now fixed, as all modules & inherited classes are mocked. Thus, there is no longer a need to handle the dependencies (through setup.py or requirements.txt) to build it. Ref #55
    The documentation also now handles linking to other docs, such as PyTorch's and NumPy's. Ref #56

  • Introducing miprometheus.helpers: classes that perform a specific task (such as splitting a range of indices in two) prior to using a worker. Ref #36

  • SamplerFactory: hooks into torch.utils.data.sampler to instantiate samplers from the workers. Ref #32

  • Integration of LGTM for code analysis and fixing some of the raised warnings. Ref #45, #57, #43, #44

  • Added LeNet5 and cleaned the Image Classification problems. Ref #53

  • Plenty of fixes in the workers: #54, #72, #81

  • Some refactoring in VideoToClassProblem and SequentialMNIST. Ref #71

  • The trainers can load a pretrained model. Ref #42
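The SamplerFactory bullet above follows a common factory pattern: pick a sampler class by name from a configuration dict and instantiate it. A minimal sketch of that idea is below; the real class hooks into torch.utils.data.sampler, but plain stand-in classes are used here so the sketch stays self-contained.

```python
import random

class SequentialSampler:
    """Stand-in: yields indices in order."""
    def __init__(self, data_source):
        self.data_source = data_source
    def __iter__(self):
        return iter(range(len(self.data_source)))

class RandomSampler:
    """Stand-in: yields indices in shuffled order."""
    def __init__(self, data_source):
        self.data_source = data_source
    def __iter__(self):
        indices = list(range(len(self.data_source)))
        random.shuffle(indices)
        return iter(indices)

# Registry mapping configuration names to sampler classes.
SAMPLERS = {"SequentialSampler": SequentialSampler,
            "RandomSampler": RandomSampler}

def build_sampler(config, data_source):
    """Instantiate a sampler from a config dict such as
    {'name': 'RandomSampler'} (illustrative, not the actual factory)."""
    return SAMPLERS[config["name"]](data_source)
```

A worker can then stay agnostic of the concrete sampler type and simply pass the relevant slice of its configuration to the factory.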

Misc:

#28, #33, #38, #70, #88

0.2.1 release

24 Oct 02:59
9a34a04

setup.py

This release adds the setup.py script, which should ease the installation of MI-Prometheus. The user should now be able to run python setup.py install from the cloned repository to install it.
This means that the workers/, problems/, models/, utils/ packages have been moved under the new root miprometheus package.
The setup.py script also registers aliases for the workers, which can now be executed as regular Linux commands: mip-offline-trainer, mip-online-trainer, mip-tester, mip-grid-trainer-cpu, mip-grid-trainer-gpu, mip-grid-tester-cpu, mip-grid-tester-gpu and mip-grid-analyzer are available.
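Such aliases are typically registered through setuptools console_scripts entry points. A sketch of what the mapping could look like in setup.py is below; the alias names come from this release, but the module paths on the right-hand side are assumptions for illustration, not the actual package layout.

```python
# Hypothetical console_scripts mapping for setup.py. Each entry binds a
# command-line alias to a callable; the module paths shown here are
# illustrative assumptions.
entry_points = {
    "console_scripts": [
        "mip-offline-trainer = miprometheus.workers.offline_trainer:main",
        "mip-online-trainer = miprometheus.workers.online_trainer:main",
        "mip-tester = miprometheus.workers.tester:main",
        "mip-grid-analyzer = miprometheus.grid_workers.grid_analyzer:main",
    ],
}
```

At install time, setuptools generates a small wrapper script for each alias, which is why the workers become available as regular commands on the PATH.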

Documentation Build

This release also addresses the documentation build, which should now be working. The documentation is available on readthedocs.io.

Miscellaneous

  • The trainer will print a warning if the user tries to execute it,
  • The base workers (offline trainer, online trainer, tester) will exit if the user specifies --gpu but no CUDA devices are available.

The Grid Workers include more safety checks to avoid confusing the user, and have more consistent behavior:

  • The GPU workers check if cuda-gpupick is available, and if not, do not use it.
  • The use of the max_concurrent_run flag is standardized; it represents how many experiments will run concurrently.
  • The data paths for MNIST & CIFAR are correctly handled if indicated as relative.
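The max_concurrent_run flag described above can be thought of as the size of a worker pool. A minimal sketch of that behavior follows, using a thread pool as a stand-in for the grid workers' actual process handling; the function name is illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def run_grid(experiments, max_concurrent_run):
    """Illustrative sketch: run a list of zero-argument experiment
    callables with at most max_concurrent_run of them executing
    simultaneously, returning their results in submission order."""
    with ThreadPoolExecutor(max_workers=max_concurrent_run) as pool:
        return list(pool.map(lambda fn: fn(), experiments))
```

The pool size caps concurrency regardless of how many experiments the grid file defines, which matches the standardized meaning of the flag.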

0.2.0 release

19 Oct 02:26

Cleanups and fixes, and renaming of some entities (e.g. grid_workers) so they are consistent with the paper, etc.
A very quick overview of the changes:

  • Replaced DataTuple with DataDict,
  • Redesigned the workers from scratch, using class inheritance,
  • Refactored and cleaned up most of the models & problems.
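The move from DataTuple to DataDict replaces positional fields with named ones, so adding a field no longer breaks existing tuple-unpacking code. A minimal sketch of the idea (illustrative, not the actual MI-Prometheus class):

```python
class DataDict(dict):
    """Illustrative dict-based batch container: values are reached by
    key or attribute rather than by tuple position."""
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)
    def __setattr__(self, key, value):
        self[key] = value
```

With named access, a problem can add a new field (e.g. a mask) without disturbing models that only read the fields they know about.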

0.1.0 release

06 Oct 01:09
040b889
Pre-release

The first open-source release, published along with the associated papers.