
Releases: stefan-m-lenz/BoltzmannMachines.jl

v1.3.0

13 Oct 10:46

New features:

  • The batchsize can now also be specified for fine-tuning in fitdbm and traindbm!. In fitdbm, it can be set separately for fine-tuning and pretraining via the arguments batchsizefinetuning (new) and batchsizepretraining (see the sketch after this list).
  • Added function top2latentdims, enabling convenient dimension reduction with DBMs
  • Added a new example for using DBMs for dimension reduction
  • Added function blocksinnoise, simulating data sets with different subgroups and labels
  • Migrated continuous integration from Travis CI to GitHub Actions
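
A minimal sketch of the new options, assuming the bars-and-stripes toy data from the package, a small layer configuration, and that top2latentdims takes the trained DBM together with the data set; these details are illustrative assumptions, not taken from the documentation:

```julia
using BoltzmannMachines

# Small artificial binary data set shipped with the package
x = barsandstripes(100, 16)

# Specify the batch size separately for pretraining and fine-tuning
# (batchsizefinetuning is new in this release)
dbm = fitdbm(x; nhiddens = [8, 4], epochs = 15,
      batchsizepretraining = 5, batchsizefinetuning = 10)

# Dimension reduction: map each sample to two latent coordinates
# (assumed call form of top2latentdims)
latentcoordinates = top2latentdims(dbm, x)
```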

Bug fix:

  • Argument optimizerpretraining in fitdbm is now respected. (Previously only the optimizer argument was used.)

Deprecation:

  • Argument learningrates in fitdbm is renamed to learningratesfinetuning for clarity

v1.2.0

16 Aug 14:12
  • More convenient monitoring with the functions monitored_fitrbm, monitored_stackrbms, monitored_traindbm! and monitored_fitdbm (see the sketch after this list)
  • Added functions intensities_encode and intensities_decode to transform continuous data into the interval [0,1] and back
  • Added examples for using Softmax0BernoulliRBMs
  • Added examples for using partitioned layers in MultimodalDBMs
  • Compatibility with the JuliaConnectoR: avoids using copies of anonymous functions when preparing the TrainLayers
  • For reproducibility, the RBMs in partitioned layers are no longer trained in parallel processes when Julia is run with multiple processes
  • Fixed documentation of splitdata
  • Other small improvements in the documentation
  • Removed the dependency on "Distributions": BoltzmannMachines now depends only on packages from the standard library.
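
A short sketch of the combined fitting and monitoring; that the monitored_* functions return the monitoring result together with the trained model is an assumption based on the description above:

```julia
using BoltzmannMachines

x = barsandstripes(100, 16)

# Fit a DBM and collect monitoring information (e.g. the lower bound of the
# log-likelihood) in one call instead of setting up the monitoring manually.
monitor, dbm = monitored_fitdbm(x; nhiddens = [8, 4], epochs = 10)
```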

v1.1.0

15 Feb 09:58

New features:

  • Modelling categorical data in RBMs and DBMs using the new type Softmax0BernoulliRBM (complete with likelihood monitoring)
  • Conditional sampling via gibbssamplecond! and samples (see the examples and the sketch after this list)
  • Data preprocessing functions: intensities, oneornone_encode
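
A rough sketch of conditional sampling via samples; the conditions keyword with pairs of variable index and value, as well as the surrounding setup, are assumptions for illustration only:

```julia
using BoltzmannMachines

x = barsandstripes(500, 16)
dbm = fitdbm(x; nhiddens = [8, 4], epochs = 10)

# Generate samples while clamping selected visible variables to fixed values
# (assuming `conditions` accepts pairs of variable index => value)
xgen = samples(dbm, 25; conditions = [1 => 1.0, 2 => 0.0])
```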

Other changes:

  • Longer default burnin when estimating the log-likelihood via empiricalloglikelihood
  • Removed deprecated function addlayer!

v1.0.1

17 Jan 09:46

Fixes issues in monitoring multimodal DBMs and in logproblowerbound

v1.0.0

16 Aug 13:46

  • Upgraded to Julia 1.0
  • Added Travis CI testing

v0.1.0

04 Aug 20:51

First version for official release