Releases · brendanhasz/probflow
Version 2.4.1
- @stnkl fixed a bug in `probflow.utils.ops.rand_rademacher` (it was previously generating all 0s 😬)
- Fixed some bugs in tests relating to the newest version of PyTorch.
Version 2.4.0
- Allow `EarlyStopping` to take a `MonitorMetric` or `MonitorELBO` callback directly (sketched after this list)
- Remove `expected_calibration_error` and replace it with a more general `calibration_metric`, which can compute any of several calibration metrics (e.g. mean squared calibration error, mean absolute calibration error, miscalibration area, etc.)
- Add `sharpness` and `dispersion_metric` methods to `ContinuousModel` (secondary uncertainty estimate metrics)
- Write the callbacks user guide section
- Some minor callbacks improvements
- Add `CenteredParameter`, which creates a vector of parameters constrained to have a mean of 0 (or a matrix whose rows and/or columns are constrained to have means of 0)
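A minimal sketch of the new callbacks behavior and `CenteredParameter`, using the usual ProbFlow modeling pattern. The toy data, the `[d, 1]` shape argument to `CenteredParameter`, the `patience` keyword, and the `calibration_metric` call signature are assumptions for illustration, not taken from the release notes.

```python
import numpy as np
import probflow as pf

# Toy data (illustrative only)
x = np.random.randn(256, 3).astype('float32')
y = (x.sum(axis=1, keepdims=True) + 0.1 * np.random.randn(256, 1)).astype('float32')

class LinearRegression(pf.ContinuousModel):
    def __init__(self, d):
        self.w = pf.CenteredParameter([d, 1])  # weights constrained to have mean 0
        self.b = pf.Parameter()
        self.s = pf.ScaleParameter()

    def __call__(self, x):
        return pf.Normal(x @ self.w() + self.b(), self.s())

model = LinearRegression(3)

# EarlyStopping can now be handed a MonitorMetric (or MonitorELBO) directly
metric = pf.callbacks.MonitorMetric('mae', x, y)
early_stop = pf.callbacks.EarlyStopping(metric, patience=5)
model.fit(x, y, epochs=100, callbacks=[metric, early_stop])

# The more general calibration_metric replaces expected_calibration_error
# (the metric name and call signature here are assumptions)
print(model.calibration_metric('msce', x, y))
```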
Version 2.3.0
- Add a `batch_norm` keyword argument to `DenseNetwork`, which can be either `True` (use batch normalization between layers) or `False` (do not use batch normalization between layers, the default); sketched below
- Add a `batch_norm_loc` keyword argument to `DenseNetwork`, which can be either `'after'` (apply batch normalization after each layer's activation function) or `'before'` (apply batch normalization before each layer's activation function)
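A small sketch of the new keyword arguments, assuming `DenseNetwork` takes a list of per-layer unit sizes as in the ProbFlow docs; the layer sizes are illustrative.

```python
import probflow as pf

class Regression(pf.ContinuousModel):
    def __init__(self):
        # Batch normalization between layers, applied after each activation
        self.net = pf.DenseNetwork(
            [5, 64, 64, 1],
            batch_norm=True,
            batch_norm_loc='after',
        )
        self.s = pf.ScaleParameter()

    def __call__(self, x):
        return pf.Normal(self.net(x), self.s())
```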
Version 2.2.1
- Implement `probflow.distributions.Mixture` for PyTorch
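A hedged sketch of using the mixture distribution under the PyTorch backend; the constructor arguments (a batched `Normal` plus mixing probabilities) are an assumption based on the TensorFlow-backend usage, and the component values are illustrative.

```python
import probflow as pf

pf.set_backend('pytorch')

# Two-component Gaussian mixture (locations, scales, and the probs=
# keyword are illustrative assumptions)
mix = pf.Mixture(pf.Normal([-2.0, 2.0], [1.0, 1.0]), probs=[0.3, 0.7])
sample = mix.sample()
```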
Version 2.2.0
- Add the `n_mc` kwarg to `probflow.Model.fit`, which sets the number of Monte Carlo samples taken per batch. Parameter updates then use the average of the gradients across the MC samples for each batch. It's slower with more samples, but leads to more stable fitting.
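A short sketch of the new kwarg, using ProbFlow's ready-made `LinearRegression` model; the data and the particular values (epochs, batch size, number of MC samples) are illustrative.

```python
import numpy as np
import probflow as pf

x = np.random.randn(512, 1).astype('float32')
y = (2.0 * x + np.random.randn(512, 1)).astype('float32')

model = pf.LinearRegression(1)

# n_mc > 1 averages the gradient over several MC samples per batch:
# slower per step, but usually a more stable fit
model.fit(x, y, epochs=50, batch_size=64, n_mc=10)
```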
Version 2.1.2
- Add support for flipout with the PyTorch backend.
- Add `randn`, `rand_rademacher`, and `shape` backend-independent ops (sketched below)
- Update deprecated `tfp.python.math.random_rademacher` in favor of `tfp.random.rademacher` when possible
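A quick sketch of the backend-independent ops, assuming each takes a shape argument and that they live in `probflow.utils.ops` (as the 2.4.1 note above suggests).

```python
import probflow as pf
from probflow.utils import ops

pf.set_backend('tensorflow')

eps = ops.randn([3, 4])              # standard normal noise as a backend tensor
signs = ops.rand_rademacher([3, 4])  # random +/-1 entries (used by flipout)
print(ops.shape(eps))                # backend-independent shape
```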
Version 2.1.1
- Add KL divergence between the Deterministic distribution and other continuous distributions for PyTorch (fitting models with `Deterministic` parameters previously wasn't working with the PyTorch backend; see the sketch after this list)
- Refactor tests - they're nice and clean now 😊
- Add autoflake to dev stack
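A sketch of what now works: a model with point-estimate (`DeterministicParameter`) weights fit under the PyTorch backend. The toy model and data are illustrative, not from the release notes.

```python
import numpy as np
import probflow as pf

pf.set_backend('pytorch')

# Weights are point estimates rather than full posteriors; fitting a model
# like this under PyTorch previously failed
class PointEstimateRegression(pf.ContinuousModel):
    def __init__(self, d):
        self.w = pf.DeterministicParameter([d, 1])
        self.b = pf.Parameter()
        self.s = pf.ScaleParameter()

    def __call__(self, x):
        return pf.Normal(x @ self.w() + self.b(), self.s())

x = np.random.randn(128, 3).astype('float32')
y = x.sum(axis=1, keepdims=True).astype('float32')
PointEstimateRegression(3).fit(x, y, epochs=10)
```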
Version 2.1.0
- Add a `probabilistic` keyword argument to the `Dense`, `DenseRegression`, and `Embedding` modules (sketched after this list)
- Update `MonitorMetric` and `MonitorELBO` to also track wall time
- Add calibration methods to `ContinuousModel`: `calibration_curve`, `calibration_curve_plot`, and `expected_calibration_error`
- Add support for `MultivariateNormalParameter` with the PyTorch backend (by implementing `probflow.utils.ops.log_cholesky_transform` for PyTorch)
- Add a neural linear example which uses most of these new features
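A rough sketch of the `probabilistic` kwarg in a neural-linear-style model: deterministic (point-estimate) hidden layers feeding a Bayesian output layer. The layer sizes, the use of the TensorFlow backend, and the commented calibration call are assumptions.

```python
import tensorflow as tf
import probflow as pf

class NeuralLinear(pf.ContinuousModel):
    def __init__(self, d):
        self.h1 = pf.Dense(d, 64, probabilistic=False)   # deterministic weights
        self.h2 = pf.Dense(64, 32, probabilistic=False)  # deterministic weights
        self.out = pf.Dense(32, 1)                       # probabilistic output layer
        self.s = pf.ScaleParameter()

    def __call__(self, x):
        x = tf.nn.relu(self.h1(x))
        x = tf.nn.relu(self.h2(x))
        return pf.Normal(self.out(x), self.s())

# After fitting, the new calibration methods can be used, e.g.:
# model.calibration_curve_plot(x_val, y_val)   # signature assumed
```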
Version 2.0.0
- Fixed PyTorch-only import
- Added a testing matrix for Python versions and PyTorch/TensorFlow
Version 2.0.0.a3
- Use the TensorFlow graph via `tf.function` for faster fitting!
- Model saving and loading (using cloudpickle; sketched below)
- Fix a plotting error caused by a new version of numpy/pandas
- Update TensorFlow and TensorFlow Probability version dependencies
- Add some docs and fix some other docs
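A sketch of saving and loading a fitted model. Since the notes say cloudpickle is used under the hood, this shows that mechanism directly rather than guessing ProbFlow's own helper names; the file path and model are illustrative.

```python
import cloudpickle
import probflow as pf

model = pf.LinearRegression(3)
# ... model.fit(x, y) ...

# Serialize the fitted model to disk
with open('model.pkl', 'wb') as f:
    cloudpickle.dump(model, f)

# Restore it later (or on another machine)
with open('model.pkl', 'rb') as f:
    restored = cloudpickle.load(f)
```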