PyTorch implementation of Sylvester normalizing flows, based on our paper:
Sylvester normalizing flows for variational inference (UAI 2018)
Rianne van den Berg*, Leonard Hasenclever*, Jakub Tomczak, Max Welling
*Equal contribution
The code is compatible with:
- pytorch 1.1
- python 3.7

WARNING: more recent versions of PyTorch have different default flags for the binary cross entropy loss module nn.BCELoss(). You have to adapt these flags if you want to port this code to a later version.
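A minimal sketch of what such an adaptation might look like, assuming the training objective expects the binary cross entropy to be summed over dimensions and batch (check the loss code in this repository before relying on this):

```python
import torch.nn as nn

# Recent PyTorch releases default to reduction='mean' in nn.BCELoss().
# Assumption: the objective here expects a summed reconstruction term
# (common when computing an ELBO), so request it explicitly.
reconstruction_loss = nn.BCELoss(reduction='sum')
```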
The experiments can be run on the following datasets:
- static MNIST: the dataset is included in the data folder;
- OMNIGLOT: the dataset can be downloaded from link;
- Caltech 101 Silhouettes: the dataset can be downloaded from link;
- Frey Faces: the dataset can be downloaded from link.
Below are example commands for running experiments on static MNIST with the different types of Sylvester normalizing flows, using 4 flows:
Orthogonal Sylvester flows
This example uses a bottleneck of size 8 (Q has 8 columns containing orthonormal vectors).
```
python main_experiment.py -d mnist -nf 4 --flow orthogonal --num_ortho_vecs 8
```
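For reference, a minimal sketch of the orthogonal Sylvester transformation z' = z + Q R1 tanh(R2 Q^T z + b), where Q has num_ortho_vecs orthonormal columns and R1, R2 are upper triangular. The function and tensor names below are illustrative, not the repository's API:

```python
import torch

def orthogonal_sylvester_step(z, q, r1, r2, b):
    """One Sylvester flow step: z' = z + Q R1 tanh(R2 Q^T z + b).

    z:  (batch, D) latent samples
    q:  (D, M) matrix with M orthonormal columns (the "bottleneck")
    r1: (M, M) upper triangular
    r2: (M, M) upper triangular
    b:  (M,) bias
    Returns the transformed z and log |det Jacobian| per sample.
    """
    pre = z @ q @ r2.t() + b                      # (batch, M)
    h = torch.tanh(pre)
    z_new = z + h @ r1.t() @ q.t()                # (batch, D)

    # det J = prod_m (1 + tanh'(pre)_m * r1_mm * r2_mm)
    h_prime = 1.0 - h ** 2
    diag = torch.diagonal(r1) * torch.diagonal(r2)
    log_det = torch.log(torch.abs(1.0 + h_prime * diag) + 1e-8).sum(dim=-1)
    return z_new, log_det
```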
Householder Sylvester flows
This example uses 8 Householder reflections per orthogonal matrix Q.
```
python main_experiment.py -d mnist -nf 4 --flow householder --num_householder 8
```
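To illustrate what --num_householder controls, here is a minimal sketch of building an orthogonal matrix as a product of Householder reflections H_i = I - 2 v_i v_i^T / ||v_i||^2. The function below is illustrative and not the repository's implementation, which applies the reflections in a batched, amortized way:

```python
import torch

def householder_orthogonal(vs):
    """Build an orthogonal D x D matrix as a product of Householder reflections.

    vs: (num_householder, D) reflection vectors (e.g. produced by the encoder).
    Each reflection H_i = I - 2 v_i v_i^T / ||v_i||^2 is orthogonal,
    so their product is orthogonal as well.
    """
    d = vs.shape[1]
    q = torch.eye(d)
    for v in vs:
        v = v / (v.norm() + 1e-8)
        h = torch.eye(d) - 2.0 * (v.unsqueeze(1) @ v.unsqueeze(0))
        q = h @ q
    return q
```

Parameterizing Q this way keeps it orthogonal by construction, without an explicit orthogonalization step.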
Triangular Sylvester flows
```
python main_experiment.py -d mnist -nf 4 --flow triangular
```
To run an experiment with other types of normalizing flows or just with a factorized Gaussian posterior, see below.
Factorized Gaussian posterior
```
python main_experiment.py -d mnist --flow no_flow
```
Planar flows
```
python main_experiment.py -d mnist -nf 4 --flow planar
```
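For comparison, a minimal sketch of a single planar flow step z' = z + u * tanh(w^T z + b) in the parameterization of Rezende & Mohamed (2015); the names below are illustrative:

```python
import torch

def planar_step(z, u, w, b):
    """One planar flow step: z' = z + u * tanh(w^T z + b).

    z: (batch, D), u: (D,), w: (D,), b: scalar.
    Returns the transformed z and log |det Jacobian| per sample.
    Note: for invertibility u must additionally be constrained so that
    w^T u >= -1 (omitted here for brevity).
    """
    pre = z @ w + b                                         # (batch,)
    z_new = z + torch.tanh(pre).unsqueeze(-1) * u           # (batch, D)
    psi = (1.0 - torch.tanh(pre) ** 2).unsqueeze(-1) * w    # h'(pre) * w
    log_det = torch.log(torch.abs(1.0 + psi @ u) + 1e-8)    # (batch,)
    return z_new, log_det
```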
Inverse Autoregressive flows
This example uses MADEs with 320 hidden units.
```
python main_experiment.py -d mnist -nf 4 --flow iaf --made_h_size 320
```
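A minimal sketch of the inverse autoregressive update itself, assuming an autoregressive network (such as a MADE) has already produced a shift m and gate s whose i-th components depend only on z_{<i}. This gated parameterization follows Kingma et al. (2016); the names are illustrative:

```python
import torch

def iaf_step(z, m, s):
    """One IAF step with the gated update z' = sigmoid(s) * z + (1 - sigmoid(s)) * m.

    z, m, s: (batch, D); m and s must come from an autoregressive network
    (such as a MADE), so that m_i and s_i depend only on z_{<i}.
    Returns the transformed z and log |det Jacobian| per sample.
    """
    gate = torch.sigmoid(s)
    z_new = gate * z + (1.0 - gate) * m
    # The Jacobian is triangular with diagonal sigmoid(s), so the
    # log-determinant is the sum of log sigmoid(s) over dimensions.
    log_det = torch.log(gate + 1e-8).sum(dim=-1)
    return z_new, log_det
```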
An overview of all argument options can be obtained by running ```python main_experiment.py -h```.
Please cite our paper if you use this code in your own work:
```
@inproceedings{vdberg2018sylvester,
  title={Sylvester normalizing flows for variational inference},
  author={van den Berg, Rianne and Hasenclever, Leonard and Tomczak, Jakub and Welling, Max},
  booktitle={Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI)},
  year={2018}
}
```