# Deep Generative Models through the Lens of the Manifold Hypothesis: A Survey and New Connections

This is the codebase accompanying the paper "Deep Generative Models through the Lens of the Manifold Hypothesis: A Survey and New Connections", accepted to TMLR in September 2024. It builds on the codebase from our previous paper, "Diagnosing and Fixing Manifold Overfitting in Deep Generative Models"; please refer to that original codebase for setup and general usage instructions. The main additions here are adversarially-regularized VAEs, score-based diffusion models, and FD_DINOv2. Below we describe how to run the experiments in Section 5.3.2 of the paper; note that the codebase retains many other functionalities inherited from its predecessor.

## Training the models and plotting the score norms

To train the latent diffusion model, run

```shell
./main.py --dataset cifar10 --gae-model adv_vae --de-model diffusion
```

and to train the diffusion model on ambient space, run

```shell
./single_main.py --dataset cifar10 --model diffusion
```

The resulting models are automatically saved in the `runs/` directory. Once the models are trained, the notebook `score_norms.ipynb` in the `notebooks` directory can be used to reproduce Figure 8.
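For intuition, the quantity plotted in Figure 8 is the norm of the learned score at perturbed inputs. The following is a minimal self-contained sketch of that computation; the stand-in `score_fn`, the variance-exploding perturbation, and all shapes are illustrative assumptions here, not the repository's actual model classes or notebook code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in score function: in the real notebook this would be the trained
# diffusion model's score network s_theta(x_t, t). For illustration we use
# a simple analytic function with the right signature.
def score_fn(x_t, t):
    return -x_t / (1.0 + t)

def mean_score_norms(score_fn, x0, timesteps):
    """Average L2 norm of the score over a batch, one value per timestep."""
    norms = []
    for t in timesteps:
        noise = rng.standard_normal(x0.shape)
        x_t = x0 + t * noise  # variance-exploding-style perturbation (assumption)
        s = score_fn(x_t, t)
        batch_norms = np.linalg.norm(s.reshape(len(s), -1), axis=1)
        norms.append(batch_norms.mean())
    return norms

# Batch of 8 inputs shaped like CIFAR-10 images.
x0 = rng.standard_normal((8, 3, 32, 32))
norms = mean_score_norms(score_fn, x0, [0.1, 0.5, 1.0])
print(len(norms))  # one norm value per timestep
```

Plotting `norms` against the timesteps gives a curve of the kind shown in Figure 8; the notebook does this with the actual trained models in `runs/`.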

## BibTeX

```bibtex
@article{
    loaiza-ganem2024deep,
    title={Deep Generative Models through the Lens of the Manifold Hypothesis: A Survey and New Connections},
    author={Loaiza-Ganem, Gabriel and Ross, Brendan Leigh and Hosseinzadeh, Rasa and Caterini, Anthony L and Cresswell, Jesse C},
    journal={Transactions on Machine Learning Research},
    year={2024}
}
```