
Commit c8a3615

Update README.md

1 parent dea8883

File tree

4 files changed: +99 -11 lines changed

README.md

Lines changed: 90 additions & 11 deletions
# [Denoising Diffusion Variational Inference (DDVI): Diffusion Models as Expressive Variational Posteriors](https://arxiv.org/abs/2401.02739)

By [Wasu Top Piriyakulkij*](https://www.cs.cornell.edu/~wp237/), [Yingheng Wang*](https://isjakewong.github.io/), [Volodymyr Kuleshov](https://www.cs.cornell.edu/~kuleshov/) (* denotes equal contribution)

[![arXiv](https://img.shields.io/badge/arXiv-2401.02739-red.svg)](https://arxiv.org/abs/2401.02739)

Accepted to [AAAI 2025](https://aaai.org/conference/aaai/aaai-25/)

<img src="imgs/ddvi.png" width="99%">
<img src="imgs/mnist_results.png" width="99%">

We propose denoising diffusion variational inference (DDVI), a black-box variational inference algorithm for latent variable models which relies on diffusion models as flexible approximate posteriors. Specifically, our method introduces an expressive class of diffusion-based variational posteriors that perform iterative refinement in latent space; we train these posteriors with a novel regularized evidence lower bound (ELBO) on the marginal likelihood inspired by the wake-sleep algorithm. Our method is easy to implement (it fits a regularized extension of the ELBO), is compatible with black-box variational inference, and outperforms alternative classes of approximate posteriors based on normalizing flows or adversarial networks. We find that DDVI improves inference and learning in deep latent variable models across common benchmarks as well as on a motivating task in biology -- inferring latent ancestry from human genomes -- where it outperforms strong baselines on the Thousand Genomes dataset.
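For orientation, DDVI's objective is a regularized extension of the standard evidence lower bound; a minimal sketch of that standard bound is given below (the wake-sleep-inspired regularizer itself is described in the paper and not reproduced here):

$$\log p_\theta(x) \geq \mathbb{E}_{q_\phi(z \mid x)}\left[\log p_\theta(x \mid z)\right] - \mathrm{KL}\left(q_\phi(z \mid x) \,\|\, p(z)\right)$$

where $q_\phi(z \mid x)$ is the diffusion-based approximate posterior and $p(z)$ is the latent prior (e.g., the pinwheel or swiss-roll priors used below).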
# Installation
```
conda create -n ddvi python=3.7
conda activate ddvi
pip install -r requirements.txt
```
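Optionally, you can sanity-check the environment before launching experiments; this assumes the pinned requirements include PyTorch with CUDA support (not stated explicitly in this README):
```
# Assumed check: confirm PyTorch imports and a GPU is visible
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```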

# Running DDVI

You can run the experiments by calling `run.sh`, which takes three positional arguments: the dataset, the learning algorithm, and the prior, respectively.
Unsupervised learning on MNIST with DDVI:
```
./run.sh mnist diff_vae_warmup pinwheel
./run.sh mnist diff_vae_warmup swiss_roll
./run.sh mnist diff_vae_warmup less_noisy_square
```

Unsupervised learning on CIFAR with DDVI:
```
./run.sh cifar diff_vae_warmup pinwheel
./run.sh cifar diff_vae_warmup swiss_roll
./run.sh cifar diff_vae_warmup less_noisy_square
```

Semi-supervised learning on MNIST with DDVI:
```
./run.sh mnist_semi diff_vae_warmup_semi pinwheel
./run.sh mnist_semi diff_vae_warmup_semi swiss_roll
./run.sh mnist_semi diff_vae_warmup_semi less_noisy_square
```

Semi-supervised learning on CIFAR with DDVI:
```
./run.sh cifar_semi diff_vae_warmup_semi pinwheel
./run.sh cifar_semi diff_vae_warmup_semi swiss_roll
./run.sh cifar_semi diff_vae_warmup_semi less_noisy_square
```
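Each command above trains a single run with `seed=0` (hardcoded in `run.sh`). A minimal sketch for sweeping seeds by invoking `run.py` directly, reusing the same `key=value` arguments that `run.sh` passes:
```
# Sketch: sweep seeds for one dataset/model/prior combination
save_folder=experimental_results/mnist_diff_vae_warmup_pinwheel
mkdir -p "$save_folder"
for seed in 0 1 2; do
  python run.py dataset=mnist model=diff_vae_warmup prior=pinwheel \
      save_folder=${save_folder}/run${seed} seed=${seed}
done
```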
# Running baselines

Available unsupervised learning baselines are [vae, iaf_vae, h_iaf_vae, aae].

Unsupervised learning on MNIST with baselines:
```
for method in vae iaf_vae h_iaf_vae aae; do
  ./run.sh mnist $method pinwheel
  ./run.sh mnist $method swiss_roll
  ./run.sh mnist $method less_noisy_square
done
```

Unsupervised learning on CIFAR with baselines:
```
for method in vae iaf_vae h_iaf_vae aae; do
  ./run.sh cifar $method pinwheel
  ./run.sh cifar $method swiss_roll
  ./run.sh cifar $method less_noisy_square
done
```
Available semi-supervised learning baselines are [vae_semi, iaf_vae_semi, aae_semi].

Semi-supervised learning on MNIST with baselines:
```
for method in vae_semi iaf_vae_semi aae_semi; do
  ./run.sh mnist_semi $method pinwheel
  ./run.sh mnist_semi $method swiss_roll
  ./run.sh mnist_semi $method less_noisy_square
done
```

Semi-supervised learning on CIFAR with baselines:
```
for method in vae_semi iaf_vae_semi aae_semi; do
  ./run.sh cifar_semi $method pinwheel
  ./run.sh cifar_semi $method swiss_roll
  ./run.sh cifar_semi $method less_noisy_square
done
```
# Citation
```
@inproceedings{piriyakulkij-wang:aaai25,
  Author = {Piriyakulkij, Wasu Top and Wang, Yingheng and Kuleshov, Volodymyr},
  Booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
  Title = {Denoising Diffusion Variational Inference: Diffusion Models as Expressive Variational Posteriors},
  Year = {2025}}
```

imgs/ddvi.png (150 KB)

imgs/mnist_results.png (496 KB)

run.sh

Lines changed: 9 additions & 0 deletions
#!/bin/bash

# Usage: ./run.sh <dataset> <model> <prior>
#   dataset: mnist, cifar, mnist_semi, cifar_semi
#   model:   e.g., diff_vae_warmup, vae, iaf_vae, h_iaf_vae, aae (or *_semi variants)
#   prior:   pinwheel, swiss_roll, less_noisy_square
dataset=$1
model=$2
prior=$3

# Braces are needed so the underscores are not parsed as part of the
# variable names; -p creates the full path and tolerates reruns.
save_folder=experimental_results/${dataset}_${model}_${prior}
mkdir -p "$save_folder"

python run.py dataset=$dataset model=$model prior=$prior save_folder=${save_folder}/run0 seed=0
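With the argument handling above, a call such as the one below should leave its outputs under `experimental_results/mnist_diff_vae_warmup_pinwheel/run0`, assuming `run.py` treats `save_folder` as its output directory (as the invocation suggests):
```
./run.sh mnist diff_vae_warmup pinwheel
ls experimental_results/mnist_diff_vae_warmup_pinwheel/run0
```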
