This repository contains the official PyTorch implementation of the paper "Diffusion Model Patching via Mixture-of-Prompts".
- 2024.12.10: DMP has been accepted to AAAI 2025!
- 2024.08.19: Resolved minor errors.
- 2024.05.29: Built the project page.
- 2024.05.28: Code released.
Generated sample (golden retriever) from DiT-XL/2 + DMP (w/ cfg=1.5).
Generated sample (goldfish) from DiT-XL/2 + DMP (w/ cfg=1.5).
Generated sample (ostrich) from DiT-XL/2 + DMP (w/ cfg=1.5).
We use an 80GB A100 GPU for all experiments.
conda create -n ENV_NAME python=3.10
conda activate ENV_NAME
python3 -m pip install -r requirements.txt
We provide an example training script for ImageNet.
torchrun --nnodes=1 --nproc_per_node=1 train.py general.data_path='<PATH_TO_DATASET>'
You can also override the DiT model variant, optimization type, sharing ratio, and other options from the command line:
torchrun --nnodes=1 --nproc_per_node=1 train.py \
general.data_path='<PATH_TO_DATASET>' \
models.name="DiT-L/2" \
models.routing.sharing_ratio=0.8
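Since training is launched with torchrun, multi-GPU training on a single node should only require raising --nproc_per_node (this assumes train.py sets up distributed data parallelism from the launcher's environment, which this README does not state explicitly):

torchrun --nnodes=1 --nproc_per_node=8 train.py \
general.data_path='<PATH_TO_DATASET>' \
models.name="DiT-L/2" \
models.routing.sharing_ratio=0.8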
After training, checkpoints and log files are saved according to the configuration, so the sampling script must be executed with the same configuration as the training script. You can also adjust the number of sampled images and the classifier-free guidance scale.
torchrun --nnodes=1 --nproc_per_node=1 sample_ddp.py \
models.name="DiT-L/2" \
eval.cfg_scale=1.5 \
eval.num_fid_samples=50000
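The eval.num_fid_samples option indicates the 50K samples are intended for FID. Assuming sample_ddp.py behaves like its counterpart in the original DiT codebase and packs the generated images into an .npz file (an assumption, not stated in this README), the samples can be scored with ADM's TensorFlow evaluation suite from openai/guided-diffusion:

python evaluations/evaluator.py VIRTUAL_imagenet256_labeled.npz <PATH_TO_SAMPLES_NPZ>

Here, VIRTUAL_imagenet256_labeled.npz is ADM's ImageNet 256x256 reference batch, and <PATH_TO_SAMPLES_NPZ> is a placeholder for the generated .npz file.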
Please refer to the example scripts for detailed instructions on how to reproduce our results. These scripts enumerate the configurations that can be modified as needed.
We evaluate pre-trained diffusion models under different further-training methods. Importantly, we use the same dataset for further training as was used for pre-training. We set two baselines for comparison: (1) full fine-tuning, which updates all model parameters, and (2) naive prompt tuning.
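To make the comparison concrete, below is a minimal conceptual sketch of a mixture-of-prompts layer in PyTorch. This is not the repository's actual implementation; the class, argument names, and default sizes (e.g., the DiT-XL/2 hidden dimension of 1152) are illustrative assumptions. The idea it illustrates: a small pool of learnable prompt sets is mixed with timestep-dependent routing weights, and the mixed prompt tokens are prepended to the token sequence.

```python
import torch
import torch.nn as nn

class MixtureOfPrompts(nn.Module):
    """Conceptual sketch only; names and default sizes are hypothetical."""

    def __init__(self, pool_size=5, num_tokens=4, dim=1152):
        super().__init__()
        # Pool of learnable prompt sets: (pool_size, num_tokens, dim)
        self.prompts = nn.Parameter(0.02 * torch.randn(pool_size, num_tokens, dim))
        # Router: maps a timestep embedding to mixing weights over the pool
        self.router = nn.Linear(dim, pool_size)

    def forward(self, x, t_emb):
        # x: (B, N, dim) token sequence; t_emb: (B, dim) timestep embedding
        weights = self.router(t_emb).softmax(dim=-1)                # (B, pool_size)
        mixed = torch.einsum("bp,pnd->bnd", weights, self.prompts)  # (B, num_tokens, dim)
        return torch.cat([mixed, x], dim=1)                         # prepend prompts

# Usage: DiT-XL/2 on 256x256 latents yields 256 tokens of width 1152
layer = MixtureOfPrompts()
x = torch.randn(2, 256, 1152)
t_emb = torch.randn(2, 1152)
out = layer(x, t_emb)  # (2, 260, 1152)
```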
If you find our work useful, please consider citing:

@article{ham2024diffusion,
  title={Diffusion Model Patching via Mixture-of-Prompts},
  author={Ham, Seokil and Woo, Sangmin and Kim, Jin-Young and Go, Hyojun and Park, Byeongjun and Kim, Changick},
  journal={arXiv preprint arXiv:2405.17825},
  year={2024}
}