H-GLaD: Hierarchical Features Matter: A Deep Exploration of GAN Priors for Improved Dataset Distillation
H-GLaD leverages the hierarchical feature spaces of a GAN prior to improve GAN-based parameterization for dataset distillation.
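As a rough illustration of the core idea, the sketch below parameterizes synthetic images by an intermediate feature map of a frozen toy generator (rather than by raw pixels), so the distillation loss is optimized inside the generator's feature hierarchy. The ToyGenerator and the placeholder loss are illustrative assumptions, not the H-GLaD implementation:

```python
import torch
import torch.nn as nn

class ToyGenerator(nn.Module):
    """Stand-in for a StyleGAN prior: latent -> hierarchy of feature maps -> image."""
    def __init__(self, latent_dim=64):
        super().__init__()
        self.shallow = nn.Sequential(nn.Linear(latent_dim, 32 * 4 * 4), nn.ReLU())
        self.deep = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, 2, 1), nn.Tanh(),
        )

    def features(self, w):
        # Intermediate feature map (the analogue of an "f-latent").
        return self.shallow(w).view(-1, 32, 4, 4)

    def synthesize(self, f):
        # Decode an intermediate feature map into a 16x16 image.
        return self.deep(f)

G = ToyGenerator().requires_grad_(False)         # frozen GAN prior

w = torch.randn(10, 64)                          # e.g. 10 classes at ipc=1
f = G.features(w).detach().requires_grad_(True)  # learnable intermediate features
opt = torch.optim.Adam([f], lr=0.01)

for step in range(100):
    images = G.synthesize(f)      # distilled images are decoded, never stored as pixels
    # Placeholder objective: the real methods use gradient-, distribution-,
    # or trajectory-matching losses against real data (DC / DM / MTT).
    loss = images.square().mean()
    opt.zero_grad()
    loss.backward()               # gradients flow through the frozen generator into f
    opt.step()
```

Because the learnable parameters sit at an intermediate layer, the frozen downstream layers act as a structured image prior throughout optimization.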
Below are some example commands to run each method.
Using the default hyper-parameters, you should be able to comfortably run each method on a 24GB GPU.
The following command will distill imagenet-birds down to 1 image per class using gradient matching (DC) with StyleGAN:
python h_glad_dc.py --dataset=imagenet-birds --space=wp --ipc=1 --data_path={path_to_dataset}
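The --ipc flag should also accept larger values (an assumption here, though multi-image distillation is implied by the --avg_w note further below), for example:

python h_glad_dc.py --dataset=imagenet-birds --space=wp --ipc=10 --data_path={path_to_dataset}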
The following command will distill imagenet-fruits down to 1 image per class using distribution matching (DM) with StyleGAN:
python h_glad_dm.py --dataset=imagenet-fruits --space=wp --ipc=1 --data_path={path_to_dataset}
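The DM script should likewise accept the other subset keys used in this README, e.g. the imagenet-b key used for MTT below (an assumption that h_glad_dm.py supports it):

python h_glad_dm.py --dataset=imagenet-b --space=wp --ipc=1 --data_path={path_to_dataset}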
To distill with trajectory matching (MTT), you will first need to create the expert trajectories:
python buffer_mtt.py --dataset=imagenet-b --train_epochs=15 --data_path={path_to_dataset}
The following command will then use the buffers we just generated to distill imagenet-b down to 1 image per class using StyleGAN:
python h_glad_mtt.py --dataset=imagenet-b --space=wp --ipc=1 --data_path={path_to_dataset}
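Assuming the other lettered subsets follow the same naming convention (only imagenet-b is confirmed above), the full two-step recipe for another subset would look like:

python buffer_mtt.py --dataset=imagenet-a --train_epochs=15 --data_path={path_to_dataset}
python h_glad_mtt.py --dataset=imagenet-a --space=wp --ipc=1 --data_path={path_to_dataset}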
Adding --rand_f will initialize the f-latents with Gaussian noise.
Adding --special_gan=ffhq or --special_gan=pokemon will use a StyleGAN trained on FFHQ or Pokémon instead of ImageNet.
Adding --learn_g will allow the weights of the StyleGAN to be updated along with the latent codes.
Adding --avg_w will initialize the w-latents with the average w for the respective class. (Do not do this if attempting to distill multiple images per class.)
A combined invocation using some of these flags is sketched below.
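For example, the MTT command above might be combined with some of these options (a hypothetical invocation; we have not verified that every flag combination is supported):

python h_glad_mtt.py --dataset=imagenet-b --space=wp --ipc=1 --rand_f --avg_w --data_path={path_to_dataset}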
If you find our code useful for your research, please cite our paper:
@article{zhong2024hierarchical,
title={Hierarchical Features Matter: A Deep Exploration of GAN Priors for Improved Dataset Distillation},
author={Zhong, Xinhao and Fang, Hao and Chen, Bin and Gu, Xulin and Dai, Tao and Qiu, Meikang and Xia, Shu-Tao},
journal={arXiv preprint arXiv:2406.05704},
year={2024}
}