📃 Paper • 🤗 Checkpoints
Our method accelerates latent diffusion models (LDMs) via data-free multistep latent consistency distillation (MLCD), and proposes data-free latent consistency distillation to efficiently guarantee inter-segment consistency in MLCD.
Furthermore, we introduce a bag of techniques, e.g., distribution matching, adversarial learning, and preference learning, to enhance TLCM’s performance at few-step inference without any real data.
TLCM is also highly flexible: the number of sampling steps can be adjusted from 2 to 8 while still producing outputs competitive with full-step approaches.
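For intuition, the core MLCD training signal looks roughly like the sketch below. This is a minimal toy illustration, not the repo's code: the real method distills an SDXL UNet conditioned on timestep and segment, uses a proper ODE solver for the teacher, and an EMA copy of the student as the target, as detailed in the paper.

```python
import torch
import torch.nn.functional as F

# Toy stand-ins: tiny convs on random latents only illustrate the loss shape.
student = torch.nn.Conv2d(4, 4, 3, padding=1)
teacher = torch.nn.Conv2d(4, 4, 3, padding=1)  # frozen pretrained LDM in practice

def solver_step(model, z, t, t_prev):
    # Placeholder for one ODE-solver step (e.g. DDIM) of the teacher from
    # t to t_prev; the real update depends on the noise schedule.
    return z - (t - t_prev) * model(z)

z_t = torch.randn(1, 4, 32, 32)  # noisy latent sampled inside a segment
t, t_prev = 0.8, 0.7             # adjacent timesteps within the segment

with torch.no_grad():
    z_prev = solver_step(teacher, z_t, t, t_prev)
    target = student(z_prev)     # in practice an EMA copy of the student

# Consistency loss: the student's predictions along the teacher's ODE
# trajectory should agree at the segment boundary.
loss = F.mse_loss(student(z_t), target)
loss.backward()
```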
pip install diffusers
pip install transformers accelerate
We provide an example inference script in this repo. Download the LoRA weights from here and pair them with a base model; SDXL 1.0 is the recommended option. You can then run generation with the following command:
python inference.py --prompt {Your prompt} --output_dir {Your output directory} --lora_path {Lora_directory} --base_model_path {Base_model_directory} --infer-steps 4
Additional parameters are defined in paras.py; you can modify them to suit your requirements.
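If you prefer calling diffusers directly instead of the script, a minimal sketch along the following lines should also work. The LoRA path is a placeholder and the LCM-style scheduler is an assumption on our part (consistency-distilled models are typically sampled this way); inference.py is the authoritative reference for the exact settings.

```python
import torch
from diffusers import StableDiffusionXLPipeline, LCMScheduler

base_model_path = "stabilityai/stable-diffusion-xl-base-1.0"
lora_path = "path/to/tlcm_lora"  # placeholder: directory of the downloaded LoRA weights

pipe = StableDiffusionXLPipeline.from_pretrained(
    base_model_path, torch_dtype=torch.float16, variant="fp16"
).to("cuda")

# Assumed scheduler choice; check the repo's inference.py for the exact config.
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)
pipe.load_lora_weights(lora_path)

image = pipe(
    prompt="a photo of an astronaut riding a horse",
    num_inference_steps=4,   # TLCM supports 2-8 steps
    guidance_scale=1.0,      # consistency-distilled models typically use low/no CFG
).images[0]
image.save("output.png")
```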
Here we present some examples with different sampling steps.
2-Step Sampling
3-Step Sampling
4-Step Sampling
8-Step Sampling
We also provide the latent LPIPS model here. More details are presented in the paper.
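As a rough illustration of what a latent LPIPS metric computes: a learned network maps two VAE latents to a perceptual distance, analogous to pixel-space LPIPS. The module below is a hypothetical stand-in showing only the call pattern; the provided checkpoint's actual architecture is described in the paper.

```python
import torch
import torch.nn as nn

class LatentLPIPS(nn.Module):
    """Hypothetical latent-space LPIPS-style metric (illustrative only)."""
    def __init__(self, channels=4, width=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.SiLU(),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.SiLU(),
        )

    def forward(self, z0, z1):
        f0, f1 = self.features(z0), self.features(z1)
        # LPIPS-style distance: mean squared difference of unit-normalized features.
        f0 = f0 / (f0.norm(dim=1, keepdim=True) + 1e-8)
        f1 = f1 / (f1.norm(dim=1, keepdim=True) + 1e-8)
        return ((f0 - f1) ** 2).mean(dim=(1, 2, 3))

z_a, z_b = torch.randn(2, 1, 4, 64, 64).unbind(0)  # two toy SDXL-sized latents
dist = LatentLPIPS()(z_a, z_b)  # per-sample perceptual distance
```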