Support for "Comfy" lora format.
The keys are just: model.full.model.key.name.lora_up.weight

It is supported by all models that ComfyUI supports.

Now people can just convert loras to this format instead of having to ask me to
implement them.
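For illustration, a minimal sketch (not part of the commit) of the key names this implies for a single model weight. The example model key is hypothetical, and the lora_down.weight pairing is assumed; the commit message only spells out lora_up.weight.

# Sketch: build the "comfy" format key names for one model weight.
# The example key is hypothetical; lora_down pairing is assumed.
def comfy_lora_keys(model_weight_key):
    base = "model." + model_weight_key[:-len(".weight")]
    return base + ".lora_up.weight", base + ".lora_down.weight"

print(comfy_lora_keys("diffusion_model.input_blocks.0.0.weight"))
# ('model.diffusion_model.input_blocks.0.0.lora_up.weight',
#  'model.diffusion_model.input_blocks.0.0.lora_down.weight')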
comfyanonymous committed Aug 7, 2024
1 parent c19dcd3 commit 17030fd
Showing 1 changed file with 1 addition and 0 deletions.
comfy/lora.py
@@ -245,6 +245,7 @@ def model_lora_keys_unet(model, key_map={}):
             key_lora = k[len("diffusion_model."):-len(".weight")].replace(".", "_")
             key_map["lora_unet_{}".format(key_lora)] = k
             key_map["lora_prior_unet_{}".format(key_lora)] = k #cascade lora: TODO put lora key prefix in the model config
+            key_map["model.{}".format(k[:-len(".weight")])] = k #generic lora format without any weird key names
 
    diffusers_keys = comfy.utils.unet_to_diffusers(model.model_config.unet_config)
    for k in diffusers_keys:
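Given that mapping, a rough sketch of a conversion script, assuming you already have lora up/down matrices keyed by the full model weight names they target; the key name and tensor shapes below are made up for illustration, only the output key scheme follows the commit.

# Sketch: rename lora up/down tensors into the "comfy" format and save them.
import torch
import safetensors.torch

# Hypothetical input: model weight key -> (up, down) lora matrices.
lora_pairs = {
    "diffusion_model.input_blocks.0.0.weight": (torch.zeros(8, 4), torch.zeros(4, 16)),
}

out = {}
for model_key, (up, down) in lora_pairs.items():
    base = "model." + model_key[:-len(".weight")]
    out[base + ".lora_up.weight"] = up
    out[base + ".lora_down.weight"] = down

safetensors.torch.save_file(out, "converted_lora.safetensors")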
