
[LoRA] "Incompatible keys detected" error when open popular LoRA models from civitai #10599

Open
RIOFornium opened this issue Jan 17, 2025 · 3 comments · May be fixed by #10532
Assignees
Labels
bug Something isn't working

Comments

@RIOFornium

Describe the bug

Hello!

I tried to load the following LoRA models (trained for Flux.1 D):
https://civitai.com/models/332248?modelVersionId=1086989
https://civitai.com/models/290836?modelVersionId=981868

But diffusers raised the error "Incompatible keys detected".

Reproduction

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16)
pipe.load_lora_weights('Dynamic Poses V2.safetensors')

or

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16)
pipe.load_lora_weights('Multiple Views4030.safetensors')
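The key names in the log below (`.lora_down.weight` / `.lora_up.weight` plus `.alpha`) follow the Kohya naming scheme rather than the PEFT-style `lora_A` / `lora_B` scheme diffusers uses internally, which is the likely source of the incompatibility. A minimal sketch for checking which scheme a checkpoint uses before loading it (the `detect_lora_format` helper is hypothetical, written for this report; reading the real file would use `safetensors.safe_open`, assuming the `safetensors` package is installed):

```python
def detect_lora_format(keys):
    """Classify a LoRA state dict by its key naming scheme."""
    if any(".lora_down.weight" in k for k in keys):
        return "kohya"       # lora_down / lora_up weights with a per-module alpha
    if any(".lora_A.weight" in k for k in keys):
        return "diffusers"   # PEFT-style lora_A / lora_B weights
    return "unknown"

# Reading the keys from the actual checkpoint would look roughly like:
#   from safetensors import safe_open
#   with safe_open("Dynamic Poses V2.safetensors", framework="pt") as f:
#       print(detect_lora_format(list(f.keys())))
```

Running this against the key names in the log below classifies the checkpoint as "kohya", so the question is whether the diffusers Kohya-to-PEFT conversion path recognizes these particular keys.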

Logs

Incompatible keys detected:
lora_transformer_single_transformer_blocks_0_attn_to_k.alpha, lora_transformer_single_transformer_blocks_0_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_0_attn_to_q.alpha, lora_transformer_single_transformer_blocks_0_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_0_attn_to_v.alpha, lora_transformer_single_transformer_blocks_0_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_0_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_k.alpha, lora_transformer_single_transformer_blocks_10_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_q.alpha, lora_transformer_single_transformer_blocks_10_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_10_attn_to_v.alpha, lora_transformer_single_transformer_blocks_10_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_10_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_k.alpha, lora_transformer_single_transformer_blocks_11_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_q.alpha, lora_transformer_single_transformer_blocks_11_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_11_attn_to_v.alpha, lora_transformer_single_transformer_blocks_11_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_11_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_k.alpha, 
lora_transformer_single_transformer_blocks_12_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_q.alpha, lora_transformer_single_transformer_blocks_12_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_12_attn_to_v.alpha, lora_transformer_single_transformer_blocks_12_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_12_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_k.alpha, lora_transformer_single_transformer_blocks_13_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_q.alpha, lora_transformer_single_transformer_blocks_13_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_13_attn_to_v.alpha, lora_transformer_single_transformer_blocks_13_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_13_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_k.alpha, lora_transformer_single_transformer_blocks_14_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_14_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_q.alpha, lora_transformer_single_transformer_blocks_14_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_14_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_14_attn_to_v.alpha, lora_transformer_single_transformer_blocks_14_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_14_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_k.alpha, lora_transformer_single_transformer_blocks_15_attn_to_k.lora_down.weight, 
lora_transformer_single_transformer_blocks_15_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_q.alpha, lora_transformer_single_transformer_blocks_15_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_15_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_15_attn_to_v.alpha, lora_transformer_single_transformer_blocks_15_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_15_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_k.alpha, lora_transformer_single_transformer_blocks_16_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_q.alpha, lora_transformer_single_transformer_blocks_16_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_16_attn_to_v.alpha, lora_transformer_single_transformer_blocks_16_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_16_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_k.alpha, lora_transformer_single_transformer_blocks_17_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_q.alpha, lora_transformer_single_transformer_blocks_17_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_17_attn_to_v.alpha, lora_transformer_single_transformer_blocks_17_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_17_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_18_attn_to_k.alpha, lora_transformer_single_transformer_blocks_18_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_k.lora_up.weight, 
lora_transformer_single_transformer_blocks_18_attn_to_q.alpha, lora_transformer_single_transformer_blocks_18_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_18_attn_to_v.alpha, lora_transformer_single_transformer_blocks_18_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_18_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_k.alpha, lora_transformer_single_transformer_blocks_19_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_q.alpha, lora_transformer_single_transformer_blocks_19_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_19_attn_to_v.alpha, lora_transformer_single_transformer_blocks_19_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_19_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_k.alpha, lora_transformer_single_transformer_blocks_1_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_q.alpha, lora_transformer_single_transformer_blocks_1_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_1_attn_to_v.alpha, lora_transformer_single_transformer_blocks_1_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_1_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_k.alpha, lora_transformer_single_transformer_blocks_20_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_q.alpha, 
lora_transformer_single_transformer_blocks_20_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_20_attn_to_v.alpha, lora_transformer_single_transformer_blocks_20_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_20_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_k.alpha, lora_transformer_single_transformer_blocks_21_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_q.alpha, lora_transformer_single_transformer_blocks_21_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_21_attn_to_v.alpha, lora_transformer_single_transformer_blocks_21_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_21_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_k.alpha, lora_transformer_single_transformer_blocks_22_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_q.alpha, lora_transformer_single_transformer_blocks_22_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_22_attn_to_v.alpha, lora_transformer_single_transformer_blocks_22_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_22_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_k.alpha, lora_transformer_single_transformer_blocks_23_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_23_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_q.alpha, lora_transformer_single_transformer_blocks_23_attn_to_q.lora_down.weight, 
lora_transformer_single_transformer_blocks_23_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_23_attn_to_v.alpha, lora_transformer_single_transformer_blocks_23_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_23_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_k.alpha, lora_transformer_single_transformer_blocks_24_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_q.alpha, lora_transformer_single_transformer_blocks_24_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_24_attn_to_v.alpha, lora_transformer_single_transformer_blocks_24_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_24_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_k.alpha, lora_transformer_single_transformer_blocks_25_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_q.alpha, lora_transformer_single_transformer_blocks_25_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_25_attn_to_v.alpha, lora_transformer_single_transformer_blocks_25_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_25_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_26_attn_to_k.alpha, lora_transformer_single_transformer_blocks_26_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_26_attn_to_q.alpha, lora_transformer_single_transformer_blocks_26_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_q.lora_up.weight, 
lora_transformer_single_transformer_blocks_26_attn_to_v.alpha, lora_transformer_single_transformer_blocks_26_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_26_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_k.alpha, lora_transformer_single_transformer_blocks_27_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_q.alpha, lora_transformer_single_transformer_blocks_27_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_27_attn_to_v.alpha, lora_transformer_single_transformer_blocks_27_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_27_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_k.alpha, lora_transformer_single_transformer_blocks_28_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_q.alpha, lora_transformer_single_transformer_blocks_28_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_28_attn_to_v.alpha, lora_transformer_single_transformer_blocks_28_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_28_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_k.alpha, lora_transformer_single_transformer_blocks_29_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_q.alpha, lora_transformer_single_transformer_blocks_29_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_29_attn_to_v.alpha, 
lora_transformer_single_transformer_blocks_29_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_29_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_k.alpha, lora_transformer_single_transformer_blocks_2_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_q.alpha, lora_transformer_single_transformer_blocks_2_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_2_attn_to_v.alpha, lora_transformer_single_transformer_blocks_2_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_2_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_k.alpha, lora_transformer_single_transformer_blocks_30_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_q.alpha, lora_transformer_single_transformer_blocks_30_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_30_attn_to_v.alpha, lora_transformer_single_transformer_blocks_30_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_30_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_k.alpha, lora_transformer_single_transformer_blocks_31_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_31_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_q.alpha, lora_transformer_single_transformer_blocks_31_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_31_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_31_attn_to_v.alpha, lora_transformer_single_transformer_blocks_31_attn_to_v.lora_down.weight, 
lora_transformer_single_transformer_blocks_31_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_k.alpha, lora_transformer_single_transformer_blocks_32_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_q.alpha, lora_transformer_single_transformer_blocks_32_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_32_attn_to_v.alpha, lora_transformer_single_transformer_blocks_32_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_32_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_k.alpha, lora_transformer_single_transformer_blocks_33_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_q.alpha, lora_transformer_single_transformer_blocks_33_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_33_attn_to_v.alpha, lora_transformer_single_transformer_blocks_33_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_33_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_34_attn_to_k.alpha, lora_transformer_single_transformer_blocks_34_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_34_attn_to_q.alpha, lora_transformer_single_transformer_blocks_34_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_34_attn_to_v.alpha, lora_transformer_single_transformer_blocks_34_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_34_attn_to_v.lora_up.weight, 
lora_transformer_single_transformer_blocks_35_attn_to_k.alpha, lora_transformer_single_transformer_blocks_35_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_35_attn_to_q.alpha, lora_transformer_single_transformer_blocks_35_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_35_attn_to_v.alpha, lora_transformer_single_transformer_blocks_35_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_35_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_k.alpha, lora_transformer_single_transformer_blocks_36_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_q.alpha, lora_transformer_single_transformer_blocks_36_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_36_attn_to_v.alpha, lora_transformer_single_transformer_blocks_36_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_36_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_k.alpha, lora_transformer_single_transformer_blocks_37_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_q.alpha, lora_transformer_single_transformer_blocks_37_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_37_attn_to_v.alpha, lora_transformer_single_transformer_blocks_37_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_37_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_k.alpha, 
lora_transformer_single_transformer_blocks_3_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_q.alpha, lora_transformer_single_transformer_blocks_3_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_3_attn_to_v.alpha, lora_transformer_single_transformer_blocks_3_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_3_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_k.alpha, lora_transformer_single_transformer_blocks_4_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_q.alpha, lora_transformer_single_transformer_blocks_4_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_4_attn_to_v.alpha, lora_transformer_single_transformer_blocks_4_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_4_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_k.alpha, lora_transformer_single_transformer_blocks_5_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_q.alpha, lora_transformer_single_transformer_blocks_5_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_5_attn_to_v.alpha, lora_transformer_single_transformer_blocks_5_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_5_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_6_attn_to_k.alpha, lora_transformer_single_transformer_blocks_6_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_k.lora_up.weight, 
lora_transformer_single_transformer_blocks_6_attn_to_q.alpha, lora_transformer_single_transformer_blocks_6_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_6_attn_to_v.alpha, lora_transformer_single_transformer_blocks_6_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_6_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_k.alpha, lora_transformer_single_transformer_blocks_7_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_q.alpha, lora_transformer_single_transformer_blocks_7_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_7_attn_to_v.alpha, lora_transformer_single_transformer_blocks_7_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_7_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_k.alpha, lora_transformer_single_transformer_blocks_8_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_q.alpha, lora_transformer_single_transformer_blocks_8_attn_to_q.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_8_attn_to_v.alpha, lora_transformer_single_transformer_blocks_8_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_8_attn_to_v.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_k.alpha, lora_transformer_single_transformer_blocks_9_attn_to_k.lora_down.weight, lora_transformer_single_transformer_blocks_9_attn_to_k.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_q.alpha, lora_transformer_single_transformer_blocks_9_attn_to_q.lora_down.weight, 
lora_transformer_single_transformer_blocks_9_attn_to_q.lora_up.weight, lora_transformer_single_transformer_blocks_9_attn_to_v.alpha, lora_transformer_single_transformer_blocks_9_attn_to_v.lora_down.weight, lora_transformer_single_transformer_blocks_9_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_0_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_0_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_add_out.alpha, lora_transformer_transformer_blocks_0_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_k.alpha, lora_transformer_transformer_blocks_0_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_out_0.alpha, lora_transformer_transformer_blocks_0_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_q.alpha, lora_transformer_transformer_blocks_0_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_0_attn_to_v.alpha, lora_transformer_transformer_blocks_0_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_0_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_k_proj.lora_down.weight, 
lora_transformer_transformer_blocks_10_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_10_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_10_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_10_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_add_out.alpha, lora_transformer_transformer_blocks_10_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_k.alpha, lora_transformer_transformer_blocks_10_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_out_0.alpha, lora_transformer_transformer_blocks_10_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_q.alpha, lora_transformer_transformer_blocks_10_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_10_attn_to_v.alpha, lora_transformer_transformer_blocks_10_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_10_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_11_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_11_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_11_attn_add_v_proj.lora_down.weight, 
lora_transformer_transformer_blocks_11_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_11_attn_to_add_out.alpha, lora_transformer_transformer_blocks_11_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_11_attn_to_add_out.lora_up.weight, … (the list continues with the same pattern for every double transformer block 1–18: an alpha, lora_down.weight, and lora_up.weight entry for each of the attn add_k_proj, add_q_proj, add_v_proj, to_add_out, to_k, to_out_0, to_q, and to_v projections)
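For context, the rejected keys follow the kohya-ss naming convention: flattened `lora_transformer_*` module names with `lora_down`/`lora_up`/`alpha` tensors, whereas diffusers' PEFT-backed loader expects dotted module paths with `lora_A`/`lora_B` weights. As a rough illustration only (the real converter inside diffusers handles many more cases; the function name and the `to_out_0` special case here are assumptions for this sketch), a single key could be translated like this:

```python
import re

def kohya_key_to_diffusers(key: str) -> str:
    """Rough, illustrative translation of one kohya-style Flux LoRA key
    into diffusers/PEFT naming. Not the actual diffusers converter."""
    name, _, suffix = key.partition(".")           # split module path from tensor suffix
    name = name.removeprefix("lora_transformer_")  # drop the kohya prefix
    # Re-dot the block index and attention module:
    # "transformer_blocks_11_attn_to_k" -> "transformer_blocks.11.attn.to_k"
    name = re.sub(r"_blocks_(\d+)_attn_", r"_blocks.\1.attn.", name)
    name = name.replace("to_out_0", "to_out.0")    # assumed special case
    suffix = {
        "lora_down.weight": "lora_A.weight",       # kohya "down" == PEFT "A"
        "lora_up.weight": "lora_B.weight",         # kohya "up"   == PEFT "B"
        "alpha": "alpha",
    }[suffix]
    return f"transformer.{name}.{suffix}"
```

For example, `lora_transformer_transformer_blocks_11_attn_to_k.lora_down.weight` would map to `transformer.transformer_blocks.11.attn.to_k.lora_A.weight`.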

System Info

runpod/pytorch:2.4.0-py3.11-cuda12.4.1-devel-ubuntu22.04
Nvidia A40
Latest main diffusers branch

Who can help?

@sayakpaul

@RIOFornium RIOFornium added the bug Something isn't working label Jan 17, 2025
@sayakpaul (Member)

Will fold this into #10532.

@sayakpaul sayakpaul self-assigned this Jan 17, 2025
@sayakpaul (Member)

@RIOFornium Can you check #10532? I have pushed some changes that allow loading the LoRAs you mentioned.

import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    'black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16
)
pipe.load_lora_weights(
    "sayakpaul/different-lora-from-civitai", weight_name="dynamic_poses_v2.safetensors"
)

Works now.
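Until a fix like this is available, one way to see up front that a checkpoint is in the foreign kohya-style format is to inspect its keys before calling `load_lora_weights` (diffusers detects and converts kohya checkpoints automatically when it recognizes them; this heuristic and its function name are made up for illustration):

```python
def looks_like_kohya_lora(keys) -> bool:
    """Heuristic check: kohya-style checkpoints flatten module paths into
    'lora_*' names and store 'lora_down'/'lora_up' tensors, whereas
    diffusers/PEFT state dicts use dotted paths with 'lora_A'/'lora_B'."""
    return any(
        k.startswith("lora_") and (".lora_down." in k or ".lora_up." in k)
        for k in keys
    )
```

With a local `.safetensors` file, the key names can be read without loading any tensors, e.g. `with safetensors.safe_open(path, framework="pt") as f: keys = f.keys()`.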

@sayakpaul sayakpaul linked a pull request Jan 18, 2025 that will close this issue
@RIOFornium (Author)

Hello!
I tested both LoRAs today, and they are working!
Thank you very much for the quick fix, you are the best :-)
