Hello!

I tried to load the following LoRAs (Flux.1 D):
https://civitai.com/models/332248?modelVersionId=1086989
https://civitai.com/models/290836?modelVersionId=981868

But diffusers raised the error "Incompatible keys detected".
```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe.load_lora_weights("Dynamic Poses V2.safetensors")
```

or

```python
pipe = FluxPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16)
pipe.load_lora_weights("Multiple Views4030.safetensors")
```
```
Incompatible keys detected:

lora_transformer_single_transformer_blocks_0_attn_to_k.alpha,
lora_transformer_single_transformer_blocks_0_attn_to_k.lora_down.weight,
lora_transformer_single_transformer_blocks_0_attn_to_k.lora_up.weight,
lora_transformer_single_transformer_blocks_0_attn_to_q.alpha,
lora_transformer_single_transformer_blocks_0_attn_to_q.lora_down.weight,
lora_transformer_single_transformer_blocks_0_attn_to_q.lora_up.weight,
lora_transformer_single_transformer_blocks_0_attn_to_v.alpha,
lora_transformer_single_transformer_blocks_0_attn_to_v.lora_down.weight,
lora_transformer_single_transformer_blocks_0_attn_to_v.lora_up.weight,
...
lora_transformer_transformer_blocks_0_attn_add_k_proj.alpha,
lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_down.weight,
lora_transformer_transformer_blocks_0_attn_add_k_proj.lora_up.weight,
...
lora_transformer_transformer_blocks_15_attn_add_k_proj.lora_up.weight,
lora_transformer_transformer_blocks_16_attn_add_k_proj.alpha,
lora_transformer_transformer_blocks_16_attn_add_k_proj.lora_down.weight,
```

The full list follows the same pattern throughout: an `alpha`, `lora_down.weight`, and `lora_up.weight` entry for `attn_to_q` / `attn_to_k` / `attn_to_v` in every `single_transformer_blocks_0` through `single_transformer_blocks_37`, and for `attn_add_q_proj` / `attn_add_k_proj` / `attn_add_v_proj` / `attn_to_add_out` / `attn_to_q` / `attn_to_k` / `attn_to_v` / `attn_to_out_0` in `transformer_blocks_0` onward (the pasted log is truncated at `transformer_blocks_16`).
lora_transformer_transformer_blocks_16_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_16_attn_to_add_out.alpha, lora_transformer_transformer_blocks_16_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_16_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_16_attn_to_k.alpha, lora_transformer_transformer_blocks_16_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_16_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_16_attn_to_out_0.alpha, lora_transformer_transformer_blocks_16_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_16_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_16_attn_to_q.alpha, lora_transformer_transformer_blocks_16_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_16_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_16_attn_to_v.alpha, lora_transformer_transformer_blocks_16_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_16_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_17_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_17_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_17_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_17_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_17_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_17_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_17_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_17_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_17_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_17_attn_to_add_out.alpha, lora_transformer_transformer_blocks_17_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_17_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_17_attn_to_k.alpha, lora_transformer_transformer_blocks_17_attn_to_k.lora_down.weight, 
lora_transformer_transformer_blocks_17_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_17_attn_to_out_0.alpha, lora_transformer_transformer_blocks_17_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_17_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_17_attn_to_q.alpha, lora_transformer_transformer_blocks_17_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_17_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_17_attn_to_v.alpha, lora_transformer_transformer_blocks_17_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_17_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_18_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_18_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_18_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_18_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_18_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_18_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_18_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_18_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_18_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_18_attn_to_add_out.alpha, lora_transformer_transformer_blocks_18_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_18_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_18_attn_to_k.alpha, lora_transformer_transformer_blocks_18_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_18_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_18_attn_to_out_0.alpha, lora_transformer_transformer_blocks_18_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_18_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_18_attn_to_q.alpha, lora_transformer_transformer_blocks_18_attn_to_q.lora_down.weight, 
lora_transformer_transformer_blocks_18_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_18_attn_to_v.alpha, lora_transformer_transformer_blocks_18_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_18_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_1_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_1_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_1_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_1_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_1_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_1_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_1_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_1_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_1_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_1_attn_to_add_out.alpha, lora_transformer_transformer_blocks_1_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_1_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_1_attn_to_k.alpha, lora_transformer_transformer_blocks_1_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_1_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_1_attn_to_out_0.alpha, lora_transformer_transformer_blocks_1_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_1_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_1_attn_to_q.alpha, lora_transformer_transformer_blocks_1_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_1_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_1_attn_to_v.alpha, lora_transformer_transformer_blocks_1_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_1_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_2_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_2_attn_add_k_proj.lora_down.weight, 
lora_transformer_transformer_blocks_2_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_2_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_2_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_2_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_2_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_2_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_2_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_2_attn_to_add_out.alpha, lora_transformer_transformer_blocks_2_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_2_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_2_attn_to_k.alpha, lora_transformer_transformer_blocks_2_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_2_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_2_attn_to_out_0.alpha, lora_transformer_transformer_blocks_2_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_2_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_2_attn_to_q.alpha, lora_transformer_transformer_blocks_2_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_2_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_2_attn_to_v.alpha, lora_transformer_transformer_blocks_2_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_2_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_3_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_3_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_3_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_3_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_3_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_3_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_3_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_3_attn_add_v_proj.lora_down.weight, 
lora_transformer_transformer_blocks_3_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_3_attn_to_add_out.alpha, lora_transformer_transformer_blocks_3_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_3_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_3_attn_to_k.alpha, lora_transformer_transformer_blocks_3_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_3_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_3_attn_to_out_0.alpha, lora_transformer_transformer_blocks_3_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_3_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_3_attn_to_q.alpha, lora_transformer_transformer_blocks_3_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_3_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_3_attn_to_v.alpha, lora_transformer_transformer_blocks_3_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_3_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_4_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_4_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_4_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_4_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_4_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_4_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_4_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_4_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_4_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_4_attn_to_add_out.alpha, lora_transformer_transformer_blocks_4_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_4_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_4_attn_to_k.alpha, lora_transformer_transformer_blocks_4_attn_to_k.lora_down.weight, 
lora_transformer_transformer_blocks_4_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_4_attn_to_out_0.alpha, lora_transformer_transformer_blocks_4_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_4_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_4_attn_to_q.alpha, lora_transformer_transformer_blocks_4_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_4_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_4_attn_to_v.alpha, lora_transformer_transformer_blocks_4_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_4_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_5_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_5_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_5_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_5_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_5_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_5_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_5_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_5_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_5_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_5_attn_to_add_out.alpha, lora_transformer_transformer_blocks_5_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_5_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_5_attn_to_k.alpha, lora_transformer_transformer_blocks_5_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_5_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_5_attn_to_out_0.alpha, lora_transformer_transformer_blocks_5_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_5_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_5_attn_to_q.alpha, lora_transformer_transformer_blocks_5_attn_to_q.lora_down.weight, 
lora_transformer_transformer_blocks_5_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_5_attn_to_v.alpha, lora_transformer_transformer_blocks_5_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_5_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_6_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_6_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_6_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_6_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_6_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_6_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_6_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_6_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_6_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_6_attn_to_add_out.alpha, lora_transformer_transformer_blocks_6_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_6_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_6_attn_to_k.alpha, lora_transformer_transformer_blocks_6_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_6_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_6_attn_to_out_0.alpha, lora_transformer_transformer_blocks_6_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_6_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_6_attn_to_q.alpha, lora_transformer_transformer_blocks_6_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_6_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_6_attn_to_v.alpha, lora_transformer_transformer_blocks_6_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_6_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_7_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_7_attn_add_k_proj.lora_down.weight, 
lora_transformer_transformer_blocks_7_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_7_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_7_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_7_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_7_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_7_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_7_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_7_attn_to_add_out.alpha, lora_transformer_transformer_blocks_7_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_7_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_7_attn_to_k.alpha, lora_transformer_transformer_blocks_7_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_7_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_7_attn_to_out_0.alpha, lora_transformer_transformer_blocks_7_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_7_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_7_attn_to_q.alpha, lora_transformer_transformer_blocks_7_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_7_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_7_attn_to_v.alpha, lora_transformer_transformer_blocks_7_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_7_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_8_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_8_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_8_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_8_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_8_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_8_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_8_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_8_attn_add_v_proj.lora_down.weight, 
lora_transformer_transformer_blocks_8_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_8_attn_to_add_out.alpha, lora_transformer_transformer_blocks_8_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_8_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_8_attn_to_k.alpha, lora_transformer_transformer_blocks_8_attn_to_k.lora_down.weight, lora_transformer_transformer_blocks_8_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_8_attn_to_out_0.alpha, lora_transformer_transformer_blocks_8_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_8_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_8_attn_to_q.alpha, lora_transformer_transformer_blocks_8_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_8_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_8_attn_to_v.alpha, lora_transformer_transformer_blocks_8_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_8_attn_to_v.lora_up.weight, lora_transformer_transformer_blocks_9_attn_add_k_proj.alpha, lora_transformer_transformer_blocks_9_attn_add_k_proj.lora_down.weight, lora_transformer_transformer_blocks_9_attn_add_k_proj.lora_up.weight, lora_transformer_transformer_blocks_9_attn_add_q_proj.alpha, lora_transformer_transformer_blocks_9_attn_add_q_proj.lora_down.weight, lora_transformer_transformer_blocks_9_attn_add_q_proj.lora_up.weight, lora_transformer_transformer_blocks_9_attn_add_v_proj.alpha, lora_transformer_transformer_blocks_9_attn_add_v_proj.lora_down.weight, lora_transformer_transformer_blocks_9_attn_add_v_proj.lora_up.weight, lora_transformer_transformer_blocks_9_attn_to_add_out.alpha, lora_transformer_transformer_blocks_9_attn_to_add_out.lora_down.weight, lora_transformer_transformer_blocks_9_attn_to_add_out.lora_up.weight, lora_transformer_transformer_blocks_9_attn_to_k.alpha, lora_transformer_transformer_blocks_9_attn_to_k.lora_down.weight, 
lora_transformer_transformer_blocks_9_attn_to_k.lora_up.weight, lora_transformer_transformer_blocks_9_attn_to_out_0.alpha, lora_transformer_transformer_blocks_9_attn_to_out_0.lora_down.weight, lora_transformer_transformer_blocks_9_attn_to_out_0.lora_up.weight, lora_transformer_transformer_blocks_9_attn_to_q.alpha, lora_transformer_transformer_blocks_9_attn_to_q.lora_down.weight, lora_transformer_transformer_blocks_9_attn_to_q.lora_up.weight, lora_transformer_transformer_blocks_9_attn_to_v.alpha, lora_transformer_transformer_blocks_9_attn_to_v.lora_down.weight, lora_transformer_transformer_blocks_9_attn_to_v.lora_up.weight
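The keys in the error above follow the Kohya/sd-scripts naming convention (`lora_transformer_*` with `.lora_down.weight` / `.lora_up.weight` / `.alpha` suffixes) rather than the diffusers/PEFT format, which is why the loader reports them as incompatible. As a rough diagnostic sketch (the helper name is mine, not part of the diffusers API), a checkpoint's keys can be sniffed for this convention before loading:

```python
# Illustrative helper (hypothetical, not a diffusers function): detect whether
# a LoRA state dict uses the Kohya naming scheme seen in the error log above.
def looks_like_kohya_flux_lora(keys):
    """Return True if any key matches the Kohya LoRA naming convention."""
    return any(
        k.startswith("lora_transformer_")
        and k.endswith((".alpha", ".lora_down.weight", ".lora_up.weight"))
        for k in keys
    )

sample = [
    "lora_transformer_single_transformer_blocks_0_attn_to_k.alpha",
    "lora_transformer_transformer_blocks_1_attn_to_q.lora_down.weight",
]
print(looks_like_kohya_flux_lora(sample))  # → True
```

The key list itself can be read without loading tensors via `safetensors.safe_open(path, framework="pt")` and its `.keys()` method.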
Will club this in #10532
@RIOFornium can you check #10532? I have pushed some changes to allow loading of the LoRAs you mentioned.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    'black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16
)
pipe.load_lora_weights(
    "sayakpaul/different-lora-from-civitai",
    weight_name="dynamic_poses_v2.safetensors"
)
Works now.
Hello! I tested both LoRAs today, and they are working! Thank you very much for the fast fix, you are the best :-)
Describe the bug
Hello Dears!
I tried to load the following models (Flux.1 D):
https://civitai.com/models/332248?modelVersionId=1086989
https://civitai.com/models/290836?modelVersionId=981868
But diffusers raised the error "Incompatible keys detected".
Reproduction
import torch
from diffusers import FluxPipeline
pipe = FluxPipeline.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16)
pipe.load_lora_weights('Dynamic Poses V2.safetensors')
or
import torch
from diffusers import FluxPipeline
pipe = FluxPipeline.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.bfloat16)
pipe.load_lora_weights('Multiple Views4030.safetensors')
Logs
System Info
runpod/pytorch:2.4.0-py3.11-cuda12.4.1-devel-ubuntu22.04
Nvidia A40
Latest main diffusers branch
Who can help?
@sayakpaul