LoRA error

#1
by LSI - opened

Unfortunately, the LoRA reports an error with the GGUF model:
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_k_proj.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_k_proj.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_q_proj.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_q_proj.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_v_proj.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.add_v_proj.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_add_out.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_add_out.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_k.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_k.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_out.0.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_out.0.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_q.lora_A.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_q.lora_B.weight
lora key not loaded: base_model.model.transformer_blocks.0.attn.to_v.lora_A.weight

Wuli-Art org

Sorry, this turbo LoRA has not been tested on GGUF models yet. Could you try setting fused=False when patching this LoRA and see if it works?

The problem is that the key-name mapping is incorrect: the keys point to "base_model.model" instead of "diffusion_model".

I made a patch and I'm testing it; if it works, I'll upload it right away.
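A patch along these lines could be a simple prefix remap over the LoRA state dict. This is a minimal sketch, not the actual patch: the prefixes come from the error log and the discussion above, and the function name is hypothetical.

```python
def remap_lora_keys(state_dict,
                    old_prefix="base_model.model.",
                    new_prefix="diffusion_model."):
    """Rename LoRA keys from the PEFT-style 'base_model.model.' prefix
    to the 'diffusion_model.' prefix expected by the GGUF loader.
    Keys without the old prefix are passed through unchanged."""
    remapped = {}
    for key, value in state_dict.items():
        if key.startswith(old_prefix):
            remapped[new_prefix + key[len(old_prefix):]] = value
        else:
            remapped[key] = value
    return remapped


# Example with one of the failing keys from the log above:
sd = {"base_model.model.transformer_blocks.0.attn.to_q.lora_A.weight": "tensor"}
print(list(remap_lora_keys(sd)))
# ['diffusion_model.transformer_blocks.0.attn.to_q.lora_A.weight']
```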

Thanks for adding LoRA support for ComfyUI-GGUF. Great job!
