High noise lora: key not loaded

#13
by claws-of-death - opened

Hello,

Apologies if this is a trivial issue on my end — I’m fairly new to this. I tried setting up a workflow with this lora, but I keep getting errors stating that keys from the high noise version cannot be loaded. If I remove the high noise version, the errors disappear, and generation completes, but the resulting video is very blurry and noisy.

Here’s what my workflow looks like (model loading part):
[screenshot of the model-loading nodes in the workflow]

I’m using the standard model wan22-i2v-14b-fp8-high-scaled. Is this model incompatible with the LoRA I’m using? I also tried a GGUF version of WAN 2.2, but the result was the same.

Thanks very much for any guidance!

lora key not loaded: blocks.0.cross_attn.k.lora_A.default.weight
lora key not loaded: blocks.0.cross_attn.k.lora_B.default.weight
lora key not loaded: blocks.0.cross_attn.o.lora_A.default.weight
lora key not loaded: blocks.0.cross_attn.o.lora_B.default.weight
lora key not loaded: blocks.0.cross_attn.q.lora_A.default.weight
lora key not loaded: blocks.0.cross_attn.q.lora_B.default.weight
lora key not loaded: blocks.0.cross_attn.v.lora_A.default.weight
lora key not loaded: blocks.0.cross_attn.v.lora_B.default.weight
lora key not loaded: blocks.0.ffn.0.lora_A.default.weight
lora key not loaded: blocks.0.ffn.0.lora_B.default.weight
lora key not loaded: blocks.0.ffn.2.lora_A.default.weight
lora key not loaded: blocks.0.ffn.2.lora_B.default.weight
lora key not loaded: blocks.0.self_attn.k.lora_A.default.weight
lora key not loaded: blocks.0.self_attn.k.lora_B.default.weight
lora key not loaded: blocks.0.self_attn.o.lora_A.default.weight
lora key not loaded: blocks.0.self_attn.o.lora_B.default.weight
lora key not loaded: blocks.0.self_attn.q.lora_A.default.weight
lora key not loaded: blocks.0.self_attn.q.lora_B.default.weight
lora key not loaded: blocks.0.self_attn.v.lora_A.default.weight
lora key not loaded: blocks.0.self_attn.v.lora_B.default.weight
lora key not loaded: blocks.1.cross_attn.k.lora_A.default.weight
...
(the same set of keys repeats for every remaining block)
etc.
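For anyone hitting the same wall: "lora key not loaded" usually means the key names inside the LoRA file don't line up with the parameter names the loader expects for your model. A quick way to debug is to compare the two sets of names yourself. The snippet below is a minimal, hypothetical sketch of that check, using plain string matching; the `.lora_A.default.weight` / `.lora_B.default.weight` suffix convention is taken from the log above, and the matching rule (strip the suffix, look the base path up in the model) is an assumption, not how any particular loader is implemented.

```python
# Hypothetical helper: find which LoRA keys have no counterpart in the model,
# by stripping the PEFT-style suffix and looking the base path up in the
# model's parameter names. Purely illustrative string matching.

def split_lora_key(key):
    """Strip a '.lora_A/.lora_B.default.weight' suffix to get the base path."""
    for suffix in (".lora_A.default.weight", ".lora_B.default.weight"):
        if key.endswith(suffix):
            return key[: -len(suffix)]
    return None  # not a LoRA weight key

def report_unmatched(lora_keys, model_param_paths):
    """Return the LoRA keys whose base path is absent from the model."""
    model_paths = set(model_param_paths)
    return [k for k in lora_keys if split_lora_key(k) not in model_paths]

# Example: the model exposes blocks.0.cross_attn.{q,k} but the LoRA also
# targets blocks.0.cross_attn.v, so the v key is reported as unmatched.
lora_keys = [
    "blocks.0.cross_attn.q.lora_A.default.weight",
    "blocks.0.cross_attn.k.lora_A.default.weight",
    "blocks.0.cross_attn.v.lora_A.default.weight",
]
model_paths = ["blocks.0.cross_attn.q", "blocks.0.cross_attn.k"]
print(report_unmatched(lora_keys, model_paths))
# → ['blocks.0.cross_attn.v.lora_A.default.weight']
```

In practice you would feed `lora_keys` from the LoRA's state dict (e.g. via the safetensors library's `load_file`) and `model_paths` from the model's parameter names; if most keys come back unmatched, the LoRA was trained for a different model (or, as in my case, it was simply the wrong file).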

EDIT: It was indeed a trivial issue on my end. I got confused by all the different resources; I'm not sure which LoRA I was using, but it was obviously not the correct one. After redownloading the right file, the issue was solved. Please close.

I'm getting the same errors, hmm. My models are correct. Maybe you changed something else that fixed it?

The above model works and fixes the OP's problem for me.
