"lora key not loaded:" messages get spammed by the default LoRA loader.
When I run the rank-32 LoRA with the int4 rank-32 version of Nunchaku, it produces this large error.
The models I am using:
Diffusion model: svdq-int4_r32-qwen-image-edit-2509.safetensors
LoRA: svdq-int4_r32-qwen-image-edit-2509-lightningv2.0-4steps.safetensors
The error:
lora key not loaded: transformer_blocks.8.attn.to_add_out.smooth_factor_orig
lora key not loaded: transformer_blocks.8.attn.to_add_out.wscales
lora key not loaded: transformer_blocks.8.attn.to_out.0.bias
lora key not loaded: transformer_blocks.8.attn.to_out.0.proj_down
lora key not loaded: transformer_blocks.8.attn.to_out.0.proj_up
lora key not loaded: transformer_blocks.8.attn.to_out.0.qweight
lora key not loaded: transformer_blocks.8.attn.to_out.0.smooth_factor
lora key not loaded: transformer_blocks.8.attn.to_out.0.smooth_factor_orig
lora key not loaded: transformer_blocks.8.attn.to_out.0.wscales
lora key not loaded: transformer_blocks.8.attn.to_qkv.bias
lora key not loaded: transformer_blocks.8.attn.to_qkv.proj_down
lora key not loaded: transformer_blocks.8.attn.to_qkv.proj_up
lora key not loaded: transformer_blocks.8.attn.to_qkv.qweight
lora key not loaded: transformer_blocks.8.attn.to_qkv.smooth_factor
lora key not loaded: transformer_blocks.8.attn.to_qkv.smooth_factor_orig
lora key not loaded: transformer_blocks.8.attn.to_qkv.wscales
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.bias
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.proj_down
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.proj_up
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.qweight
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.smooth_factor
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.smooth_factor_orig
lora key not loaded: transformer_blocks.8.img_mlp.net.0.proj.wscales
lora key not loaded: transformer_blocks.8.img_mlp.net.2.bias
lora key not loaded: transformer_blocks.8.img_mlp.net.2.proj_down
lora key not loaded: transformer_blocks.8.img_mlp.net.2.proj_up
lora key not loaded: transformer_blocks.8.img_mlp.net.2.qweight
lora key not loaded: transformer_blocks.8.img_mlp.net.2.smooth_factor
lora key not loaded: transformer_blocks.8.img_mlp.net.2.smooth_factor_orig
lora key not loaded: transformer_blocks.8.img_mlp.net.2.wscales
lora key not loaded: transformer_blocks.8.img_mod.1.bias
lora key not loaded: transformer_blocks.8.img_mod.1.qweight
lora key not loaded: transformer_blocks.8.img_mod.1.wscales
lora key not loaded: transformer_blocks.8.img_mod.1.wzeros
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.bias
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.proj_down
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.proj_up
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.qweight
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.smooth_factor
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.smooth_factor_orig
lora key not loaded: transformer_blocks.8.txt_mlp.net.0.proj.wscales
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.bias
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.proj_down
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.proj_up
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.qweight
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.smooth_factor
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.smooth_factor_orig
lora key not loaded: transformer_blocks.8.txt_mlp.net.2.wscales
lora key not loaded: transformer_blocks.8.txt_mod.1.bias
lora key not loaded: transformer_blocks.8.txt_mod.1.qweight
lora key not loaded: transformer_blocks.8.txt_mod.1.wscales
lora key not loaded: transformer_blocks.8.txt_mod.1.wzeros
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.bias
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.proj_down
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.proj_up
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.qweight
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.smooth_factor
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.smooth_factor_orig
lora key not loaded: transformer_blocks.9.attn.add_qkv_proj.wscales
lora key not loaded: transformer_blocks.9.attn.norm_added_k.weight
lora key not loaded: transformer_blocks.9.attn.norm_added_q.weight
lora key not loaded: transformer_blocks.9.attn.norm_k.weight
lora key not loaded: transformer_blocks.9.attn.norm_q.weight
lora key not loaded: transformer_blocks.9.attn.to_add_out.bias
lora key not loaded: transformer_blocks.9.attn.to_add_out.proj_down
lora key not loaded: transformer_blocks.9.attn.to_add_out.proj_up
lora key not loaded: transformer_blocks.9.attn.to_add_out.qweight
lora key not loaded: transformer_blocks.9.attn.to_add_out.smooth_factor
lora key not loaded: transformer_blocks.9.attn.to_add_out.smooth_factor_orig
lora key not loaded: transformer_blocks.9.attn.to_add_out.wscales
lora key not loaded: transformer_blocks.9.attn.to_out.0.bias
lora key not loaded: transformer_blocks.9.attn.to_out.0.proj_down
lora key not loaded: transformer_blocks.9.attn.to_out.0.proj_up
lora key not loaded: transformer_blocks.9.attn.to_out.0.qweight
lora key not loaded: transformer_blocks.9.attn.to_out.0.smooth_factor
lora key not loaded: transformer_blocks.9.attn.to_out.0.smooth_factor_orig
lora key not loaded: transformer_blocks.9.attn.to_out.0.wscales
lora key not loaded: transformer_blocks.9.attn.to_qkv.bias
lora key not loaded: transformer_blocks.9.attn.to_qkv.proj_down
lora key not loaded: transformer_blocks.9.attn.to_qkv.proj_up
lora key not loaded: transformer_blocks.9.attn.to_qkv.qweight
lora key not loaded: transformer_blocks.9.attn.to_qkv.smooth_factor
lora key not loaded: transformer_blocks.9.attn.to_qkv.smooth_factor_orig
lora key not loaded: transformer_blocks.9.attn.to_qkv.wscales
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.bias
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.proj_down
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.proj_up
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.qweight
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.smooth_factor
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.smooth_factor_orig
lora key not loaded: transformer_blocks.9.img_mlp.net.0.proj.wscales
lora key not loaded: transformer_blocks.9.img_mlp.net.2.bias
lora key not loaded: transformer_blocks.9.img_mlp.net.2.proj_down
lora key not loaded: transformer_blocks.9.img_mlp.net.2.proj_up
lora key not loaded: transformer_blocks.9.img_mlp.net.2.qweight
lora key not loaded: transformer_blocks.9.img_mlp.net.2.smooth_factor
lora key not loaded: transformer_blocks.9.img_mlp.net.2.smooth_factor_orig
lora key not loaded: transformer_blocks.9.img_mlp.net.2.wscales
lora key not loaded: transformer_blocks.9.img_mod.1.bias
lora key not loaded: transformer_blocks.9.img_mod.1.qweight
lora key not loaded: transformer_blocks.9.img_mod.1.wscales
lora key not loaded: transformer_blocks.9.img_mod.1.wzeros
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.bias
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.proj_down
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.proj_up
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.qweight
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.smooth_factor
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.smooth_factor_orig
lora key not loaded: transformer_blocks.9.txt_mlp.net.0.proj.wscales
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.bias
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.proj_down
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.proj_up
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.qweight
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.smooth_factor
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.smooth_factor_orig
lora key not loaded: transformer_blocks.9.txt_mlp.net.2.wscales
lora key not loaded: transformer_blocks.9.txt_mod.1.bias
lora key not loaded: transformer_blocks.9.txt_mod.1.qweight
lora key not loaded: transformer_blocks.9.txt_mod.1.wscales
lora key not loaded: transformer_blocks.9.txt_mod.1.wzeros
lora key not loaded: txt_in.bias
lora key not loaded: txt_in.weight
lora key not loaded: txt_norm.weight
The error log is actually massive, so I had to trim it. Someone suggested using the Nunchaku LoRA loader instead, but that resulted in an error:
File "C:\Users\thesi\OneDrive\Desktop\ComfyUI-Easy-Install\ComfyUI-Easy-Install\ComfyUI\custom_nodes\nunchaku_nodes\nodes\lora\flux.py", line 114, in load_lora
assert isinstance(model_wrapper, ComfyFluxWrapper)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError
svdq-int4_r32-qwen-image-edit-2509-lightningv2.0-4steps.safetensors is not a LoRA; it is a model already merged with the lightningv2.0-4steps LoRA. Load it directly with the Nunchaku Model Loader.
This worked, thanks!
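For anyone hitting this later: the key names in the log above (qweight, wscales, wzeros, smooth_factor, proj_up/proj_down) are SVDQuant quantized-model tensors, not LoRA tensors, which is why a LoRA loader rejects every one of them. A quick sanity check is to inspect the tensor keys of the .safetensors file before deciding which loader to use. This is just an illustrative sketch (the suffix lists are assumptions based on the log above, not an official Nunchaku API):

```python
# Heuristic: classify a checkpoint by its tensor key names.
# Quantized (SVDQuant) models carry keys like "qweight"/"wscales",
# while genuine LoRA files use "lora_up"/"lora_down" (or "lora_A"/"lora_B").

def classify_checkpoint(keys):
    quant_suffixes = (
        "qweight", "wscales", "wzeros",
        "smooth_factor", "smooth_factor_orig",
        "proj_up", "proj_down",
    )
    lora_markers = ("lora_up", "lora_down", "lora_A", "lora_B")
    if any(k.endswith(quant_suffixes) for k in keys):
        return "merged quantized model"  # load with Nunchaku Model Loader
    if any(m in k for k in keys for m in lora_markers):
        return "lora"  # load with a LoRA loader
    return "unknown"

# Sample keys copied from the error log above:
sample = [
    "transformer_blocks.8.attn.to_qkv.qweight",
    "transformer_blocks.8.attn.to_qkv.wscales",
    "transformer_blocks.8.img_mod.1.wzeros",
]
print(classify_checkpoint(sample))  # -> merged quantized model
```

With a real file you could collect the keys via `safetensors.safe_open(path, framework="pt")` and its `.keys()` method, then pass them to the function above.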