Hi, thanks a lot for your great work. I am converting a LoRA file in safetensors format, downloaded from Civitai, using your format_convert.py. I then load the converted .bin file with pipe.unet.load_attn_procs, but I get the following error:
RuntimeError: Error(s) in loading state_dict for LoRACrossAttnProcessor:
size mismatch for to_q_lora.down.weight: copying a param with shape torch.Size([128, 320]) from checkpoint, the shape in current model is torch.Size([4, 320]).
It seems to be related to the configuration of the UNet's attention processors, but I could not find the corresponding documentation. Could you please provide some suggestions?
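For context, the mismatch above suggests the checkpoint was trained with LoRA rank 128, while the attention processors being loaded into were built with the default rank 4: the first dimension of each down.weight is the rank. One way to diagnose this is to inspect the converted state dict before loading. The sketch below is only an illustration of that check, not the repository's actual code; infer_lora_rank is a hypothetical helper, and plain shape tuples stand in for real tensor shapes (with a real checkpoint you would use tensor.shape):

```python
# Sketch: infer the LoRA rank implied by a converted state dict before
# calling load_attn_procs. Shapes are plain tuples standing in for
# tensor shapes; with a real checkpoint, use tensor.shape instead.

def infer_lora_rank(state_dict):
    """Return the LoRA rank implied by the first `down.weight` entry.

    A LoRA down-projection has shape (rank, in_features), so the rank
    is its first dimension. Raises ValueError if no such key exists.
    """
    for name, shape in state_dict.items():
        if name.endswith("down.weight"):
            return shape[0]
    raise ValueError("no LoRA down.weight keys found in state dict")

# Example mirroring the error message above: the checkpoint's
# down-projection is (128, 320), i.e. rank 128, not the default 4.
checkpoint_shapes = {
    "to_q_lora.down.weight": (128, 320),
    "to_q_lora.up.weight": (320, 128),
}
print(infer_lora_rank(checkpoint_shapes))  # → 128
```

If the inferred rank differs from the default of 4, the attention processors presumably need to be constructed with the matching rank before the converted weights can be loaded.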