
size mismatch using the converted .bin file #22

Open
XiaoyuShi97 opened this issue Sep 5, 2023 · 0 comments

Comments

@XiaoyuShi97

Hi, thanks a lot for your great work. I am converting a LoRA file in safetensors format, downloaded from civitai, with your format_convert.py, and then loading the converted .bin file via pipe.unet.load_attn_procs.
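Roughly, the loading step looks like this (a minimal sketch; the base model id and the file name are placeholders, and I am on a diffusers version that still uses LoRACrossAttnProcessor):

```python
import torch
from diffusers import StableDiffusionPipeline

# Base pipeline; the model id is a placeholder for the base model I actually use.
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Load the .bin produced by format_convert.py (file name is a placeholder).
# load_attn_procs also accepts a plain state dict, so I load it explicitly here.
state_dict = torch.load("converted_lora.bin", map_location="cpu")
pipe.unet.load_attn_procs(state_dict)  # <- this raises the size mismatch below
```

This call fails with the following error: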

RuntimeError: Error(s) in loading state_dict for LoRACrossAttnProcessor:

size mismatch for to_q_lora.down.weight: copying a param with shape torch.Size([128, 320]) from checkpoint, the shape in current model is torch.Size([4, 320]).

It seems to be related to the configuration of the UNet's attention processors: the shapes suggest the checkpoint was trained with LoRA rank 128, while the processors in the current model use the default rank of 4. I could not find any documentation covering this, though. Could you please provide some suggestions?
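In case it clarifies what I mean, this is the kind of processor setup I suspect is needed before loading, adapted from the diffusers LoRA training example; the rank=128 is my assumption based on the checkpoint shapes in the error:

```python
from diffusers.models.cross_attention import LoRACrossAttnProcessor

# Recreate every attention processor with the rank the checkpoint seems to use.
lora_attn_procs = {}
for name in pipe.unet.attn_processors.keys():
    # Self-attention (attn1) has no cross-attention dim.
    cross_attention_dim = (
        None if name.endswith("attn1.processor") else pipe.unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = pipe.unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(pipe.unet.config.block_out_channels))[block_id]
    else:  # down_blocks
        block_id = int(name[len("down_blocks.")])
        hidden_size = pipe.unet.config.block_out_channels[block_id]
    lora_attn_procs[name] = LoRACrossAttnProcessor(
        hidden_size=hidden_size,
        cross_attention_dim=cross_attention_dim,
        rank=128,  # assumption: inferred from the [128, 320] shape in the error
    )
pipe.unet.set_attn_processor(lora_attn_procs)
```

I am not sure whether load_attn_procs is supposed to pick the rank up from the checkpoint automatically, which is why I am asking.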
