raise AttributeError("'{}' object has no attribute '{}'".format( AttributeError: 'AttentionBlock' object has no attribute 'to_k' #5

Open
Zetianuser opened this issue Apr 1, 2024 · 2 comments

@Zetianuser

I ran demo.py and hit this error. How can I fix it?
raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'AttentionBlock' object has no attribute 'to_k'

@xvjiarui (Owner) commented Apr 5, 2024

Hi @Zetianuser
Sorry for the late reply. Could you try diffusers 0.14.0?
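
For anyone landing here: newer diffusers releases (0.15 and later) replaced the old AttentionBlock with the Attention class, whose projections are named to_q/to_k/to_v, which matches the missing-to_k error above. A minimal check that the version pin took effect, assuming a pip-managed environment:

```python
# Sanity-check the active diffusers version before running demo.py.
# Assumes the version suggested above was installed with:
#   pip install diffusers==0.14.0
import diffusers

print(diffusers.__version__)
assert diffusers.__version__ == "0.14.0", (
    "this code path expects the pre-0.15 AttentionBlock; newer releases "
    "renamed its query/key/value projections to to_q/to_k/to_v"
)
```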

@Zetianuser (Author)

Thank you for your reply. I tried diffusers 0.14.0 but still get an error:
Cannot load <class 'improv.modeling.meta_arch.modules.VQDecoder'> from /root/.cache/huggingface/hub/models--xvjiarui--IMProv-v1-0/snapshots/bc9d733ae2fc403600f87f8374b2eb544e6bd6dd/image_decoder because the following keys are missing:
decoder.up_blocks.0.attentions.1.value.weight, decoder.up_blocks.0.attentions.2.query.bias, decoder.up_blocks.0.attentions.2.query.weight, decoder.up_blocks.0.attentions.1.proj_attn.bias, decoder.up_blocks.0.attentions.0.query.bias, decoder.up_blocks.0.attentions.0.value.bias, decoder.up_blocks.0.attentions.2.key.bias, decoder.mid_block.attentions.0.key.weight, decoder.up_blocks.0.attentions.1.key.bias, decoder.mid_block.attentions.0.query.weight, decoder.up_blocks.0.attentions.1.proj_attn.weight, decoder.up_blocks.0.attentions.2.proj_attn.bias, decoder.up_blocks.0.attentions.1.key.weight, decoder.mid_block.attentions.0.proj_attn.weight, decoder.up_blocks.0.attentions.2.value.weight, decoder.up_blocks.0.attentions.1.query.weight, decoder.up_blocks.0.attentions.0.key.weight, decoder.mid_block.attentions.0.query.bias, decoder.up_blocks.0.attentions.0.value.weight, decoder.up_blocks.0.attentions.1.query.bias, decoder.mid_block.attentions.0.value.bias, decoder.up_blocks.0.attentions.2.value.bias, decoder.mid_block.attentions.0.proj_attn.bias, decoder.up_blocks.0.attentions.0.query.weight, decoder.up_blocks.0.attentions.1.value.bias, decoder.mid_block.attentions.0.value.weight, decoder.up_blocks.0.attentions.2.proj_attn.weight, decoder.mid_block.attentions.0.key.bias, decoder.up_blocks.0.attentions.0.proj_attn.bias, decoder.up_blocks.0.attentions.2.key.weight, decoder.up_blocks.0.attentions.0.key.bias, decoder.up_blocks.0.attentions.0.proj_attn.weight.
Please make sure to pass low_cpu_mem_usage=False and device_map=None if you want to randomely initialize those weights or else make sure your checkpoint file is correct.
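
If others hit the same missing-keys error: the listed parameters (query, key, value, proj_attn) are the pre-0.15 AttentionBlock names, while checkpoints exported for newer diffusers store them as to_q, to_k, to_v, and to_out.0. One possible workaround, sketched under the assumption that the cached checkpoint uses the newer names and is a plain PyTorch .bin file (the path below is hypothetical), is to rename the keys back before loading:

```python
# Sketch: map newer Attention parameter names back to the pre-0.15
# AttentionBlock names that diffusers 0.14 expects. The rename table is the
# standard diffusers correspondence; the checkpoint path is hypothetical.
import torch

RENAMES = {
    ".to_q.": ".query.",
    ".to_k.": ".key.",
    ".to_v.": ".value.",
    ".to_out.0.": ".proj_attn.",
}

state_dict = torch.load("image_decoder/diffusion_pytorch_model.bin", map_location="cpu")
remapped = {}
for key, tensor in state_dict.items():
    for new_name, old_name in RENAMES.items():
        key = key.replace(new_name, old_name)
    remapped[key] = tensor
torch.save(remapped, "image_decoder/diffusion_pytorch_model.remapped.bin")
```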
