How to replace the attention module of the UNet (instead of the attn_processor) with a custom attention class? #10331
Unanswered
dingbang777 asked this question in Q&A
Replies: 0 comments
I have seen the usual approach of calling `unet.set_attn_processor` to swap out the attn_processor, but what if I want to replace some of the attention modules inside the UNet with my own attention class? GPT suggested a way to do this, but the modification doesn't take effect.
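For context, here is a minimal sketch of what replacing the module class itself (rather than the processor) could look like. It assumes a `UNet2DConditionModel` whose attention layers are instances of `diffusers.models.attention_processor.Attention`; the `CustomAttention` class, the model id, and the `"attn2"` filter (to touch only cross-attention, i.e. only "part" of the UNet) are illustrative assumptions, not code from the question.

```python
# Sketch, not a confirmed solution: swap the class of selected Attention
# modules in place so pretrained weights are kept and only forward() changes.
from diffusers import UNet2DConditionModel
from diffusers.models.attention_processor import Attention


class CustomAttention(Attention):
    # Adds no new attributes, so reassigning __class__ on an existing
    # Attention instance is safe; edit forward() to insert custom logic.
    def forward(self, hidden_states, encoder_hidden_states=None, attention_mask=None, **kwargs):
        # Custom logic would go here; for now delegate to the stock forward.
        return super().forward(
            hidden_states,
            encoder_hidden_states=encoder_hidden_states,
            attention_mask=attention_mask,
            **kwargs,
        )


# Example model id (assumption); any UNet2DConditionModel checkpoint works.
unet = UNet2DConditionModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="unet"
)

for name, module in unet.named_modules():
    # Filter is an example: replace only cross-attention ("attn2") layers.
    if isinstance(module, Attention) and "attn2" in name:
        module.__class__ = CustomAttention
```

One common reason such a modification silently has no effect: `setattr(unet, name, new_module)` with a dotted `name` from `named_modules()` (e.g. `"down_blocks.0.attentions.0..."`) creates a new, unused attribute instead of replacing the child module. You have to either reassign `__class__` in place as above, or walk to the parent module and `setattr` the final attribute name on it.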