Error - diffusers - transformer_flux.py - context_attn_output

Python311\Lib\site-packages\diffusers\models\transformers\transformer_flux.py", line 212, in forward
context_attn_output = c_gate_msa.unsqueeze(1) * context_attn_output
^^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'context_attn_output' where it is not associated with a value

Reason - The Attention call from attention_processor.py returns a tuple of length 1.

What is the problem?


As the error message says, a tuple of length 1 is being returned from Attention instead of the expected pair, so context_attn_output is never assigned before it is used. It's similar to the issue below, but I don't know whether xformers is the cause.
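The failure mode can be sketched in isolation. This is a minimal, hypothetical reproduction (the function names are illustrative, not the actual diffusers code): when the attention processor returns a 1-tuple instead of a pair, a conditional unpack never binds the second variable, and any later use of it raises exactly this UnboundLocalError.

```python
def attention_returning_one_value(x):
    # Hypothetical stand-in for an attention processor that returns only
    # hidden_states as a 1-tuple instead of
    # (hidden_states, encoder_hidden_states).
    return (x,)


def block_forward(x):
    outputs = attention_returning_one_value(x)
    if len(outputs) == 2:
        # Only this branch would bind context_attn_output.
        attn_output, context_attn_output = outputs
    else:
        attn_output = outputs[0]
    # context_attn_output was never assigned above, so referencing it
    # raises UnboundLocalError, mirroring the traceback in the question.
    return context_attn_output * 2


try:
    block_forward(1.0)
except UnboundLocalError as e:
    print("UnboundLocalError:", e)
```

Running this prints the same kind of message as the traceback, which is why the fix usually lies in whichever attention processor is active (e.g. one installed by an xformers/custom-attention code path) returning the wrong number of outputs, rather than in transformer_flux.py itself.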