commit 3ccf00b (1 parent: 3a74713)
src/transformers/models/blt/modeling_blt.py
@@ -630,7 +630,7 @@ def forward(
             )

             layer_idx = idx if self.config.cross_attn_all_layers else 0
-            cross_attention_output, _, _ = self.cross_attn_layers[layer_idx](
+            cross_attention_output, _ = self.cross_attn_layers[layer_idx](
                 hidden_states=patch_embeds,
                 cross_attention_states=hidden_states,
                 attention_mask=cross_mask,
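For context, the change narrows the tuple unpacking so the caller matches a cross-attention layer that returns two values rather than three. Below is a minimal, self-contained sketch of that before/after unpacking; TinyCrossAttentionLayer is a hypothetical stand-in, not the actual BLT implementation, and its two-value return (output, attention_weights) is an assumption made only to illustrate the mismatch.

from typing import Optional, Tuple

import torch
from torch import nn


class TinyCrossAttentionLayer(nn.Module):
    # Hypothetical stand-in for a cross-attention layer; not the real BLT module.

    def __init__(self, hidden_size: int = 8):
        super().__init__()
        self.proj = nn.Linear(hidden_size, hidden_size)

    def forward(
        self,
        hidden_states: torch.Tensor,
        cross_attention_states: torch.Tensor,
        attention_mask: Optional[torch.Tensor] = None,
    ) -> Tuple[torch.Tensor, Optional[torch.Tensor]]:
        # Returns two values (output, attention_weights), so callers must
        # unpack exactly two items, matching the "+" side of the diff above.
        output = self.proj(hidden_states + cross_attention_states)
        return output, None


layer = TinyCrossAttentionLayer()
patch_embeds = torch.randn(1, 4, 8)
hidden_states = torch.randn(1, 4, 8)

# Old unpacking ("-" side) would fail against a two-tuple return:
#   cross_attention_output, _, _ = layer(...)  -> ValueError: not enough values to unpack

# New unpacking ("+" side) matches the two-tuple return.
cross_attention_output, _ = layer(
    hidden_states=patch_embeds,
    cross_attention_states=hidden_states,
)
print(cross_attention_output.shape)  # torch.Size([1, 4, 8])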