Fix hunyuan video attention mask dim (#10454)
* fix

* add coauthor

---------

Co-authored-by: Nerogar <[email protected]>
2 people authored and DN6 committed Jan 15, 2025
1 parent 263b973 commit 2b432ac
Showing 1 changed file with 1 addition and 0 deletions.
@@ -721,6 +721,7 @@ def forward(
         for i in range(batch_size):
             attention_mask[i, : effective_sequence_length[i], : effective_sequence_length[i]] = True
+        attention_mask = attention_mask.unsqueeze(1)  # [B, 1, N, N], for broadcasting across attention heads

         # 4. Transformer blocks
         if torch.is_grad_enabled() and self.gradient_checkpointing:
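The added line works because PyTorch aligns shapes from the trailing dimensions when broadcasting: attention scores have shape [B, H, N, N], while the mask built in the loop is [B, N, N], which cannot broadcast against the head dimension. Inserting a singleton dimension with `unsqueeze(1)` yields [B, 1, N, N], so the same mask expands across all heads. A minimal sketch of the effect (the sizes, `effective_sequence_length` values, and the `masked_fill` usage are illustrative, not taken from the diffusers source):

```python
import torch

# Hypothetical sizes for illustration
batch_size, num_heads, seq_len = 2, 8, 16

# Boolean mask built per batch element, as in forward(): [B, N, N]
attention_mask = torch.zeros(batch_size, seq_len, seq_len, dtype=torch.bool)
effective_sequence_length = [10, 16]  # hypothetical per-sample valid lengths
for i in range(batch_size):
    attention_mask[i, : effective_sequence_length[i], : effective_sequence_length[i]] = True

# Scores are [B, H, N, N]; a [B, N, N] mask would fail to broadcast
# (16 vs. 8 in the head slot), but [B, 1, N, N] expands cleanly to H heads.
attention_mask = attention_mask.unsqueeze(1)  # [B, 1, N, N]

scores = torch.randn(batch_size, num_heads, seq_len, seq_len)
masked = scores.masked_fill(~attention_mask, float("-inf"))
print(masked.shape)  # torch.Size([2, 8, 16, 16])
```

Without the `unsqueeze(1)`, the same `masked_fill` raises a shape mismatch once `num_heads != seq_len`, which is the bug this commit fixes.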
