
Commit 1e43d2e

gongdao123 authored and shreyankg committed
[ROCM] fix native attention function call (vllm-project#13650)
1 parent 3f988ad commit 1e43d2e

File tree

1 file changed: +0 additions, −1 deletion


vllm/attention/backends/rocm_flash_attn.py

Lines changed: 0 additions & 1 deletion
@@ -717,7 +717,6 @@ def forward(
                 self.num_heads,
                 self.head_size,
                 self.scale,
-                causal_mask,
                 attn_masks,
             )
         else:
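
The deleted line removes a stale positional `causal_mask` argument, so the call site in rocm_flash_attn.py lines up with the native attention helper's parameter list again. Below is a minimal, self-contained sketch of this failure mode; the helper name `native_attention` and its signature are illustrative assumptions, not vLLM's exact code.

import torch

# Hypothetical stand-in for the native attention helper this call site targets;
# the parameter list here is an assumption, not vLLM's actual signature.
def native_attention(query, key, value, num_heads, head_size, scale, attn_masks=None):
    # Reshape flat (tokens, num_heads * head_size) tensors into (num_heads, tokens, head_size).
    q = query.view(-1, num_heads, head_size).transpose(0, 1)
    k = key.view(-1, num_heads, head_size).transpose(0, 1)
    v = value.view(-1, num_heads, head_size).transpose(0, 1)
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    if attn_masks is not None:
        scores = scores + attn_masks
    probs = torch.softmax(scores, dim=-1)
    out = torch.matmul(probs, v)
    return out.transpose(0, 1).reshape(-1, num_heads * head_size)

tokens, num_heads, head_size = 4, 2, 8
q = torch.randn(tokens, num_heads * head_size)
k = torch.randn(tokens, num_heads * head_size)
v = torch.randn(tokens, num_heads * head_size)
scale = head_size ** -0.5

# Before the fix: an extra positional `causal_mask` argument shifts every later
# argument by one slot and breaks the call (TypeError or wrong binding):
# native_attention(q, k, v, num_heads, head_size, scale, causal_mask, attn_masks)

# After the fix: the argument list matches the helper's parameters again.
out = native_attention(q, k, v, num_heads, head_size, scale, None)
print(out.shape)  # torch.Size([4, 16])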

0 commit comments
