
Commit 7a01206

fix native attention function call
Signed-off-by: gongdao123 <[email protected]>
Parent: 981f3c8 · Commit: 7a01206

File tree: 1 file changed (+0, −1 lines)

vllm/attention/backends/rocm_flash_attn.py

Lines changed: 0 additions & 1 deletion
@@ -717,7 +717,6 @@ def forward(
                         self.num_heads,
                         self.head_size,
                         self.scale,
-                        causal_mask,
                         attn_masks,
                     )
                 else:
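
For context, here is a minimal, hypothetical sketch of the kind of mismatch this commit fixes. The helper name `_naive_attention`, the `_Sketch` class, and the full signature are assumptions made for illustration; the only thing taken from the diff above is that the extra `causal_mask` argument is no longer passed, leaving `attn_masks` as the final argument of the call.

```python
# Hypothetical sketch, not the actual vLLM code: it illustrates why passing
# both causal_mask and attn_masks would be one positional argument too many
# for a helper whose last parameter is the attention mask.
import torch


def _naive_attention(query, key, value, num_heads, head_size, scale, attn_masks):
    # Stand-in for a native attention helper; the mask(s) come last, so an
    # extra leading mask argument would shift every parameter out of place.
    del num_heads, head_size  # unused in this toy version
    scores = torch.matmul(query, key.transpose(-1, -2)) * scale
    if attn_masks is not None:
        scores = scores + attn_masks
    return torch.matmul(torch.softmax(scores, dim=-1), value)


class _Sketch:
    def __init__(self, num_heads, head_size):
        self.num_heads = num_heads
        self.head_size = head_size
        self.scale = head_size ** -0.5

    def forward(self, query, key, value, attn_masks=None):
        # After commit 7a01206: causal_mask is no longer passed here.
        return _naive_attention(
            query,
            key,
            value,
            self.num_heads,
            self.head_size,
            self.scale,
            attn_masks,
        )


if __name__ == "__main__":
    q = k = v = torch.randn(1, 4, 8)
    print(_Sketch(num_heads=1, head_size=8).forward(q, k, v).shape)
```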
