Commit fae4f82

Bellk17 and WoosukKwon authored
Update vllm/attention/ops/triton_flash_attention.py
Co-authored-by: Woosuk Kwon <[email protected]>
1 parent 65f7edc commit fae4f82

File tree

1 file changed: +1 addition, −1 deletion


vllm/attention/ops/triton_flash_attention.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -415,7 +415,7 @@ def attn_fwd(
         return

     is_mqa = hq != hk
-    if is_mqa:
+    if is_mqa:  # noqa: SIM108
         off_h_k = off_h_q % hk
     else:
         off_h_k = off_h_q
```
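For context, ruff's SIM108 rule suggests collapsing an if/else assignment into a single ternary expression; the added `# noqa: SIM108` keeps the explicit if/else branch instead. A minimal standalone sketch of the head-index mapping shown in the diff (the function name `kv_head_index` is hypothetical, not from the file):

```python
# Hypothetical standalone sketch of the MQA/GQA head mapping from the diff.
def kv_head_index(off_h_q: int, hq: int, hk: int) -> int:
    """Map a query-head index to its key/value-head index."""
    is_mqa = hq != hk
    if is_mqa:  # noqa: SIM108 -- ruff would suggest a ternary here
        # Fewer KV heads than query heads: wrap the query-head index.
        off_h_k = off_h_q % hk
    else:
        # Standard multi-head attention: one KV head per query head.
        off_h_k = off_h_q
    return off_h_k


print(kv_head_index(5, hq=8, hk=2))  # query head 5 maps to KV head 1
```

With 8 query heads sharing 2 KV heads, query head 5 wraps to KV head `5 % 2 == 1`; when `hq == hk`, the mapping is the identity.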

0 commit comments
