forked from Dao-AILab/flash-attention
Pull requests: ROCm/flash-attention
#163 Update fake tensor wrapper for flash_attn and falash_attn_varlen_func… (opened Oct 20, 2025 by sahirema; 1 task)
#106 [Do not merge] vllm layout varlen, Draft (labels: WIP, work in progress; opened Dec 3, 2024 by rocking5566)