Releases: Dao-AILab/flash-attention

v2.8.3 (14 Aug 17:12)

Bump to v2.8.3

v2.8.2 (24 Jul 05:45)

Bump to v2.8.2

v2.8.1 (09 Jul 18:34)

Bump to v2.8.1

v2.8.0.post2 (14 Jun 15:40)

[CI] Build with NVCC_THREADS=2 to avoid OOM

v2.8.0.post1 (14 Jun 12:53)

[CI] Compile with ubuntu-22.04 instead of ubuntu-20.04

v2.8.0 (14 Jun 05:39)

Bump to v2.8.0

v2.7.4.post1 (29 Jan 21:43)

Drop PyTorch 2.1

v2.7.4 (29 Jan 21:34)

Bump to v2.7.4

v2.7.3 (10 Jan 18:01, commit 89c5a7d)

Change version to 2.7.3 (#1437)

Signed-off-by: Kirthi Shankar Sivamani <[email protected]>

v2.7.2.post1 (07 Dec 18:43)

[CI] Use MAX_JOBS=1 with nvcc 12.3, don't need OLD_GENERATOR_PATH
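The [CI] entries above (NVCC_THREADS=2, MAX_JOBS=1) tune compiler parallelism to keep the CUDA build within memory limits. A minimal sketch of applying the same knobs when building flash-attention from source, assuming the usual pip-based install path; the specific values are illustrative, not prescriptive:

```shell
# Build-config fragment (not CI-exact): cap parallel compilation so the
# CUDA build does not run out of memory on smaller machines.
#   MAX_JOBS      - maximum number of concurrent compile jobs
#   NVCC_THREADS  - threads per nvcc invocation
MAX_JOBS=1 NVCC_THREADS=2 pip install flash-attn --no-build-isolation
```

Raising MAX_JOBS speeds up the build at the cost of peak memory; lowering it (as the CI entries do) trades build time for reliability on memory-constrained runners.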