Releases: Dao-AILab/flash-attention

v2.6.2

23 Jul 09:30

Bump to v2.6.2

v2.6.1

11 Jul 15:29

Bump to v2.6.1

v2.6.0.post1

11 Jul 09:55

[CI] Compile with PyTorch 2.4.0.dev20240514

v2.6.0

11 Jul 04:35

Bump to v2.6.0

v2.5.9.post1

26 May 22:36

Limit to MAX_JOBS=1 with CUDA 12.2
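This release note concerns build parallelism: PyTorch's C++/CUDA extension builder honors the MAX_JOBS environment variable, and capping it limits how many compiler jobs run at once (each nvcc job can use a lot of memory). A minimal sketch of building from source with that cap, following the install command in the project's README; treat the exact invocation as illustrative:

```sh
# Cap parallel compilation jobs when building the CUDA extension from source
MAX_JOBS=1 pip install flash-attn --no-build-isolation
```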

v2.5.9

26 May 21:02

Bump to v2.5.9

v2.5.8

26 Apr 17:55

Bump to v2.5.8

v2.5.7

08 Apr 03:15

Bump to v2.5.7

v2.5.6

02 Mar 06:17

Bump to v2.5.6

v2.5.5

21 Feb 23:59

Bump to v2.5.5