Releases: Dao-AILab/flash-attention
v2.5.8
Bump to v2.5.8
v2.5.7
Bump to v2.5.7
v2.5.6
Bump to v2.5.6
v2.5.5
Bump to v2.5.5
v2.5.4
Bump to v2.5.4
v2.5.3
Bump to v2.5.3
v2.5.2
Bump to v2.5.2
v2.5.1.post1
[CI] Install torch 2.3 using index
v2.5.1
Bump to v2.5.1
v2.5.0
Bump to v2.5.0