Releases: Dao-AILab/flash-attention
v2.4.3.post1: [CI] Fix CUDA 12.2.2 compilation
v2.4.3: Bump to v2.4.3
v2.4.2: Bump to v2.4.2
v2.4.1: Bump to v2.4.1
v2.4.0.post1: [CI] Don't compile for Python 3.7 / PyTorch 2.2
v2.4.0: Bump to v2.4.0
v2.3.6: Bump to v2.3.6
v2.3.5: Bump to v2.3.5
v2.3.4: Bump to v2.3.4
v2.3.3: Bump to v2.3.3