Releases · Dao-AILab/flash-attention
v0.2.4
Bump to v0.2.4
v0.2.3
Bump to v0.2.3
v0.2.2
Speed up compilation by splitting into separate .cu files
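The v0.2.2 note refers to splitting the CUDA kernel instantiations across multiple translation units so the build system can compile them in parallel rather than recompiling one large .cu file. Below is a minimal sketch of that pattern in a PyTorch extension's setup.py; the file names, module name, and directory layout are illustrative assumptions, not the repository's actual structure.

```python
# Sketch: listing one .cu file per kernel variant (forward/backward, head dim)
# lets nvcc compile the translation units independently and in parallel.
# File and module names below are illustrative, not the real layout.
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension

setup(
    name="flash_attn_example",
    ext_modules=[
        CUDAExtension(
            name="flash_attn_cuda",
            sources=[
                "csrc/fmha_api.cpp",
                "csrc/fmha_fwd_hdim32.cu",   # forward kernel, head dim 32
                "csrc/fmha_fwd_hdim64.cu",   # forward kernel, head dim 64
                "csrc/fmha_fwd_hdim128.cu",  # forward kernel, head dim 128
                "csrc/fmha_bwd_hdim32.cu",   # backward kernel, head dim 32
                "csrc/fmha_bwd_hdim64.cu",   # backward kernel, head dim 64
                "csrc/fmha_bwd_hdim128.cu",  # backward kernel, head dim 128
            ],
            extra_compile_args={"cxx": ["-O3"], "nvcc": ["-O3"]},
        )
    ],
    cmdclass={"build_ext": BuildExtension},
)
```

With ninja installed, BuildExtension compiles the listed .cu files concurrently (parallelism controlled by the MAX_JOBS environment variable), which is where the compile-time speedup comes from.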
v0.2.1
Bump version to 0.2.1