fix: flash attention import
OrangeSodahub committed Nov 10, 2022
Parent: c9383cf · Commit: 5cedb3d
Showing 1 changed file with 2 additions and 1 deletion: server/clip_server/model/model.py
@@ -27,9 +27,10 @@
 from open_clip.factory import _MODEL_CONFIGS
 
 # Use flash attention
-FLASH_ATTENTION_AVAILABLE = True
 try:
     from clip_server.model.flash_attention import MultiheadAttention
+
+    FLASH_ATTENTION_AVAILABLE = True
 except:
     FLASH_ATTENTION_AVAILABLE = False
