forked from Dao-AILab/flash-attention
Pull requests: vllm-project/flash-attention
#40: Add back flash_attn_func api (and support FA3) [Don't Merge Yet], opened Jan 26, 2025 by LucasWilkinson