
Rebuild latest wheels on main for FlashAttention 2 #805

Closed · winglian opened this issue Jul 21, 2023 · 3 comments · Fixed by #806 or #816
Comments

winglian (Contributor) commented Jul 21, 2023

🐛 Bug

FlashAttention 2 bug Dao-AILab/flash-attention#338 was fixed (in Dao-AILab/flash-attention@9ee0ff1) only after the xformers change that added FlashAttention 2 support, so the current wheels still carry the bug. Rebuilding the wheels on main should pick up the fix for xformers as well.
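
For reference, a minimal sketch along these lines can check whether an installed xformers build actually dispatches to its bundled FlashAttention backend; the explicit `MemoryEfficientAttentionFlashAttentionOp` pin and the tensor shapes are illustrative assumptions about the public `xformers.ops` API, and it needs a CUDA GPU with fp16 inputs:

```python
# Minimal sketch (assumption, not from this issue): pin xformers to its bundled
# FlashAttention op so a missing or broken backend surfaces as an error.
import torch
import xformers
import xformers.ops as xops

print("xformers:", xformers.__version__)

# Arbitrary illustrative shapes: [batch, seq_len, heads, head_dim]
q, k, v = (
    torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
    for _ in range(3)
)

try:
    # Explicitly pin the FlashAttention-backed op instead of letting the
    # dispatcher pick one; this raises if that backend is unavailable.
    out = xops.memory_efficient_attention(
        q, k, v, op=xops.MemoryEfficientAttentionFlashAttentionOp
    )
    print("FlashAttention backend OK; output shape:", tuple(out.shape))
except Exception as exc:  # broad on purpose: dispatch errors vary by version
    print("FlashAttention backend unavailable:", exc)
```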

danthe3rd (Contributor) commented:

Hi,
Thanks for opening this issue! Happy to accept a PR to update the submodule.
Otherwise I'll do it later today.

peterjc123 commented:
@danthe3rd Any plan to upload the latest wheels to PyPI with flash attention v2?

tmm1 (Contributor) commented Aug 3, 2023

Note that flash-attn 2.0.4 was released with additional bug fixes. The submodule should be bumped again.

tmm1 mentioned this issue Aug 3, 2023