improve flash-attn error message #209

Open · rpeys wants to merge 1 commit into main
Conversation

@rpeys commented Jun 4, 2024

Updating the error message from "flash-attn is not installed" to "a compatible version of flash-attn is not installed", since the former message is confusing to users who do have flash-attn installed, just a more recent version (not <1.0.5) in which flash_attn.flash_attention.FlashMHA is no longer included.
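For context, FlashMHA only exists in flash-attn releases before 1.0.5, so projects typically guard the import and warn on failure. A minimal sketch of that pattern with the proposed wording (the flag name, fallback behavior, and exact message are illustrative, not this repository's actual code):

```python
import warnings

try:
    # FlashMHA was removed from flash-attn after 1.0.4, so this import only
    # succeeds on flash-attn < 1.0.5.
    from flash_attn.flash_attention import FlashMHA

    flash_attn_available = True
except ImportError:
    flash_attn_available = False
    warnings.warn(
        "a compatible version of flash-attn is not installed "
        "(flash-attn < 1.0.5 is required for FlashMHA); "
        "falling back to standard attention"
    )
```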


@rpeys (author) commented Jun 4, 2024

Actually, I am getting this warning even in an environment that has flash-attn 1.0.4, so now I'm not sure why the warning appears, and my updated error message isn't sufficient to explain it. I will write back if I figure it out.


@rpeys (author) commented Jun 6, 2024

The reason I was getting the error even with flash-attn 1.0.4 installed was that my PyTorch version was incompatible with my flash-attn version, so the import statement itself was failing. I still think updating the warning message may reduce confusion.
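Since an ImportError here can be triggered by an incompatible PyTorch build as well as by a missing or too-new package, one option (a sketch of the idea, not code from this PR) is to surface the underlying exception text in the warning so users can tell the cases apart:

```python
import warnings

try:
    from flash_attn.flash_attention import FlashMHA
except ImportError as e:
    FlashMHA = None
    # The same except branch fires whether flash-attn is absent, is >= 1.0.5
    # (where FlashMHA was dropped), or fails to import against the installed
    # PyTorch, so include the original error message in the warning.
    warnings.warn(
        f"a compatible version of flash-attn is not available ({e}); "
        "falling back to standard attention"
    )
```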
