Support for Kaggle P100 GPU #4
**Dr.Patrick** (issue author):

First and foremost, congratulations on the great R&D work here. I wonder whether there is any plan to support Kaggle GPUs? I suspect there will be a great deal of interest from the community in testing this.

Cheers,
Dr.Patrick

**Maintainer:**

Thanks for the interest!

---

**kuizhiqing** pushed a commit to kuizhiqing/flash-attention that referenced this issue (May 26, 2023):

* add for alpha_fold2
* add some extra settings
* fix some bugs
* fix some changes
* fix some bugs (2nd)
* add another initialization of Gmem_tile_qkv and Gmem_tile_o
* add some compensation for try..catch
* fix mistake in flash_attn_fwd
* commit for code style and bug check
* fix some bugs for flash_attn_with_bias-mask
* add more prints for pointer debugging
* add some bug test cases
* backward function
* fix bugs
* make some changes for backward
* fix compiling error
* quote all printf debug and fix interface error, fix typo
* remove all printf
* split files
* remove useless debug code
* split fwd and bwd execution functions
* remove useless code (3rd and 4th passes)
* fix compiling error
* remove const

**njhill** pushed a commit to njhill/flash-attention that referenced this issue (Sep 27, 2024).
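Not part of the thread itself, but the technically relevant point behind the question is CUDA compute capability: the Kaggle P100 is sm_60, while recent FlashAttention releases target Ampere (sm_80) or newer. Below is a minimal, hypothetical helper (not part of the flash-attention codebase) that sketches the check, assuming sm_80 as the minimum:

```python
# Hypothetical helper, not from flash-attention: check whether a GPU's CUDA
# compute capability meets an assumed FlashAttention minimum of Ampere (sm_80).
# Known capabilities: P100 = sm_60, T4 = sm_75, A100 = sm_80.

MIN_CAPABILITY = (8, 0)  # assumed minimum: Ampere (sm_80)

def supports_flash_attention(major: int, minor: int) -> bool:
    """Return True if compute capability (major, minor) meets the assumed minimum."""
    return (major, minor) >= MIN_CAPABILITY

print(supports_flash_attention(6, 0))  # Kaggle P100 (sm_60) -> False
print(supports_flash_attention(7, 5))  # Kaggle T4   (sm_75) -> False
print(supports_flash_attention(8, 0))  # A100        (sm_80) -> True
```

In practice, the capability of the installed GPU can be queried with `torch.cuda.get_device_capability()` and passed to a check like this before attempting to install or import the library.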