Is FlashInfer v0.2 compatible with torch==2.3 and CUDA 12.1?
Answered by yzh119 on Dec 20, 2024:
Wheels for this combination are available at https://flashinfer.ai/whl/cu121/torch2.3/flashinfer/
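Based on the wheel page above, installation would presumably point pip at the matching index. This is a sketch, assuming the index root is the URL above minus the trailing package segment (a convention of pip-style wheel indexes, not stated in the answer):

```shell
# Sketch: install FlashInfer from the CUDA 12.1 / torch 2.3 wheel index.
# Assumption: the index root is the package page URL without the
# trailing "flashinfer/" segment, passed via pip's -i / --index-url flag.
pip install flashinfer -i https://flashinfer.ai/whl/cu121/torch2.3/
```

Make sure the installed torch build matches the cu121 wheels (e.g. `torch==2.3.*+cu121`), since mixing CUDA versions between torch and FlashInfer can fail at import time.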
Answer selected by Dr-Left.
Note that the FA3 (FlashAttention-3) template requires CUDA 12.4 or newer.