fix error w/ flash-attn v2
tmm1 committed Aug 4, 2023
1 parent 52ca08d commit 038a4a5
Showing 1 changed file with 1 addition and 0 deletions: xformers/ops/fmha/flash.py
@@ -67,6 +67,7 @@ def _flash_fwd(
         out_padded,
         softmax_lse,
         p,
+        _,  # rng_state
     ) = _C_flashattention.varlen_fwd(
         query,
         key,
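The patch works because flash-attn v2's `varlen_fwd` returns one more value (an RNG state) than the tuple the caller previously unpacked, and Python raises `ValueError` when unpacking arities do not match. A minimal sketch of the failure and the fix, using a hypothetical stand-in function rather than the real `_C_flashattention` extension:

```python
# Hypothetical stand-in for _C_flashattention.varlen_fwd: flash-attn v2
# appends an rng_state to the tuple returned by v1.
def varlen_fwd_v2():
    return ("out_padded", "softmax_lse", "p", "rng_state")

# Unpacking v2's 4-tuple with the old 3-name pattern raises ValueError
# ("too many values to unpack"), which is the error the commit fixes.
try:
    out_padded, softmax_lse, p = varlen_fwd_v2()
except ValueError as exc:
    print("v1-style unpack fails:", exc)

# The commit's fix: add a throwaway name for the extra return value.
out_padded, softmax_lse, p, _ = varlen_fwd_v2()  # _ holds rng_state
```

Binding the extra value to `_` with an explanatory comment (rather than indexing the tuple) keeps the unpack self-documenting and fails loudly again if the return arity changes in a future release.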
