
Commit

update util to work for additive attention mask
Diana Liskovich committed Nov 8, 2021
1 parent 8f3c01e commit 8c6d250
Showing 1 changed file with 4 additions and 1 deletion.
xformers/components/attention/utils.py (4 additions & 1 deletion)
@@ -49,7 +49,10 @@ def maybe_merge_masks(
         if att_mask is None:
             att_mask = key_padding_mask
         # Assumption is that False means to mask.
-        att_mask = att_mask.logical_and(key_padding_mask)
+        elif att_mask.dtype == torch.bool:
+            att_mask = att_mask.logical_and(key_padding_mask)
+        else:
+            att_mask = att_mask.masked_fill(~key_padding_mask, float("-inf"))

     return att_mask
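For context, a minimal sketch (not part of the commit) of what the new else branch enables: when att_mask is an additive float mask, positions that the boolean key_padding_mask marks as padding (False) are filled with -inf, so they receive zero weight after softmax. Tensor shapes and values below are illustrative only, and the expand step only approximates what the util does internally.

import torch

# Illustrative shapes: batch * num_heads = 2, query length = 3, key length = 4.
att_mask = torch.zeros(2, 3, 4)  # additive float mask, 0.0 means "keep"
key_padding_mask = torch.tensor(
    [[True, True, True, False],
     [True, True, False, False]]
)  # boolean mask, False means "mask out"

# Broadcast the padding mask over the query dimension before merging.
key_padding_mask = key_padding_mask.unsqueeze(1).expand(-1, 3, -1)

# The new branch: additive masks are merged with masked_fill instead of logical_and.
merged = att_mask.masked_fill(~key_padding_mask, float("-inf"))

# Padded key positions now carry -inf and get zero attention weight after softmax.
weights = torch.softmax(merged, dim=-1)
print(weights[0, 0])  # last key position has weight 0.0
print(weights[1, 0])  # last two key positions have weight 0.0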
