Special positional embedding
jloveric committed Nov 30, 2023
1 parent d7ac527 commit 8f71f6c
Showing 1 changed file with 2 additions and 2 deletions.
language_interpolation/networks.py (4 changes: 2 additions & 2 deletions)
@@ -343,8 +343,8 @@ def __init__(

     def forward(self, x: Tensor) -> Tensor:

-        # Scale the input to [-0.5*max_context, 0.5*max_context] where every token is bumped by 1
-        # the 0th token is 0 and the max_context token is 0.5*max_context-1
+        # Scale the input to [-1, 1] where every token is bumped by 1/(2*max_context)
+        # the 0th token is -1 and the nth token is 1
         # THIS LOOKS RIGHT!
         xp = ((0.5 * (x + 1) + self.positional_embedding[: x.shape[1]])*2 - self.max_context)/self.max_context
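
For orientation, here is a minimal, self-contained sketch of what the new scaling does, assuming positional_embedding is simply torch.arange(max_context) and the tokens in x are already normalized to [-1, 1]; both are assumptions on my part, since neither definition appears in this diff. Under these assumptions the token at position 0 can reach -1, the token at position max_context - 1 can reach 1, and xp stays inside [-1, 1]:

    import torch

    max_context = 16
    # Assumed embedding: one integer offset per position (not shown in this diff)
    positional_embedding = torch.arange(max_context)

    # One sequence of max_context tokens already scaled to [-1, 1]
    x = torch.linspace(-1, 1, steps=max_context).unsqueeze(0)  # shape (1, max_context)

    # The line from the commit, with self.* replaced by the local names above
    xp = ((0.5 * (x + 1) + positional_embedding[: x.shape[1]]) * 2 - max_context) / max_context

    print(xp.min().item(), xp.max().item())  # -1.0 1.0

Note that with an arange embedding the per-position step works out to 2/max_context rather than the 1/(2*max_context) bump stated in the new comment, so the exact step depends on how positional_embedding is actually built; that is why the embedding above is flagged as an assumption.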

