Commit

Removed print statement
tatp22 committed Jun 27, 2020
1 parent 4f5b68d commit bfb9f27
Showing 2 changed files with 1 addition and 1 deletion.
1 change: 1 addition & 0 deletions examples/example_full_attn.py
@@ -15,6 +15,7 @@
     checkpoint_level="C1",
     parameter_sharing="none",
     k_reduce_by_layer=1,
+    full_attention=True,
 )
 x = torch.randn(1, 512, 16)
 y = model(x)
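For context, a minimal self-contained version of the example with the new flag might look like the sketch below. Only the arguments visible in the hunk above come from this commit; the remaining constructor arguments (input_size, channels, dim_k, nhead, depth) are assumptions about the example's elided header, chosen to match the (1, 512, 16) input.

import torch
from linformer_pytorch import Linformer

# Sketch of examples/example_full_attn.py after this commit; arguments
# not shown in the diff hunk are assumed for illustration only.
model = Linformer(
    input_size=512,           # sequence length, matches torch.randn(1, 512, 16)
    channels=16,              # embedding dimension, matches the input's last dim
    dim_k=64,                 # assumed low-rank projection dimension
    nhead=4,                  # assumed number of attention heads
    depth=2,                  # assumed number of encoder layers
    checkpoint_level="C1",
    parameter_sharing="none",
    k_reduce_by_layer=1,
    full_attention=True,      # the line added in this commit
)
x = torch.randn(1, 512, 16)
y = model(x)
print(y.shape)                # expected: torch.Size([1, 512, 16])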
1 change: 0 additions & 1 deletion linformer_pytorch/linformer_pytorch.py
@@ -87,7 +87,6 @@ def forward(self, Q, K, V, **kwargs):
         P_bar = QW/torch.sqrt(torch.tensor(self.dim).type(Q.type()))
         P_bar = P_bar.softmax(dim=-1)

-        print(P_bar.shape)
         # Only save this when visualizing
         if "visualize" in kwargs and kwargs["visualize"] == True:
             self.P_bar = P_bar
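The deleted line was an unconditional debug print of the scaled attention matrix P_bar. The visualize branch kept by this hunk still exposes the same tensor, so its shape can be inspected on demand instead. A standalone sketch of the computation around the removed print follows, with the head dimension and tensor shapes chosen purely for illustration:

import torch

# Illustrative stand-in for the values inside forward(); the real shapes
# depend on the model configuration (full_attention, dim_k, ...).
dim = 64                          # plays the role of self.dim in the hunk
QW = torch.randn(1, 512, 512)     # unscaled scores, (batch, seq, seq) here

P_bar = QW / torch.sqrt(torch.tensor(dim).type(QW.type()))
P_bar = P_bar.softmax(dim=-1)

# Instead of the removed unconditional print, check the shape only when
# needed, mirroring the "visualize" branch that the hunk keeps:
print(P_bar.shape)                # torch.Size([1, 512, 512])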
