[Past CI] 🔥 Leave Past CI failures in the past 🔥 #20861
Conversation
The documentation is not available anymore as the PR was closed or merged.
@@ -1361,7 +1361,7 @@ def forward(
         # [batch_size, num_candidates, retriever_proj_size]
         candidate_score = candidate_score.view(-1, self.config.num_candidates, self.config.retriever_proj_size)
         # [batch_size, num_candidates]
-        relevance_score = torch.einsum("BD,BND->BN", query_score, candidate_score)
+        relevance_score = torch.einsum("bd,bnd->bn", query_score, candidate_score)
torch 1.8 complains about uppercase subscripts here.
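The subscript case only affects torch 1.8's einsum parser; the contraction itself is unchanged. As a minimal sketch (in pure Python, on nested lists rather than tensors), "bd,bnd->bn" computes relevance[b][n] = sum over d of query[b][d] * candidate[b][n][d]:

```python
# Pure-Python sketch of what torch.einsum("bd,bnd->bn", query_score, candidate_score)
# computes; names and shapes here are illustrative, not taken from the model code.

def relevance_scores(query, candidate):
    """query: [B][D] nested lists; candidate: [B][N][D] nested lists -> [B][N]."""
    return [
        [
            # dot product of the query row with each candidate row (sum over d)
            sum(q_d * c_d for q_d, c_d in zip(q_row, cand_nd))
            for cand_nd in cand_rows
        ]
        for q_row, cand_rows in zip(query, candidate)
    ]

query = [[1.0, 2.0]]               # B=1, D=2
candidate = [[[3.0, 4.0],          # B=1, N=2, D=2
              [5.0, 6.0]]]
print(relevance_scores(query, candidate))  # [[11.0, 17.0]]
```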
@@ -3203,9 +3203,9 @@ def forward(self, inputs: torch.Tensor, pos: Optional[torch.Tensor] = None, netw
         if self.prep_type != "patches":
             # move channels to last dimension, as the _build_network_inputs method below expects this
             if inputs.ndim == 4:
-                inputs = torch.permute(inputs, (0, 2, 3, 1))
+                inputs = inputs.permute(0, 2, 3, 1)
torch 1.8 has no torch.permute, but x.permute works.
Thanks for all the fixes!
Fantastic @ydshieh, thank you for your work!
* torch.jit._state
* Fix past CI
* Fix for perceiver
* Fix REALM
* Fix for Bloom
* Fix for SwinMode
* Fix for TrajectoryTransformerModel
* Fix for test_wav2vec2_with_lm
* make style

Co-authored-by: ydshieh <ydshieh@users.noreply.github.com>
What does this PR do?
Make Past CI (with torch 1.8) cleaner.