
Fix conversion with rtmdet-inst, vit, conformer #2453

Merged — 2 commits merged on Sep 22, 2023

Commit: fix scaled_dot_product_attention (b6cca22)
Codecov / codecov/patch — failed Sep 21, 2023 in 0s

31.25% of diff hit (target 48.40%)

Annotations

Check warnings from codecov/patch: added lines in mmdeploy/pytorch/functions/multi_head_attention_forward.py were not covered by tests.

- mmdeploy/pytorch/functions/multi_head_attention_forward.py#L69
- mmdeploy/pytorch/functions/multi_head_attention_forward.py#L71
- mmdeploy/pytorch/functions/multi_head_attention_forward.py#L74
- mmdeploy/pytorch/functions/multi_head_attention_forward.py#L76
- mmdeploy/pytorch/functions/multi_head_attention_forward.py#L78-L81
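For context on what the patched function computes: `scaled_dot_product_attention` is the core attention kernel that deployment rewrites (such as the one in this PR's `multi_head_attention_forward.py`) typically decompose into export-friendly primitives. The sketch below is a hedged NumPy illustration of that math only — it is not the actual mmdeploy rewrite, and the function name and `attn_mask` parameter follow PyTorch's convention by assumption:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, attn_mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (illustrative sketch)."""
    d_k = q.shape[-1]
    # Similarity scores between queries and keys, scaled by sqrt(d_k).
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)  # shape (..., L_q, L_k)
    if attn_mask is not None:
        # Additive mask, e.g. large negative values for disallowed positions.
        scores = scores + attn_mask
    # Numerically stable softmax over the key axis.
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Weighted sum of values.
    return weights @ v
```

Decomposing the fused op into matmul, scale, softmax, and matmul like this is the usual way to keep a model convertible to backends (ONNX, TensorRT) that lack the fused attention kernel.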