This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Fix bug for speedup module and enhance the Ut for speedup #3279

Merged · 5 commits · Jan 8, 2021

Conversation

@zheng-ningxin (Contributor) commented Jan 7, 2021

Fix bugs in the speedup module and enhance its unit tests.
(1) The generated mask tensor should be on the same device as the original tensor.
(2) Due to the Cat operator, the current speedup module may add an all-ones mask to some layers; the second bug causes the speedup module to skip this mask (added by fix mask conflict), leading to a shape mismatch.
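The first fix can be illustrated with a short sketch. This is not the actual NNI code; `make_full_mask` and the `Conv2d` layer are made up here for illustration. The point is that creating the mask with `torch.ones_like` makes it inherit the shape, dtype, and device of the weight it masks, so a model on the GPU never receives a CPU-resident mask.

```python
import torch
import torch.nn as nn

def make_full_mask(tensor):
    # ones_like inherits dtype AND device from `tensor`,
    # so the mask always lives where the weight lives.
    return torch.ones_like(tensor)

conv = nn.Conv2d(3, 8, kernel_size=3)
mask = make_full_mask(conv.weight)
assert mask.device == conv.weight.device
assert mask.shape == conv.weight.shape
```

Building the mask directly from the target tensor avoids having to thread an explicit `device` argument through the speedup code paths.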

Review thread on nni/compression/pytorch/speedup/infer_shape.py (outdated, resolved)
@@ -128,6 +128,18 @@ def generate_random_sparsity(model):
'sparsity': sparsity})
return cfg_list

def generate_random_sparsity_v2(model):
A reviewer (Contributor) commented:

It may be better to take the ratio as a parameter, like `def generate_random_sparsity(model, layer_ratio):`. Just a personal opinion; the current implementation is fine with me.
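The reviewer's suggestion could look like the following sketch. The signature is hypothetical (the `layer_ratio` parameter and the plain name-list input are assumptions, not NNI's actual API): each layer is included in the config list with probability `layer_ratio`, and included layers get a random sparsity, mirroring the `{'op_names': ..., 'sparsity': ...}` entries produced by the existing helper.

```python
import random

def generate_random_sparsity_v2(layer_names, layer_ratio=0.5):
    """Hypothetical variant of generate_random_sparsity that assigns
    a random sparsity to only a `layer_ratio` fraction of layers."""
    cfg_list = []
    for name in layer_names:
        # Keep this layer with probability layer_ratio.
        if random.uniform(0, 1) < layer_ratio:
            sparsity = random.uniform(0.5, 0.99)
            cfg_list.append({'op_names': [name], 'sparsity': sparsity})
    return cfg_list

random.seed(0)
cfg = generate_random_sparsity_v2(['conv1', 'conv2', 'fc'], layer_ratio=1.0)
```

With `layer_ratio=1.0` every layer is pruned (equivalent to the original helper); lower ratios exercise the speedup path where some layers carry no mask at all.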

Signed-off-by: Ningxin <Ningxin.Zheng@microsoft.com>
@zheng-ningxin zheng-ningxin merged commit c66b747 into microsoft:v2.0 Jan 8, 2021
3 participants