This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

[Model Compression] MixedMaskerPruner #3627

Closed
wants to merge 16 commits into from

Conversation

@J-shang (Contributor) commented May 10, 2021

Depends on #3507

@ultmaster ultmaster linked an issue May 12, 2021 that may be closed by this pull request
@J-shang J-shang marked this pull request as ready for review May 13, 2021 03:08


class MixedPrunerMasker(WeightMasker):
    def __init__(self, model, pruner, maskers_config_dict):
Contributor:

A naive question: in class OneshotPruner, the argument maskers_config_dict is passed to MixedPrunerMasker as a dictionary, but here we take it directly. Would anything go wrong?

Contributor Author:

No, you can see in this line that algo_kwargs is expanded with **, which is equivalent to passing key1=value1, key2=value2, ....
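
To make the expansion concrete, here is a minimal, self-contained sketch (the function and the dict contents are hypothetical, not from this PR):

def make_masker(model, pruner, maskers_config_dict):
    # maskers_config_dict arrives here as a plain dict
    return maskers_config_dict

algo_kwargs = {'maskers_config_dict': {'masker_0': ('level', {})}}

# make_masker(None, None, **algo_kwargs) is equivalent to
# make_masker(None, None, maskers_config_dict={'masker_0': ('level', {})})
assert make_masker(None, None, **algo_kwargs) == algo_kwargs['maskers_config_dict']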

counter = {}
for config in config_list:
    assert 'masker_name' not in config, 'maskers_config_dict should be set if use masker_name'
    if 'pruning_algo' not in config:
Contributor:

It seems that if pruning_algo is not set by the user, LevelPrunerMasker is used by default. Users won't notice this without reading the code. Maybe we should document this corner case.

Contributor Author:

Yes, this should be mentioned; I will add it.

Contributor Author:

updated.
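
For illustration, a hedged sketch of a config_list that exercises this default; the keys follow this thread, but the value format of 'pruning_algo' is an assumption:

config_list = [
    {
        'sparsity': 0.5,
        'op_types': ['Conv2d'],
        'pruning_algo': ('l1', {})  # hypothetical value format
    },
    {
        'sparsity': 0.5,
        'op_types': ['Linear']
        # no 'pruning_algo' here: per the discussion above,
        # LevelPrunerMasker would be used by default
    }
]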

for mask_type in masks[layer]:
    assert hasattr(
        name2wrapper[layer], mask_type), "there is no attribute '%s' in wrapper on %s" % (mask_type, layer)
    setattr(name2wrapper[layer], mask_type, masks[layer][mask_type])
Contributor:

During forward, wrapper.weight_mask is multiplied with the input. The related code is here. Why don't we need to set wrapper.weight_mask here? Is this correct? If it is, where do we set wrapper.weight_mask with the calculated masks?

Contributor Author:

Yes, in fact mask_type includes both weight_mask and bias_mask, so wrapper.weight_mask is set by the setattr call above.
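
A minimal sketch of the structure this reply implies: masks maps a layer name to a dict keyed by mask type, so the setattr loop also sets wrapper.weight_mask. The shapes and the wrapper class here are illustrative:

import torch

masks = {
    'conv1': {
        'weight_mask': torch.ones(16, 3, 3, 3),
        'bias_mask': torch.ones(16),
    }
}

class DummyWrapper:
    weight_mask = None
    bias_mask = None

name2wrapper = {'conv1': DummyWrapper()}

for layer in masks:
    for mask_type in masks[layer]:
        # mask_type takes the values 'weight_mask' and 'bias_mask',
        # so wrapper.weight_mask is set by this loop as well
        setattr(name2wrapper[layer], mask_type, masks[layer][mask_type])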

@QuanluZhang QuanluZhang requested review from ultmaster and colorjam May 25, 2021 09:26
'sparsity': And(float, lambda n: 0 < n < 1),
Optional('op_types'): [str],
Optional('op_names'): [str],
Optional('masker_name'): str
Contributor:

"masker_name"? in your example "pruning_algo" is used

Contributor Author:

Yes, because we convert config_list: we pop "pruning_algo" and add "masker_name".
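
A hedged sketch of that conversion, consistent with the snippet further below (config['masker_name'] = masker_name; return config_list, maskers_config_dict); the helper name and the generated masker names are assumptions:

def convert_config_list(config_list):
    maskers_config_dict = {}
    for i, config in enumerate(config_list):
        if 'pruning_algo' in config:
            # pop 'pruning_algo' and register it under a generated name
            masker_name = 'masker_%d' % i
            maskers_config_dict[masker_name] = config.pop('pruning_algo')
            config['masker_name'] = masker_name
    return config_list, maskers_config_dict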

MixedMaskerPruner supports configuring different maskers at the operation level.
"""

def __init__(self, model, config_list, optimizer=None, dependency_aware=False, dummy_input=None, maskers_config_dict=None):
Contributor:

What is maskers_config_dict used for?
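
From the conversion code later in this diff, a hedged reading: maskers_config_dict maps each name referenced by a config's 'masker_name' entry to that masker's configuration. An illustrative shape (the value format is an assumption):

maskers_config_dict = {
    'masker_0': ('level', {}),  # hypothetical: masker algorithm + kwargs
    'masker_1': ('l1', {}),
}
config_list = [
    {'sparsity': 0.5, 'op_types': ['Conv2d'], 'masker_name': 'masker_1'},
]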

_logger = logging.getLogger('torch pruner')


class MixedPrunerMasker(WeightMasker):
Contributor:

The class names are odd: MixedPrunerMasker, MixedMaskerPruner...

config['masker_name'] = masker_name
return config_list, maskers_config_dict

def _dependency_calc_mask(self, wrappers, channel_dsets, wrappers_idx=None, origin_wrapper=None):
Contributor:

How does MixedMaskerPruner deal with dependency groups?

@J-shang J-shang marked this pull request as draft May 27, 2021 01:45
@J-shang J-shang closed this Aug 19, 2021
@J-shang J-shang deleted the multi-masker branch September 13, 2021 08:26

Successfully merging this pull request may close these issues.

Multi and Auto Compressor
4 participants