This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Fix pruners for DataParallel support #2003

Merged
chicm-ms merged 3 commits into microsoft:dev-pruner-dataparallel from chec-fix-pruners on Feb 10, 2020

Conversation

@chicm-ms (Contributor) commented Feb 6, 2020:

No description provided.

@chicm-ms chicm-ms changed the title Fix pruners Fix pruners for DataParallel support Feb 6, 2020
Code excerpt under review:

    self.if_init_list[k] = True

    for wrapper in self.get_modules_wrapper():
        wrapper.registered_buffers['if_calculated'].copy_(torch.tensor(0))  # pylint: disable=not-callable
A reviewer (Contributor) commented on this line:
Which one is better: wrapper.registered_buffers['if_calculated'] or wrapper.if_calculated?

@chicm-ms (Contributor, Author) replied Feb 9, 2020:

Just tested: wrapper.if_calculated does not work. No error is reported, but the value remains 1 after calling wrapper.if_calculated.copy_(torch.tensor(0)).
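One plausible explanation for the silent failure, sketched in plain Python rather than NNI's actual code (the `Wrapper` class and the list standing in for a torch tensor are illustrative assumptions): DataParallel replicates the wrapped module, so a replica's `if_calculated` attribute can end up aliasing the replica's own copy of the buffer instead of the original. An in-place write through that attribute then never reaches the buffer the pruner later reads, whereas going through `registered_buffers` on the original wrapper mutates the right object.

```python
# Hypothetical sketch of the aliasing problem; not NNI's implementation.
import copy

class Wrapper:
    def __init__(self):
        # The registry entry and the attribute alias the SAME buffer object
        # (a list stands in for a torch tensor here).
        self.registered_buffers = {'if_calculated': [1]}
        self.if_calculated = self.registered_buffers['if_calculated']

w = Wrapper()
replica = copy.deepcopy(w)        # DataParallel-style replication copies the module

replica.if_calculated[0] = 0      # in-place write through the replica's attribute...
assert w.registered_buffers['if_calculated'][0] == 1   # ...leaves the original at 1

w.registered_buffers['if_calculated'][0] = 0           # reset via the registry instead
assert w.if_calculated[0] == 0    # the original wrapper now sees the reset
```

This matches the observed symptom: no error is raised, because the write does succeed, just on an object nobody reads afterwards.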

@chicm-ms chicm-ms merged commit c7d5803 into microsoft:dev-pruner-dataparallel Feb 10, 2020
@chicm-ms chicm-ms deleted the chec-fix-pruners branch February 12, 2020 09:30