
update level pruner to adapt to pruner dataparallel refactor #1993

Merged

Conversation

@Cjkkkk (Contributor) commented Feb 4, 2020

No description provided.

@Cjkkkk (Contributor, Author) commented Feb 4, 2020

1. It seems that using torch.tensor(False) instead of torch.tensor(0) causes an unconverted NCCL type error; fixed by using torch.tensor(0) (see the sketch after this list).
2. Update the level pruner.
3. Update the examples.
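
For context, a minimal sketch of the bool-tensor issue, assuming the flag is kept as a buffer on a wrapper module so that DataParallel carries it to every GPU replica. The class and buffer names below are illustrative, not NNI's exact API:

import torch
import torch.nn as nn

class WrappedModule(nn.Module):
    # Illustrative stand-in for the pruner's module wrapper.
    def __init__(self, module):
        super().__init__()
        self.module = module
        # Buffers are broadcast to each replica when the model runs under
        # DataParallel. A bool flag, torch.tensor(False), can raise an
        # unconverted-NCCL-type error on the NCCL backend, so an integer
        # 0/1 flag is used instead.
        self.register_buffer('if_calculated', torch.tensor(0))

    def forward(self, x):
        return self.module(x)

Keeping the flag as a registered buffer (rather than a plain Python attribute) is what lets DataParallel replicate it across devices; the integer dtype is the workaround described in point 1.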

@QuanluZhang changed the title from "Dev pruner dataparallel" to "update level pruner to adapt to pruner dataparallel refactor" on Feb 4, 2020
The reviewed hunk (indentation restored; the old _instrument_layer call appears to be replaced by module wrapping):

 modules_to_compress = self.detect_modules_to_compress()
 for layer, config in modules_to_compress:
-    self._instrument_layer(layer, config)
+    wrapper = self._wrap_modules(layer, config)
+    self.modules_wrapper.append(wrapper)
     self.collected_activation[layer.name] = []

     def _hook(module_, input_, output, name=layer.name):
         if len(self.collected_activation[name]) < self.statistics_batch_num:
             self.collected_activation[name].append(self.activation(output.detach().cpu()))
A reviewer (Contributor) commented on the hook line:

Have you checked that dataparallel correctly works for this line?
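
One hedged way to check (the model and hook wiring below are illustrative, not the PR's code): register the same style of forward hook, run a forward pass under nn.DataParallel, and inspect what was collected. DataParallel replicates the module on every forward, so the hook can fire once per replica, each time with that replica's shard of the batch:

import torch
import torch.nn as nn

collected = {'fc': []}        # stands in for self.collected_activation
statistics_batch_num = 4

model = nn.Sequential(nn.Linear(8, 4))

def _hook(module_, input_, output, name='fc'):
    if len(collected[name]) < statistics_batch_num:
        # May run on each replica; output is then that replica's shard.
        collected[name].append(output.detach().cpu())

model[0].register_forward_hook(_hook)

if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model).cuda()
    x = torch.randn(16, 8, device='cuda')
else:
    x = torch.randn(16, 8)

model(x)
# With N GPUs, expect up to N partial-batch entries per forward instead of
# one full-batch entry, appended from multiple worker threads.
print(len(collected['fc']), [tuple(t.shape) for t in collected['fc']])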

@QuanluZhang closed this on Feb 5, 2020
@QuanluZhang reopened this on Feb 5, 2020
@QuanluZhang merged commit 4e21e72 into microsoft:dev-pruner-dataparallel on Feb 10, 2020
Labels: None yet
Projects: None yet
3 participants