
[#3507 follow up] update doc #3688

Merged
merged 8 commits into microsoft:master on May 27, 2021

Conversation

J-shang (Contributor) commented May 27, 2021

Post-fixes for #3507

J-shang changed the title from "[#3507 follow up] update doc & ut" to "[#3507 follow up] update doc" on May 27, 2021
@@ -103,7 +102,8 @@ Users can also remove this collector like this:
Pruner
------

-A pruner receives ``model``\ , ``config_list`` and ``optimizer`` as arguments. It prunes the model per the ``config_list`` during training loop by adding a hook on ``optimizer.step()``.
+A pruner receives ``model``\ , ``config_list`` as arguments.
A reviewer (Contributor) commented:

remove \

J-shang (Author) replied:

fixed
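For context, here is a minimal sketch of the two-argument pruner construction this doc line describes. The import path and config keys follow NNI's v2-era conventions and are assumptions; check the release you use.

```python
# Minimal sketch of the post-#3507 two-argument pruner construction.
# LevelPruner import path and config_list keys are assumed from NNI ~v2.x.
import torch
from nni.algorithms.compression.pytorch.pruning import LevelPruner

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3),
    torch.nn.ReLU(),
)

# config_list declares which ops to prune and the target sparsity.
config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]

# The pruner now takes only model and config_list; it no longer
# hooks optimizer.step().
pruner = LevelPruner(model, config_list)
model = pruner.compress()
```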

@@ -554,6 +559,8 @@ class ActivationMeanRankFilterPruner(IterativePruner):
        The activation type.
    sparsity_training_epochs: int
        The number of batches to statistic the activation.
    statistics_batch_num: int
A reviewer (Contributor) commented on the statistics_batch_num: int line:

Why was there no such parameter before?

J-shang (Author) replied:

worked around by using sparsifying_training_batches
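A rough sketch of how the sparsifying_training_batches workaround mentioned above might be used follows. The constructor signature is pieced together from this diff (trainer/criterion arguments, activation, sparsifying_training_batches) and may not match the released API exactly; get_dataloader is a hypothetical helper.

```python
# Rough sketch of ActivationMeanRankFilterPruner with the
# sparsifying_training_batches workaround. The signature below is an
# assumption based on this PR's diff, not a confirmed API.
import torch
from nni.algorithms.compression.pytorch.pruning import ActivationMeanRankFilterPruner

model = torch.nn.Sequential(torch.nn.Conv2d(3, 16, 3), torch.nn.ReLU())
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = torch.nn.CrossEntropyLoss()

def trainer(model, optimizer, criterion, epoch):
    # Hypothetical one-epoch training loop; the pruner calls this to run
    # forward passes and gather activation statistics.
    for data, target in get_dataloader():  # get_dataloader is hypothetical
        optimizer.zero_grad()
        loss = criterion(model(data), target)
        loss.backward()
        optimizer.step()

config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]

# sparsifying_training_batches bounds how many batches feed the activation
# statistics (the stated workaround for the removed statistics_batch_num).
pruner = ActivationMeanRankFilterPruner(
    model, config_list, optimizer,
    trainer=trainer, criterion=criterion,
    activation='relu',
    sparsifying_training_batches=1,
)
model = pruner.compress()
```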

ultmaster merged commit 9b0bc37 into microsoft:master on May 27, 2021
J-shang deleted the update-doc branch on June 4, 2021 at 07:18