
[Feature] Support for linearly learning rate decay #1627

Merged: 5 commits merged into open-mmlab:master on Mar 31, 2022

Conversation

@Sharpiless (Contributor) commented on Dec 29, 2021

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and more likely to receive prompt feedback. If you do not understand some items, don't worry; just make the pull request and seek help from the maintainers.

Motivation

I tried to re-implement pix2seq but did not find a linearly decaying learning rate schedule in mmdetection.

Modification

add "LinearlyDecayLrUpdaterHook"

test config_file:

lr_config = dict(
    policy='LinearAnnealing',
    min_lr=0.001
)
momentum_config = dict(
    policy='LinearAnnealing',
    min_momentum=0.001,
    by_epoch=False
    # anneal_strategy='linear',
)

lr curve:
[image: plot showing the learning rate decaying linearly over training]
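For reference, below is a minimal sketch of how such a hook can be built on MMCV's LrUpdaterHook interface, where get_lr receives the runner and a base learning rate. The class name (chosen here to match the policy key in the test config above), the annealing_linear helper, and the min_lr/min_lr_ratio handling are illustrative assumptions modeled on MMCV's existing annealing hooks, not necessarily the exact code merged in this PR:

from mmcv.runner import HOOKS
from mmcv.runner.hooks.lr_updater import LrUpdaterHook


def annealing_linear(start, end, factor):
    # Linearly interpolate from `start` to `end`; `factor` runs from 0 to 1.
    return start + (end - start) * factor


@HOOKS.register_module()
class LinearAnnealingLrUpdaterHook(LrUpdaterHook):
    """Anneal the learning rate linearly from the base lr to a target value."""

    def __init__(self, min_lr=None, min_lr_ratio=None, **kwargs):
        # Exactly one of `min_lr` (absolute target) and `min_lr_ratio`
        # (target relative to the base lr) must be specified.
        assert (min_lr is None) ^ (min_lr_ratio is None)
        self.min_lr = min_lr
        self.min_lr_ratio = min_lr_ratio
        super().__init__(**kwargs)

    def get_lr(self, runner, base_lr):
        # Progress is counted in epochs or iterations depending on `by_epoch`.
        if self.by_epoch:
            progress, max_progress = runner.epoch, runner.max_epochs
        else:
            progress, max_progress = runner.iter, runner.max_iters
        if self.min_lr_ratio is not None:
            target_lr = base_lr * self.min_lr_ratio
        else:
            target_lr = self.min_lr
        return annealing_linear(base_lr, target_lr, progress / max_progress)

Since MMCV derives the hook type name from the policy key by appending 'LrUpdaterHook', policy='LinearAnnealing' would resolve to a class registered as above, and the learning rate then falls on a straight line from its initial value to min_lr=0.001 over the run, matching the plotted curve. The momentum_config would be backed by an analogous MomentumUpdaterHook.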

BC-breaking (Optional)

Does the modification introduce changes that break the backward-compatibility of the downstream repositories?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.

Use cases (Optional)

If this PR introduces a new feature, it is better to list some use cases here, and update the documentation.

Checklist

Before PR:

  • I have read and followed the workflow indicated in the CONTRIBUTING.md to create this PR.
  • Pre-commit or other linting tools indicated in CONTRIBUTING.md are used to fix potential lint issues.
  • Bug fixes are covered by unit tests; the case that caused the bug should be added to the unit tests.
  • New functionalities are covered by complete unit tests. If not, please add more unit tests to ensure correctness.
  • The documentation has been modified accordingly, including docstring or example tutorials.

After PR:

  • If the modification has potential influence on downstream or other related projects, this PR should be tested with some of those projects, like MMDet or MMCls.
  • CLA has been signed and all committers have signed the CLA in this PR.

@CLAassistant commented on Dec 29, 2021

CLA assistant check
All committers have signed the CLA.

@teamwong111 (Contributor) left a comment:

Docstring and unit tests are needed.

@zhouzaida (Collaborator) commented:

Hi @Sharpiless, is there any progress?

@zhouzaida changed the title from "update support for linearly learning rate decay" to "[Feature] Support for linearly learning rate decay" on Jan 21, 2022
Sharpiless and others added commits on March 7, 2022:

  • …erHook, add unit test
  • add docstring
  • add docstring
  • update linear lr momentum schedule test
  • fix ci
  • Fix CI
@@ -711,6 +711,83 @@ def test_cosine_runner_hook(multi_optimizers):
hook.writer.add_scalars.assert_has_calls(calls, any_order=True)


@pytest.mark.parametrize('multi_optimziers', (True, False))
Suggested change (from a Collaborator):
@pytest.mark.parametrize('multi_optimziers', (True, False))
@pytest.mark.parametrize('multi_optimizers', (True, False))

@@ -711,6 +711,83 @@ def test_cosine_runner_hook(multi_optimizers):
hook.writer.add_scalars.assert_has_calls(calls, any_order=True)


@pytest.mark.parametrize('multi_optimziers', (True, False))
def test_linear_runner_hook(multi_optimziers):
Suggested change (from a Collaborator):
def test_linear_runner_hook(multi_optimziers):
def test_linear_runner_hook(multi_optimizers):


# TODO: use a more elegant way to check values
assert hasattr(hook, 'writer')
if multi_optimziers:
Suggested change (from a Collaborator):
if multi_optimziers:
if multi_optimizers:
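
All three suggestions fix the same misspelling, multi_optimziers → multi_optimizers. The spelling matters because the name string passed to @pytest.mark.parametrize must exactly match an argument of the test function; pytest injects each parametrized value through that matching argument. The misspelled pair is internally consistent and would still run, but the corrected spelling matches the rest of the test file (e.g. test_cosine_runner_hook above). A minimal, hypothetical illustration of the pattern:

import pytest


@pytest.mark.parametrize('multi_optimizers', (True, False))
def test_example(multi_optimizers):
    # Runs twice: once with True and once with False injected into the
    # argument whose name matches the string given to parametrize.
    assert isinstance(multi_optimizers, bool)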

@HAOCHENYE HAOCHENYE requested a review from zhouzaida March 25, 2022 17:38
@zhouzaida zhouzaida merged commit 969e2af into open-mmlab:master Mar 31, 2022
@OpenMMLab-Assistant003 commented:

Hi @Sharpiless! First of all, we want to express our gratitude for your significant PR in the MMCV project. Your contribution is highly appreciated, and we are grateful for your efforts in helping improve this open-source project in your personal time. We believe that many developers will benefit from your PR.

We would also like to invite you to join our Special Interest Group (SIG) private channel on Discord, where you can share your experiences and ideas and build connections with like-minded peers. To join the SIG channel, simply message the moderator, OpenMMLab, on Discord, or briefly share your open-source contributions in the #introductions channel and we will assist you. We look forward to seeing you there! Join us: https://discord.gg/UjgXkPWNqA

If you have a WeChat account, you are welcome to join our community on WeChat. You can add our assistant: openmmlabwx. Please add "mmsig + GitHub ID" as a remark when adding friends. :)

Thank you again for your contribution ❤ @Sharpiless
