【Hackathon No.13】 #33
Conversation
The PR format check passed. Your PR will be reviewed by Paddle experts and the open-source community; please keep an eye on the PR status.
lgtm
The `momentum`-related parameters present in the PyTorch version have been removed.
Also, to stay consistent with Paddle's other LRScheduler APIs, `base_lr` was renamed to `learning_rate` and `max_lr` to `max_learning_rate`; this change is open for discussion.
Here I think that if the policy only uses one LR, `learning_rate` is enough. If it uses two related LRs, `base_learning_rate` and `max_learning_rate` would probably be clearer.
Thanks for the suggestion.
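
For reference, a rough sketch of how the PyTorch argument names map onto the names proposed in this PR (an assumption for illustration only; the final Paddle-side names are still under discussion, as noted above):

```python
# Rough mapping from torch.optim.lr_scheduler.CyclicLR arguments to the names
# proposed in this PR. Assumption: the Paddle names may still change in review.
PYTORCH_TO_PADDLE_PARAMS = {
    "base_lr": "learning_rate",        # reviewer suggests base_learning_rate
    "max_lr": "max_learning_rate",
    "step_size_up": "step_size_up",
    "step_size_down": "step_size_down",
    "mode": "mode",
    "gamma": "gamma",
    "scale_fn": "scale_fn",
    "scale_mode": "scale_mode",
    # The momentum-related arguments from PyTorch (cycle_momentum,
    # base_momentum, max_momentum) are dropped in this design, as noted above.
}
```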
The test cases under consideration are as follows:
- Dynamic-graph and static-graph results stay consistent with the NumPy reference.
Besides verifying against a NumPy implementation in the API test, you can also verify locally against the PyTorch implementation. Also take a look at how the verification code for other optimizers is written.
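
For the NumPy-based check mentioned above, a minimal reference sketch of the cyclic LR value at a given step might look like the following. Parameter names follow this PR's proposal, and the formula mirrors the triangular policy from the CLR paper and PyTorch's implementation, so it is an illustrative baseline rather than the final test code:

```python
import numpy as np

def cyclic_lr_reference(step, learning_rate, max_learning_rate,
                        step_size_up, step_size_down,
                        mode="triangular", gamma=1.0):
    """NumPy reference for the LR at `step`, usable as an API-test baseline."""
    total_size = step_size_up + step_size_down
    step_ratio = step_size_up / total_size

    cycle = np.floor(1 + step / total_size)      # 1-based cycle index
    x = 1.0 + step / total_size - cycle          # position within the cycle
    if x <= step_ratio:                          # increasing half of the cycle
        scale_factor = x / step_ratio
    else:                                        # decreasing half of the cycle
        scale_factor = (x - 1) / (step_ratio - 1)

    amplitude = (max_learning_rate - learning_rate) * scale_factor
    if mode == "triangular":
        scale = 1.0
    elif mode == "triangular2":                  # halve the amplitude each cycle
        scale = 1.0 / (2.0 ** (cycle - 1))
    else:                                        # "exp_range": decay per step
        scale = gamma ** step
    return learning_rate + amplitude * scale
```

The dynamic-graph and static-graph schedules can then be compared element-wise against this reference.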
## Naming and Parameter Design
The API is designed as `paddle.optim.lr.CyclicLR(learning_rate, max_learning_rate, step_size_up, step_size_down, mode='triangular', gamma=1., scale_fn=None, scale_mode='cycle', last_epoch=-1, verbose=False)`.
Paddle's optimizer-related APIs all live under paddle.optimizer. Is there a particular reason this is named optim, or is it a typo?
Sorry, I'm used to writing the abbreviation; it will be updated shortly.
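
Assuming the module path is corrected to `paddle.optimizer.lr`, a hypothetical usage sketch of the proposed scheduler (the exact signature may still change during review) could look like:

```python
import paddle

linear = paddle.nn.Linear(10, 10)

# Proposed scheduler; parameter names follow the design above (still under review).
scheduler = paddle.optimizer.lr.CyclicLR(
    learning_rate=0.001,        # lower bound of the LR cycle
    max_learning_rate=0.1,      # upper bound of the LR cycle
    step_size_up=200,           # steps in the increasing half of a cycle
    step_size_down=200,         # steps in the decreasing half of a cycle
    mode='triangular')

# Standard Paddle pattern: pass the scheduler as the optimizer's learning rate.
sgd = paddle.optimizer.SGD(learning_rate=scheduler,
                           parameters=linear.parameters())

for step in range(1000):
    # ... forward pass, loss.backward(), sgd.step(), sgd.clear_grad() ...
    scheduler.step()            # advance the cyclic schedule by one step
```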
LGTM
Your PR has been merged into the community repo. Please proceed with the follow-up code development and submit the code to the Paddle repository.
Add a new CyclicLR optimization scheduler to Paddle.