This repository has been archived by the owner on Dec 1, 2021. It is now read-only.
customization of learning rate #3
Labels: enhancement (New feature or request)
Comments

Currently, Blueoil doesn't support customization of the learning rate. It would be very nice if we could customize the learning rate, since it is a very important hyperparameter. For example, we have internally used learning-rate schedules such as tf.train.piecewise_constant and tf.train.polynomial_decay.
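For reference, a minimal sketch of how those two TF 1.x schedules are invoked; the step counts and rates below are illustrative values, not taken from this issue:

```python
import tensorflow as tf  # TF 1.x API, current when this issue was filed

global_step = tf.train.get_or_create_global_step()

# Step-wise schedule: LR is 0.1 until step 10000, 0.01 until 20000, then 0.001.
piecewise_lr = tf.train.piecewise_constant(
    global_step,
    boundaries=[10000, 20000],
    values=[0.1, 0.01, 0.001],
)

# Smooth schedule: LR decays polynomially from 0.1 to 1e-4 over 30000 steps.
polynomial_lr = tf.train.polynomial_decay(
    learning_rate=0.1,
    global_step=global_step,
    decay_steps=30000,
    end_learning_rate=0.0001,
    power=1.0,
)
```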
We discussed the concrete specification.
If a user chooses option 1 and the total number of epochs is 100, the learning rate should be scaled by a factor of 1/10 at epochs 50 and 99. If a user chooses option 2 and the total number of epochs is 100, the learning rate should be scaled by a factor of 1/10 at epochs 33, 66, and 99. Since accuracy improves immediately after the learning rate is decreased, I'd like to keep one decay at epoch (num_epochs - 1); the remaining decay points simply divide the run equally. The estimated effort for this issue is 1 or 2 FTE days. Where should this be modified?
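A minimal sketch of the decay-point arithmetic described above (the option numbering follows the comment; the helper name decay_epochs is mine, not from the codebase): with n decays over num_epochs epochs, the last decay is pinned at num_epochs - 1 and the earlier ones divide the run into equal parts.

```python
def decay_epochs(num_epochs, num_decays):
    """Epochs at which the learning rate is scaled by a factor of 1/10.

    The last decay is pinned at (num_epochs - 1); the earlier ones
    divide the training run into equal parts.
    """
    boundaries = [num_epochs * i // num_decays for i in range(1, num_decays)]
    boundaries.append(num_epochs - 1)
    return boundaries

# Option 1 (two decays) and option 2 (three decays) from the comment above:
assert decay_epochs(100, 2) == [50, 99]
assert decay_epochs(100, 3) == [33, 66, 99]
```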
Sample of generated config lines for lmnet:
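The sample itself did not survive here; below is a hypothetical sketch of what such generated lines might look like, assuming the EasyDict-style Python configs used by lmnet and step-based boundaries derived from the epoch points above. The names NETWORK, LEARNING_RATE_FUNC, LEARNING_RATE_KWARGS, and step_per_epoch are assumptions, not confirmed by this issue:

```python
# Hypothetical generated fragment for option 2, 100 epochs, base LR 0.1.
# step_per_epoch would be filled in by the config generator.
NETWORK.LEARNING_RATE_FUNC = tf.train.piecewise_constant
NETWORK.LEARNING_RATE_KWARGS = EasyDict({
    "values": [0.1, 0.01, 0.001, 0.0001],
    "boundaries": [step_per_epoch * 33, step_per_epoch * 66, step_per_epoch * 99],
})
```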
I wonder whether this issue has been assigned to anyone.
I want to close this issue.
iizukak pushed a commit to iizukak/blueoil that referenced this issue on May 30, 2019: "Add little circles on pipeline diagrams"
ruimashita pushed a commit that referenced this issue on Jun 26, 2019: "Add little circles on pipeline diagrams"