[Feature] Learning Rate Modified by Steps #2
I've experimented with a learning rate that increases as the steps progress, after seeing Aphantasia develop the general image very quickly but then slow down while working out small details. I believe my proposed alternative puts more focus on larger shapes, and less on details.
In the Generate cell, I expose the learning_rate variable and add a learning_rate_max variable, remove the

optimizer = torch.optim.Adam(params, learning_rate)

line, and instead add the per-step learning rate code to def train(i):
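(The code block from the original comment did not survive here; the following is a minimal sketch of such a ramp, not the exact snippet. The linear schedule is an assumption, and the params and steps stand-ins replace the notebook's own Generate-cell globals.)

import torch

params = [torch.randn(8, requires_grad=True)]  # stand-in for the image parameters
steps = 300                                    # total training steps

learning_rate = 0.0001      # starting (minimum) learning rate
learning_rate_max = 0.008   # learning rate reached at the last step

def train(i):
    # ramp the learning rate linearly from learning_rate to learning_rate_max
    lr = learning_rate + (learning_rate_max - learning_rate) * i / max(steps - 1, 1)
    # per the description above, the optimizer is built here instead of globally
    optimizer = torch.optim.Adam(params, lr)
    # ... the rest of the original training step (loss computation,
    # optimizer.zero_grad(), loss.backward(), optimizer.step()) is unchanged

One side effect of building the optimizer inside train(i) is that Adam's running moment estimates reset on every step, so each update behaves roughly like a sign-of-gradient step scaled by lr.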
With this, I find that a learning_rate of 0.0001 and a learning_rate_max of 0.008 work well, at least for 300-400 steps and about 50 samples.
Comments

thanks for proposal, will check that! i did notice fancy behaviour of training details, but haven't performed such thorough tests. i'll keep this open till further exploration.

@torridgristle i gave it a try, here are my findings:

i will add and expose progressive mode as an option (and change the default lrate) anyway, to encourage further experiments. please note also that:

further tests have shown that in most cases progressive lrate does have some impact on the composition. i would not call it "larger shapes enhancement" (sometimes it just drew significantly fewer elements of all kinds), but it's worth having in the options.

@torridgristle implemented and mentioned in the readme. thanks for discovering; closing now.