Asking for some clarification regarding the iterations #73
Unanswered · FirestName asked this question in Q&A
Replies: 1 comment, 1 reply
-
Tested this. It does indeed seem that a batch size of 8 works better than 32. Best regards
-
Hello,
As I understand it, one iteration equals the processing of one batch?
If so, why is the training progress counted in iterations?
In my naivety, just looking at the number of samples processed would seem more intuitive.
I mean, for StyleGAN training the batch size is variable, but training progress is still counted in (kilo)images presented to the discriminator. Thus the progress per iteration is substantially impacted by the available VRAM, since more images can be processed per batch. Obviously the batch size used also affects the end result, as it controls the gradient accumulated before backpropagation.
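To make the distinction concrete, here is a minimal sketch (hypothetical helper names, not from this repo) of how the same iteration count maps to very different amounts of data seen, depending on batch size — which is exactly why StyleGAN reports progress in kimg rather than iterations:

```python
def samples_seen(iterations: int, batch_size: int) -> int:
    """Total images presented to the model after `iterations` steps."""
    return iterations * batch_size

def kimg(iterations: int, batch_size: int) -> float:
    """Progress in kilo-images, the unit StyleGAN-style training logs use."""
    return samples_seen(iterations, batch_size) / 1000

# The same 10,000 iterations at batch sizes 8 vs. 32:
print(kimg(10_000, 8))   # 80.0 kimg
print(kimg(10_000, 32))  # 320.0 kimg
```

So tracking by iterations treats these two runs as equally far along, while tracking by samples (kimg) says the batch-32 run has seen 4x the data.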
For me this begs the question: is there something intrinsically different going on here, such that it's indeed more appropriate to track training by iterations rather than by samples processed?
Yeah, obviously the best approach would be to just monitor the TensorBoards and assess from there, but this piqued my curiosity.
Best regards
Firest