Auto adapting batch size #2119
Conversation
I left one comment.
Why do we need to separate auto_decrease_batch_size and auto_adapt_batch_size? How about unifying the parameters? auto_adapt_batch_size could include the concept of auto_decrease_batch_size.
IMHO, because they have different purposes. Purpose of …
How about getting a key named …?
I think it's a good idea. I'll apply it after asking others' opinions.
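The commit list in this PR mentions a BatchSizeAdaptType enum, so the unified parameter presumably became a single enum-valued option. Here is a minimal sketch of that idea; only the "Full" value appears in this thread, so the other value names are illustrative assumptions, not the PR's actual definitions:

```python
from enum import Enum


class BatchSizeAdaptType(Enum):
    """One parameter covering both behaviors discussed above.

    Value names other than FULL are assumptions for illustration.
    """

    NONE = "None"  # keep the user-configured batch size as-is
    SAFE = "Safe"  # only decrease the batch size if it doesn't fit in memory
    FULL = "Full"  # additionally search for a bigger batch size that still fits


def should_search_bigger(adapt: BatchSizeAdaptType) -> bool:
    # Only FULL triggers the "find a big enough batch size" search.
    return adapt is BatchSizeAdaptType.FULL
```

With a single enum, auto_decrease_batch_size simply becomes the SAFE level of auto_adapt_batch_size instead of a separate boolean flag.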
Force-pushed from 9bbb098 to 1dea19b
I combined the separated arguments into a single one and applied the other comments. Could you review my PR? @harimkang @supersoob @sungmanc
LGTM, with minor comments
Resolved review comments on otx/algorithms/action/configs/classification/configuration.yaml (outdated)
Thanks very much for applying my minor suggestion :), LGTM.
* refactor bs_search_algo
* implement draft
* refine big bs search algorithm
* refine algorithm
* enable auto adapt bs to detection task
* enable auto adapt bs to other tasks
* update test_automatic_bs.py
* refine bs estimation algo & implement unit test
* align with pre-commit
* implement intg test
* update changelog
* fix typo
* fix typo
* disable auto_adapt_batch_size while HPO
* update algorithm
* change interface
* fix typo
* exclude validation in cls task
* use root scale to both cases
* update test code
* add comment to BatchSizeAdaptType enum
* refine sentence
* update hpo.py
* update unused test case
* fix typo
Summary
Detail
This feature finds a batch size that uses almost all of the GPU memory. It generally increases the batch size, so it can reduce training time compared to using a small batch size for the same number of epochs. To make this work well, it uses a few techniques. First, it tries to find a just-big-enough batch size rather than the maximum runnable one: finding the exact maximum takes considerably more time, while the advantage over a just-big-enough batch size isn't large. Second, it estimates an optimal batch size using an estimated equation, which saves much time compared to other methods. Finally, when the batch size is increased k times, the learning rate is increased sqrt(k) times; this is based on experiments and is also supported theoretically.
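As a rough illustration of the two ideas above (settling for a just-big-enough batch size instead of the exact maximum, and sqrt(k) learning-rate scaling), here is a hedged sketch. The find_big_enough_batch_size helper and the fits() memory check are hypothetical stand-ins, not the PR's actual bs_search_algo implementation, which estimates the batch size from measured memory usage rather than pure trial and error:

```python
import math


def find_big_enough_batch_size(fits, start_bs=2, max_bs=512):
    """Double the batch size until the next step no longer fits in
    GPU memory (or exceeds max_bs), then return the last size that
    worked. `fits(bs)` is a hypothetical check that a training step
    with batch size `bs` runs without an out-of-memory error."""
    bs = start_bs
    while bs * 2 <= max_bs and fits(bs * 2):
        bs *= 2
    return bs


def scale_learning_rate(base_lr, base_bs, new_bs):
    """Increase the learning rate sqrt(k) times when the batch
    size grows k times, as described above."""
    k = new_bs / base_bs
    return base_lr * math.sqrt(k)


# Example with a fake memory model: batches up to 96 samples "fit".
fits = lambda bs: bs <= 96
bs = find_big_enough_batch_size(fits)   # -> 64
lr = scale_learning_rate(0.01, 8, bs)   # -> 0.01 * sqrt(8)
```

The doubling search deliberately stops at the first power of two that fits (64 here) instead of probing up to the true limit (96), matching the "just big enough" trade-off the summary describes.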
How to test
You can use the feature by adding --learning_parameters.auto_adapt_batch_size Full. You can also test it by running pytest -k auto_adapt_batch_size tests/integration/cli/.
Checklist
License
Feel free to contact the maintainers if that's a concern.