One-line summary
Proposes Layer-wise Adaptive Rate Scaling (LARS) for large-batch training.
Paper link
https://digitalassets.lib.berkeley.edu/techreports/ucb/text/EECS-2017-156.pdf
Authors / Affiliations
Yang You, Igor Gitman, Boris Ginsburg (UC Berkeley)
Submission date (yyyy/MM/dd)
2017/09/16
Overview
Novelty / Differences
Claimed to be the first method to apply a different learning rate to each layer.
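The per-layer adaptation above can be sketched as follows. This is a minimal illustration based on the commonly cited LARS update rule (local learning rate proportional to ||w|| / (||∇w|| + β||w||)), not the authors' code; the function name, hyperparameter values, and the omission of momentum are my own simplifications.

```python
import numpy as np

def lars_update(weights, grads, lr=0.1, trust_coef=0.001, weight_decay=0.0005):
    """One simplified LARS step (hypothetical sketch, momentum omitted).

    Each layer gets its own local learning rate scaled by the ratio of
    the weight norm to the gradient norm, so layers with small gradients
    relative to their weights are not under-updated at large batch sizes.
    """
    new_weights = []
    for w, g in zip(weights, grads):
        w_norm = np.linalg.norm(w)
        g_norm = np.linalg.norm(g)
        # Layer-wise "trust ratio"; the small epsilon guards against
        # division by zero for freshly zero-initialized layers.
        local_lr = trust_coef * w_norm / (g_norm + weight_decay * w_norm + 1e-9)
        # Apply the global LR scaled by the per-layer local LR.
        new_weights.append(w - lr * local_lr * (g + weight_decay * w))
    return new_weights
```

In this sketch, two layers with the same gradient direction but different weight/gradient norm ratios receive different effective step sizes, which is the core idea the paper's title describes.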
Method
Results
Comments