Hi, while working with the EAST model, I ran into the following issue with adamw.py.

The error I am getting is:

    super(AdamW, self).__init__(**kwargs)
    TypeError: __init__() missing 1 required positional argument: 'name'

The code is as below:
class AdamW(Optimizer):
    """Adam optimizer.

    Default parameters follow those provided in the original paper.

    # Arguments
        lr: float >= 0. Learning rate.
        beta_1: float, 0 < beta < 1. Generally close to 1.
        beta_2: float, 0 < beta < 1. Generally close to 1.
        epsilon: float >= 0. Fuzz factor.
        decay: float >= 0. Learning rate decay over each update.
        weight_decay: float >= 0. Decoupled weight decay over each update.

    # References
        - Adam - A Method for Stochastic Optimization
        - Optimization for Deep Learning Highlights in 2017
        - Fixing Weight Decay Regularization in Adam
    """