In `pebg_dkt.py`, at line 164, `train_loss /= batch_loss` — why is each epoch's `train_loss` divided by `batch_loss`? Shouldn't it be divided by `batch_size`?

Also, when I ran your code, I found that `train_loss` stays essentially unchanged, and the AUC in the first few epochs is already close to `best_auc`. I don't see the point of continuing to train; I'd appreciate it if you could find time to explain.
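For context, here is a hypothetical sketch of the epoch loop being asked about (the variable names besides `train_loss` and `batch_loss` are assumptions, not taken from `pebg_dkt.py`). A common pattern is that each batch's loss is already a *mean* over the examples in that batch, so dividing the accumulated `train_loss` by the *number of batches* yields the epoch-average loss; dividing by `batch_size` would be incorrect in that case:

```python
# Per-batch mean losses for one epoch (illustrative values only).
per_batch_losses = [0.9, 0.7, 0.5, 0.4]

train_loss = 0.0
num_batches = 0
for batch_loss in per_batch_losses:
    # batch_loss is assumed to already be averaged over the batch's examples
    train_loss += batch_loss
    num_batches += 1

# Average over the number of batches, not batch_size,
# to get the mean loss per batch for this epoch.
train_loss /= num_batches
print(train_loss)  # 0.625
```

If the repository's `batch_loss` variable at line 164 actually holds a batch *count* (despite its name), the division would follow this same pattern; that is a guess about the code's intent, not something confirmed by the source.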
I'd guess the author graduated several years ago.