Replies: 1 comment
First, make sure you understand the [link missing]. Once that makes sense, check out the docs on [link missing]. It's equivalent to: [code missing]
I was playing with a simple MLP example. In the example below, I compute the logits inside the train loop, and the loss changes but fluctuates around 2.3. However, passing the model and computing the logits inside `evaluate_loss` fixes it. Could somebody help me understand this behavior?