@MarcoMorik
https://pytorch.org/docs/stable/generated/torch.nn.NLLLoss.html states that:

> The input given through a forward call is expected to contain log-probabilities of each class. [...] Obtaining log-probabilities in a neural network is easily achieved by adding a LogSoftmax layer in the last layer of your network. You may use CrossEntropyLoss instead, if you prefer not to add an extra layer.

The way we use these losses does not seem to guarantee that our inputs are log-probabilities. Let me know if you have thoughts on this; if not, let's normalize the outputs of the comparison so that they meet this requirement.
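A minimal sketch of the fix being proposed, assuming the comparison produces raw, unnormalized scores (the `logits` tensor and its shape here are hypothetical, not taken from the project's code): applying `log_softmax` before `NLLLoss` makes the inputs valid log-probabilities, and is equivalent to passing the raw scores to `CrossEntropyLoss`, which fuses both steps.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical stand-in for the comparison's raw scores: 4 samples, 3 classes.
logits = torch.randn(4, 3)
targets = torch.tensor([0, 2, 1, 0])

# NLLLoss expects log-probabilities, so normalize the scores first.
log_probs = F.log_softmax(logits, dim=1)
nll = F.nll_loss(log_probs, targets)

# CrossEntropyLoss applies log_softmax internally and takes raw scores directly,
# so either route satisfies the requirement from the docs.
ce = F.cross_entropy(logits, targets)

assert torch.allclose(nll, ce)
```

Feeding unnormalized scores straight into `NLLLoss` would still run without error, but the resulting values are not valid negative log-likelihoods, which is why normalizing (or switching to `CrossEntropyLoss`) matters here.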