Questions about the loss function #22
Comments
You can refer to the answer here: #10 (comment). The InfoNCE loss is basically a cross-entropy loss, but the labels are not pre-defined as they are in text classification.
Thanks for your reply!
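A minimal sketch of that idea (hypothetical tensor names, not the repository's actual code): for a batch of B positive pairs, row i's positive tail sits on the diagonal of the in-batch similarity matrix, so the cross-entropy "labels" are simply `arange(B)`:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(hr_emb: torch.Tensor, tail_emb: torch.Tensor,
                  temperature: float = 0.05) -> torch.Tensor:
    # hr_emb: (B, d) embeddings of (head, relation); tail_emb: (B, d) tail embeddings.
    hr_emb = F.normalize(hr_emb, dim=1)
    tail_emb = F.normalize(tail_emb, dim=1)
    # (B, B) similarity matrix: entry (i, j) scores head+relation i against tail j.
    logits = hr_emb @ tail_emb.t() / temperature
    # The positive for row i is column i, so labels are just 0..B-1.
    labels = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, labels)

loss = info_nce_loss(torch.randn(8, 16), torch.randn(8, 16))
```

The other B-1 tails in the batch act as in-batch negatives; no explicit negative sampling is needed.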
Hello, is it possible to use only head + relation -> tail?
Of course. Just take the second line
Thank you very much!
During training I commented out the second line `Loss += ...`; do I also need to comment out `backward_metrics = ...` during testing?
Because every time I look at the final results, the forward metrics seem to be better.
No need. The forward metrics correspond to predicting the tail entity given the head entity and relation; the backward metrics correspond to predicting the head entity given the tail entity and relation. For most datasets, the first prediction task is easier, so its metrics are better.
Got it, thanks!
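The exchange above can be sketched as follows (a minimal illustration with hypothetical names and an in-batch similarity matrix, not the repository's actual code):

```python
import torch
import torch.nn.functional as F

B, d = 8, 16
hr_emb = F.normalize(torch.randn(B, d), dim=1)    # (head, relation) embeddings
tail_emb = F.normalize(torch.randn(B, d), dim=1)  # tail embeddings
labels = torch.arange(B)

logits = hr_emb @ tail_emb.t()                    # (B, B) similarity matrix

# First line: forward direction, head + relation -> tail.
loss = F.cross_entropy(logits, labels)
# "Second line": backward direction, tail -> head + relation, realized here by
# transposing the similarity matrix. Comment this out to train forward-only;
# the forward metrics (predict tail given head and relation) still apply at test time.
loss += F.cross_entropy(logits.t(), labels)
```

Transposing `logits` swaps the roles of queries and candidates, which is one common way to score the backward task without recomputing embeddings.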
I'm interested in how InfoNCE is implemented in your code using the torch.nn.CrossEntropyLoss() function, but I haven't found good learning material. Can you explain why InfoNCE can be implemented this way in the provided code?
In addition, how should this part of the code be understood?
How should "tail -> head + relation" be understood?
I would appreciate your help! Looking forward to your reply!