
Questions about the loss function #22

Open
handsomelys opened this issue Mar 14, 2023 · 9 comments

Comments

@handsomelys

handsomelys commented Mar 14, 2023

I'm interested in how InfoNCE is implemented in your code using the torch.nn.CrossEntropyLoss() function, but I haven't been able to find good learning material. Could you explain why InfoNCE can be implemented this way in the provided code?

In addition, how should this part of the code be understood?

# head + relation -> tail
loss = self.criterion(logits, labels)
# tail -> head + relation
loss += self.criterion(logits[:, :batch_size].t(), labels)

How should "tail -> head + relation" be understood?
I would appreciate your help! Looking forward to your reply!

@intfloat
Owner

You can refer to the answer here: #10 (comment)

The InfoNCE loss is basically a cross entropy loss but the labels are not pre-defined like in text classification.
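To make that concrete, here is a minimal, self-contained sketch of InfoNCE written as a cross-entropy loss over in-batch negatives. The variable names, temperature value, and random embeddings are illustrative stand-ins, not the repository's exact code:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
batch_size, dim = 4, 8

# Stand-ins for the encoder outputs: one (head, relation) embedding and
# one tail embedding per triple, L2-normalized so dot product = cosine.
hr = nn.functional.normalize(torch.randn(batch_size, dim), dim=1)
tail = nn.functional.normalize(torch.randn(batch_size, dim), dim=1)

temperature = 0.05  # illustrative value
logits = hr @ tail.t() / temperature  # [batch, batch] similarity matrix

# The "label" of row i is column i: the i-th tail is the positive for the
# i-th (head, relation) pair, and every other tail in the batch is a negative.
labels = torch.arange(batch_size)

criterion = nn.CrossEntropyLoss()
# head + relation -> tail: each row scores one hr query against all tails
loss = criterion(logits, labels)
# tail -> head + relation: transpose, so each row scores one tail
# against all hr queries; the positives still sit on the diagonal
loss += criterion(logits.t(), labels)
```

In this sketch logits is square, so logits.t() works directly; in the actual code only the first batch_size columns are transposed (logits[:, :batch_size].t()), presumably because the logits matrix can contain extra negative columns beyond the in-batch ones.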

@handsomelys
Author

Thanks for your reply!

@cyjie429

> You can refer to the answer here: #10 (comment)
>
> The InfoNCE loss is basically a cross entropy loss but the labels are not pre-defined like in text classification.

Hello, is it possible to use only head + relation -> tail?

@intfloat
Owner

Of course. Just comment out the second line, loss += ..., but performance will drop slightly.

@cyjie429

Thank you very much!

@cyjie429

> Of course. Just comment out the second line, loss += ..., but performance will drop slightly.

After I comment out the second line loss += ... during training, do I also need to comment out backward_metrics = ..... during testing?

@cyjie429

> Of course. Just comment out the second line, loss += ..., but performance will drop slightly.

I ask because, looking at the final results each time, the forward metrics seem to be better.

@intfloat
Owner

> After I comment out the second line loss += ... during training, do I also need to comment out backward_metrics = ..... during testing?

No. Forward metrics correspond to predicting the tail entity given the head entity and relation; backward metrics correspond to predicting the head entity given the tail entity and relation. For most datasets the first prediction task is easier, so its metrics are better.
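As a hypothetical illustration of the two evaluation directions (all names, sizes, and noise levels here are made up for the example, not the repository's API):

```python
import torch

torch.manual_seed(0)
num_entities, dim = 10, 8

# Stand-in entity embedding table, L2-normalized.
entity_emb = torch.nn.functional.normalize(torch.randn(num_entities, dim), dim=1)

def rank_of_target(query_emb, target_idx):
    # Score the query against every candidate entity and return the
    # 1-based rank of the correct target (higher score = better rank).
    scores = entity_emb @ query_emb
    order = scores.argsort(descending=True)
    return (order == target_idx).nonzero().item() + 1

# Stand-ins for encoder outputs: encode(head, relation) should land near
# the correct tail, encode(tail, inverse relation) near the correct head.
hr_query = entity_emb[3] + 0.1 * torch.randn(dim)
tr_query = entity_emb[7] + 0.1 * torch.randn(dim)

forward_rank = rank_of_target(hr_query, target_idx=3)   # forward: predict tail
backward_rank = rank_of_target(tr_query, target_idx=7)  # backward: predict head
mrr = 0.5 * (1.0 / forward_rank + 1.0 / backward_rank)  # averaged MRR
```

Forward and backward metrics are computed the same way; only the query direction changes, which is why the backward ones can be kept during testing even when the backward loss term is disabled in training.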

@cyjie429

> No. Forward metrics correspond to predicting the tail entity given the head entity and relation; backward metrics correspond to predicting the head entity given the tail entity and relation. For most datasets the first prediction task is easier, so its metrics are better.

Got it, thanks!
