Add sigmoid to softmax loss #7616
Conversation
Commit: fafa17e, The full report is available as an artifact.
Commit: fafa17e, The full report is available as an artifact.
Commit: 0255ad6, The full report is available as an artifact.
Commit: 1a5e454, The full report is available as an artifact.
Hey @dakshvar22! 👋 To run model regression tests, comment with the `/modeltest` command.
Tips 💡: Every time you want to change a configuration, you should edit the comment with the previous configuration. You can copy this into your comment and customize it:

```
/modeltest
dataset_branch: "hint3"
include:
  - dataset: ["all"]
    config: ["all"]
```

The model regression tests have started. It might take a while, please be patient. The configuration used can be found in the comment.
Commit: 6d4a27a, The full report is available as an artifact.
Co-authored-by: Melinda Loubser <32034278+melindaloubser1@users.noreply.github.com>
@wochinge I've added
Sure, let me just give it a final glance 👍🏻 Sorry, had a busy day and only had time for the review now.
Please check the 2 todos. I'm good otherwise 👍🏻
Proposed changes:
- Constrain similarities in `DotProductLoss` by applying a sigmoid over them during training.
- Add a `model_confidence` option to each ML component. It affects how the model's confidence for each label is computed during inference. It can take three values:
  - `softmax` - Similarities between input and label embeddings are post-processed with a softmax function, as a result of which the confidences for all labels sum up to 1.
  - `cosine` - Cosine similarity between input and label embeddings. Confidence for each label is in the range `[-1, 1]`.
  - `inner` - Dot product similarity between input and label embeddings. Confidence for each label is in an unbounded range.
- Change autoconfig to use `constrain_similarities=True` and `model_confidence=cosine`.

Status (please check what you already did):
- reformat files using `black` (please check Readme for instructions)
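To make the two proposed mechanisms concrete, here is a rough NumPy sketch of what they compute. The actual Rasa implementation lives in TensorFlow inside `DotProductLoss`; the helper names below (`constrained_loss`, `label_confidences`) are hypothetical and only illustrate the idea of sigmoid-constrained similarities at training time and the three `model_confidence` modes at inference time.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def constrained_loss(sim_pos, sim_neg):
    # Sketch of constrain_similarities=True: squash the raw dot-product
    # similarities into (0, 1) with a sigmoid and apply a cross-entropy
    # loss, instead of a softmax over unbounded similarities.
    pos_term = -np.log(sigmoid(sim_pos))
    neg_term = -np.log(1.0 - sigmoid(np.asarray(sim_neg))).sum()
    return pos_term + neg_term

def label_confidences(input_emb, label_embs, model_confidence="softmax"):
    # Dot-product similarity between the input and every label embedding.
    inner = label_embs @ input_emb
    if model_confidence == "inner":
        return inner  # unbounded range
    if model_confidence == "cosine":
        norms = np.linalg.norm(label_embs, axis=1) * np.linalg.norm(input_emb)
        return inner / norms  # each value in [-1, 1]
    return softmax(inner)  # confidences sum up to 1
```

For example, with `model_confidence="softmax"` the returned confidences always sum to 1, while `"cosine"` keeps each confidence within `[-1, 1]` regardless of embedding magnitudes.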