Which performance is better between confidence only and learning loss #12

Open
williamwfhuang opened this issue Jan 6, 2021 · 4 comments

Comments

@williamwfhuang

Hi Mephisto,
I have a concern I'd like to discuss with you.

As the title says, I don't know which approach performs better under different settings.

If we simply add the weak images, selected by the softmax confidence of a plain network, to the training set in every new cycle, we might match or even exceed the experimental results of this paper.

Have you tried this experimental setting?

Best regards,
William

@Mephisto405
Owner

Can you clarify what you mean by 'weak img by confidence'? I couldn't follow it.

@williamwfhuang
Author

Sorry for the confusion. The aim of "Learning loss" is to use the output of the loss-prediction module to find the images whose features are weak for the current model, and then to add those hard images to the labeled set in each active-learning cycle. In principle, that may be the same as directly using the confidence behind the softmax. I don't know which is better, or how it would differ from the experiments in this paper. Have you ever run the experiment using the "confidence behind softmax" instead?

@williamwfhuang
Author

By "weak img by confidence" I mean images that have a high loss under the current model. ~ Buddy

@Mephisto405
Owner

Aha, I understand now. Unfortunately, I have not tested that kind of approach. I'm not sure, but I think you can find papers that use it, since the approach is straightforward.
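For reference, the softmax-confidence baseline William describes can be sketched as below. This is a minimal sketch and not code from this repository; the function names and the NumPy interface are my own choices, and in practice the probabilities would come from the current model's softmax over the unlabeled pool, while the predicted losses would come from a loss-prediction module as in the paper.

```python
import numpy as np

def least_confidence_query(probs, k):
    """Softmax-confidence baseline: pick the k samples whose top
    softmax probability is lowest (the "weak images" for the model).

    probs: (n_samples, n_classes) array of softmax outputs on the
    unlabeled pool.
    """
    confidence = probs.max(axis=1)      # confidence of the predicted class
    return np.argsort(confidence)[:k]   # indices, lowest confidence first

def learning_loss_query(predicted_losses, k):
    """Learning-loss counterpart: rank by the loss-prediction module's
    output and pick the k samples with the highest predicted loss."""
    return np.argsort(predicted_losses)[-k:][::-1]

probs = np.array([[0.90, 0.10],
                  [0.55, 0.45],   # least confident sample
                  [0.70, 0.30]])
print(least_confidence_query(probs, 1))            # -> [1]
print(learning_loss_query(np.array([0.2, 1.5, 0.7]), 2))  # -> [1 2]
```

Both functions rank the same pool, so the comparison William asks about reduces to whether `confidence` and `predicted_losses` induce different orderings in practice.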
