
High validation loss for depth prediction #3

Open
namsan96 opened this issue Aug 7, 2023 · 1 comment

namsan96 commented Aug 7, 2023

Hi, thanks for sharing the codebase for this great work!

I'm trying to reproduce HULC++'s evaluation results on CALVIN.
During affordance model training, I found that the validation loss for depth prediction grows much higher than the training loss, as shown in the figure below.
It would be very helpful if you could confirm whether this is expected behavior, or whether I'm missing something while following your instructions.

I also encountered a suspicious error during the final stage of processing the validation data.
The error occurred in this line: add_norm_values tries to index data['training'], which is understandably missing while the validation data is being processed.
Could this error have affected the integrity of the processed data?
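
To make the failure mode concrete, here is a minimal sketch of what I believe is happening (only the name `add_norm_values` and the `data['training']` lookup come from the code; the dictionary layout and key names are placeholders):

```python
# Minimal sketch of the suspected failure; only add_norm_values and the
# data["training"] lookup are from the code, the dict layout is a placeholder.
def add_norm_values(data):
    # Assumes the training split is always present in `data` ...
    return data["training"]["norm_values"]

# ... but when only the validation split is being processed, the key is absent:
validation_only = {"validation": {"norm_values": {"depth_min": 0.0, "depth_max": 1.0}}}
try:
    add_norm_values(validation_only)
except KeyError as err:
    print("KeyError:", err)  # KeyError: 'training'
```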

Thanks!

[Screenshot from 2023-08-07 13-11-21: training vs. validation depth loss curves]

@HuiHuiSun

Hi, have you reproduced the results in the paper?
