
Tips on how to tune "bound" & "scale" for a new scene? #59

Closed
JasonLSC opened this issue Apr 27, 2022 · 7 comments

Comments

@JasonLSC

I have a CG-generated dataset containing correct poses and depth ranges, exactly like the Blender dataset. I used it to train a NeRF successfully with the NeRF-pytorch code, but I failed to train one with torch-ngp.

I think it may be due to my wrong setting of "bound" & "scale" for this scene. Do you have any tips on how to tune "bound" & "scale" for a new scene?

@ashawkey

ashawkey commented Apr 28, 2022

@JasonLSC Hi, you can try to uncomment this line to check the poses. A proper dataset should have poses like in this example. You can also check colmap2nerf.py to find the code that properly transforms the poses.
If the poses look good, you can set a large bound (e.g., 8) or a small scale (e.g., 0.3), use the GUI to check whether the scene fits into the bound, and then adjust the bound or the scale.
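As a rough sanity check outside the GUI (this is a sketch, not code from torch-ngp; the pose matrices and values below are made up for illustration), you can scale the camera centers by `scale` and see whether they land inside the `[-bound, bound]^3` box:

```python
import numpy as np

# Hypothetical 4x4 camera-to-world matrices, one per view; only the
# translation column matters here. Four cameras on a ring of radius 4.
poses = np.stack([np.eye(4) for _ in range(4)])
poses[:, 0, 3] = [4.0, -4.0, 0.0, 0.0]
poses[:, 1, 3] = [0.0, 0.0, 4.0, -4.0]

def check_poses(poses, bound=1.0, scale=0.33):
    """Report the max scaled camera radius and whether all centers fit the bound."""
    centers = poses[:, :3, 3] * scale          # apply the dataset scale
    radius = np.linalg.norm(centers, axis=1)   # distance from the origin
    inside = bool(np.all(np.abs(centers) <= bound))
    return radius.max(), inside

r, ok = check_poses(poses, bound=1.0, scale=0.33)
print(f"max scaled camera radius = {r:.2f}, inside bound: {ok}")
```

If `inside` comes back `False`, either shrink `scale` or enlarge `bound` until the cameras (and the geometry they look at) fit.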

@JasonLSC

JasonLSC commented Apr 28, 2022

@ashawkey Hi, thank you for your instructions. The visualization of the poses looks like this:
[three screenshots: visualization of the camera poses]

I think it looks good? Or do you think the cameras are a little too far from the center point?

@JasonLSC

The depth range for each camera is [1.5 m, 25 m], and the radius of the camera rig is about 5 m.
The GUI visualization looks like this (bound = 1, scale = 0.33):
[GUI screenshot at bound = 1, scale = 0.33]
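A back-of-the-envelope check with the numbers quoted above (a sketch using this comment's values, not code from the repo) shows why a larger bound or a smaller scale may be needed here:

```python
# Values from this comment: far plane 25 m, camera rig radius 5 m,
# and the bound/scale used in the GUI screenshot.
far, rig_radius = 25.0, 5.0
bound, scale = 1.0, 0.33

scaled_rig = rig_radius * scale    # where the cameras end up after scaling
scaled_far = far * scale           # how far a ray would need to reach
print(scaled_rig, scaled_far)      # both exceed bound = 1

# To keep the full depth range inside the box, either shrink the scale:
needed_scale = bound / far         # 0.04 keeps the far plane inside bound = 1
# ...or keep scale = 0.33 and raise bound to at least far * scale (8.25 here).
```

Both the scaled rig radius (1.65) and the scaled far plane (8.25) fall outside bound = 1, which matches the advice above to try bound = 8.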

@ashawkey

Yes, the cameras look good. From the GUI it seems training works normally? If you want the rays to reach further, you can use a larger bound.

@JasonLSC

Hi @ashawkey, thank you for your suggestions. I can now train my own scene dataset with torch-ngp normally. At the end of training, the PSNR on training views converges to 26 dB (bound = 8, scale = 1). However, it converges to 30 dB when I use the NVIDIA instant-NGP testbed (aabb_scale = 8, scale = 1).

I have some questions about torch-ngp and instant-NGP:

  • Do 'bound' & 'scale' in your implementation correspond to 'aabb_scale' & 'scale' in instant-NGP?
  • Do you have any idea how to save rendered training views in the instant-NGP testbed?
  • Why is the desired resolution tied to the bound in the torch-ngp NeRF application? Is it the same in the instant-NGP implementation?

@ashawkey

There is still a large performance gap compared to the original implementation; it may take some more time to close it.

  1. Yes, they are designed to function similarly, but there are differences, as stated in the README.
  2. I'm not sure about this; you could ask in their repo.
  3. If you mean that we scale the desired resolution with the bound, yes; it is discussed in Insufficient hash resolution harming performance #23.
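Concretely, instant-ngp's multiresolution hash grid derives a per-level growth factor from the coarsest and finest resolutions, and the idea discussed in #23 is to grow the finest (desired) resolution with the bound so the cell size in world units stays roughly constant. The sketch below illustrates this; the constants (base resolution 2048, 16 levels, coarsest level 16) are illustrative assumptions, not necessarily the repo's exact defaults:

```python
import math

def per_level_scale(n_min, n_max, num_levels):
    """Growth factor b between consecutive hash-grid levels, so that
    level l has resolution floor(n_min * b**l)."""
    return math.exp((math.log(n_max) - math.log(n_min)) / (num_levels - 1))

# Illustrative: scale the finest resolution with the scene bound, so a
# scene that is 8x larger still gets roughly the same world-space cell size.
base, levels, n_min = 2048, 16, 16
for bound in (1, 8):
    n_max = base * bound
    b = per_level_scale(n_min, n_max, levels)
    print(f"bound={bound}: desired_resolution={n_max}, per_level_scale={b:.4f}")
```

Without this scaling, a bound of 8 would spread the same finest grid over an 8x larger box, which is the "insufficient hash resolution" problem described in #23.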

@ashawkey

Closed for now.
