transform.json with several cameras??? #3057
Merged in db93476.
Did you run into the problem where the `transforms.json` file ends up with only two frames? I've been struggling with this issue for the past 2 days.
If that is the case with your `transforms.json`, can you share it? How many images (frames) do you have? Some of the parameters I would recommend adjusting to find more matches are:
If you don't find a solution, we are discussing this in issue #3081.
Sure, here is the `transforms.json`:
If you set the tag per frame instead of 1 for the whole set, it should work. Make sure you pull the latest changes (db93476).
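To make that concrete, here is a minimal sketch (placeholder values, not the converter's exact output), assuming the per-frame intrinsics keys mirror the top-level ones nerfstudio reads (`fl_x`, `fl_y`, `cx`, `cy`, `w`, `h`):

```python
import json

# Hypothetical example (placeholder values): intrinsics written per frame
# instead of once at the top level, so each frame can come from a different camera.
transforms = {
    "camera_model": "OPENCV",
    "frames": [
        {
            "file_path": "images/frame_00001.png",
            "fl_x": 1400.0, "fl_y": 1400.0,  # this camera's focal lengths (pixels)
            "cx": 960.0, "cy": 540.0,
            "w": 1920, "h": 1080,
            "transform_matrix": [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
        },
        {
            "file_path": "images/frame_00002.png",
            "fl_x": 2100.0, "fl_y": 2100.0,  # a different camera, different intrinsics
            "cx": 2000.0, "cy": 1500.0,
            "w": 4000, "h": 3000,
            "transform_matrix": [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]],
        },
    ],
}

with open("transforms.json", "w") as f:
    json.dump(transforms, f, indent=4)
```

With that layout, 18 frames from 18 cameras are simply 18 entries with different intrinsics values.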
I have been preprocessing datasets from distinct captures (different moments in time and camera sizes). I could only reconstruct this by using `hloc`, obtaining the `images.bin` and `cameras.bin`. However, I cannot export a `transforms.json` in which each frame comes from a distinct camera (e.g. 18 frames for 18 cameras). This error arises:
https://github.com/nerfstudio-project/nerfstudio/blob/main/nerfstudio/process_data/colmap_utils.py#L461
I commented out this assertion and the code runs; however, it considers only the first camera, so the output `transforms.json` is incorrect.
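For reference, here is a rough sketch of building per-frame intrinsics straight from the COLMAP binaries instead of assuming a single camera. It assumes nerfstudio's vendored COLMAP parsers live in `nerfstudio.data.utils.colmap_parsing_utils` (`read_cameras_binary`, `read_images_binary`, `qvec2rotmat`), handles only pinhole-style models, and skips the axis-convention flips the real `colmap_to_json` applies, so treat it as an illustration rather than a drop-in patch:

```python
import json
import numpy as np
from nerfstudio.data.utils.colmap_parsing_utils import (
    read_cameras_binary, read_images_binary, qvec2rotmat,
)

cameras = read_cameras_binary("sparse/0/cameras.bin")  # camera_id -> Camera
images = read_images_binary("sparse/0/images.bin")     # image_id -> Image

def intrinsics(cam):
    """Map COLMAP pinhole params onto per-frame keys; other models need their own handling."""
    if cam.model == "SIMPLE_PINHOLE":
        f, cx, cy = cam.params
        fx = fy = f
    elif cam.model == "PINHOLE":
        fx, fy, cx, cy = cam.params
    else:
        raise ValueError(f"unhandled camera model: {cam.model}")
    return {"fl_x": float(fx), "fl_y": float(fy), "cx": float(cx), "cy": float(cy),
            "w": int(cam.width), "h": int(cam.height)}

frames = []
for im in images.values():
    # COLMAP stores world-to-camera; invert to get camera-to-world.
    w2c = np.eye(4)
    w2c[:3, :3] = qvec2rotmat(im.qvec)
    w2c[:3, 3] = im.tvec
    c2w = np.linalg.inv(w2c)
    # NOTE: the real colmap_to_json also flips axes into nerfstudio's convention; omitted here.
    frame = {"file_path": f"images/{im.name}", "transform_matrix": c2w.tolist()}
    frame.update(intrinsics(cameras[im.camera_id]))  # each frame keeps its own camera's intrinsics
    frames.append(frame)

with open("transforms.json", "w") as f:
    json.dump({"camera_model": "OPENCV", "frames": frames}, f, indent=4)
```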
Here is the `cam_id_to_camera`:

and here the `frames`:

In https://github.com/autonomousvision/sdfstudio/blob/master/scripts/heritage_to_nerfstudio.py#L87 there is a workaround for this; however, it requires a config file with these parameters: radius, min_track_length, voxel_size, origin.
In https://github.com/InternLandMark/LandMark?tab=readme-ov-file#prepare-dataset they define this format of `transforms.json` for multi-focal SfM as well; see: