Transformation Matrix From Custom Data #1360
UPDATE: So I managed to make some progress. In my pybullet simulation environment, I have the coordinate system shown in the first attached image. In Instant Ngp's camera frame, I have the coordinate system shown in the second attached image (it is kind of hard to tell, but Z points straight out of the camera). Therefore the conversion should be (X, Y, Z) -> (Z, -X, -Y).
During the simulation, I capture multiple images and their corresponding transformation matrices. Here is a quick video that demonstrates the motion: Screencast.from.05-26-2023.12.15.02.PM.webm
It goes from left to right in a semi-circular motion. I am able to replicate this motion (see the rendered result), but the rotation is completely off. It seems odd to me that the translation would be correct while the rotation wouldn't be. Here is the matrix conversion code I used to get the above image:
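(The exact code did not survive in this copy of the thread; the snippet below is only an illustrative sketch of the (X, Y, Z) -> (Z, -X, -Y) remap described above, applied to a 4x4 camera-to-world pose. The matrix and function names are made up for illustration.)

```python
import numpy as np

# Illustrative change of basis implementing (x, y, z) -> (z, -x, -y).
AXIS_REMAP = np.array([
    [ 0,  0, 1, 0],   # new x = old z
    [-1,  0, 0, 0],   # new y = -old x
    [ 0, -1, 0, 0],   # new z = -old y
    [ 0,  0, 0, 1],
], dtype=np.float64)

def remap_pose(c2w_pybullet: np.ndarray) -> np.ndarray:
    """Re-express a 4x4 camera-to-world pose in the remapped world frame."""
    return AXIS_REMAP @ c2w_pybullet
```

One thing worth checking with a remap like this: left-multiplying changes the world axes (and therefore the translation), while a mismatch in the camera's own axis convention needs a separate right-multiplied correction that leaves the translation untouched. That asymmetry can produce exactly the "translation correct, rotation wrong" symptom.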
I don't know why this is so hard to do, but I'm struggling pretty hard with this.
We might be struggling with what boils down to the same problem. My theory is that the camera poses found using colmap2nerf.py are flipped/rotated, but that they then get further transformed when loaded into instant-ngp... I haven't been able to find out what that conversion is and can't seem to find the answer anywhere. I am not well versed in C++, so I am not confident about finding the answer by reading through the .cu files... edit: there are functions in
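(For context, here is an unverified Python sketch of what instant-ngp appears to do to each transforms.json pose after loading; I believe the corresponding C++ lives in the nerf loader code, but that location and the exact constants are assumptions to be checked against the source.)

```python
import numpy as np

def nerf_matrix_to_ngp(pose: np.ndarray, scale: float = 0.33,
                       offset=(0.5, 0.5, 0.5)) -> np.ndarray:
    """Assumed behaviour of instant-ngp's internal pose conversion (unverified).

    pose: 3x4 or 4x4 camera-to-world matrix straight from transforms.json.
    """
    m = np.array(pose, dtype=np.float64)[:3, :4]
    m[:, 1] *= -1                                   # flip the camera's y axis
    m[:, 2] *= -1                                   # flip the camera's z axis
    m[:, 3] = m[:, 3] * scale + np.asarray(offset)  # shrink and recentre the scene
    return m[[1, 2, 0], :]                          # cycle world axes x,y,z -> y,z,x
```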
@Student204161 I actually found the issue in my code, which was a simple matrix inversion along with some scaling. You can take a look at my whole GitHub repo; hopefully it helps: https://github.com/FezTheImmigrant/pybullet_playground. The file worth looking into is kuka.py.
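(A minimal sketch of the kind of fix described - invert the view matrix, then scale - assuming the pose comes from pybullet.computeViewMatrix, which returns a column-major world-to-camera matrix, whereas transforms.json expects camera-to-world. For the actual code, see kuka.py in the repo above.)

```python
import numpy as np

def view_to_c2w(view_matrix_flat, scene_scale: float = 1.0) -> np.ndarray:
    """Turn a pybullet view matrix into a camera-to-world pose (sketch)."""
    # pybullet returns 16 floats in column-major (OpenGL) order.
    view = np.asarray(view_matrix_flat, dtype=np.float64).reshape(4, 4).T
    c2w = np.linalg.inv(view)       # world-to-camera -> camera-to-world
    c2w[:3, 3] *= scene_scale       # rescale so the scene fits in ngp's unit cube
    return c2w
```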
Thanks for the reply - I'm glad you worked out your problem. I fortunately solved mine some days ago... I ended up spending almost a month on it, but I'm just glad it works now. I had some code errors as well, but in the end I was able to convert the raw volumes from the nerf coordinate system into the ngp coordinate system (and so I was able to do the reprojection). Have a good day.
I am using a Pybullet simulation environment to generate a sequence of images along with camera poses as input to Instant-Ngp in the form of the transforms.json file; however, I cannot figure out how to convert between the Pybullet coordinate system and the Instant-Ngp coordinate system.
The pybullet camera that I am using has the coordinate system shown in the first attached figure.
In my mind, this should be a conversion between the two camera coordinate systems, where Instant Ngp uses the coordinate system shown in the second attached figure.
Therefore, something like this should suffice:
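(The snippet below is only a hypothetical illustration of what "something like this" might look like: a fixed change-of-basis matrix applied to every camera pose. The actual entries depend on the two conventions in the figures, so treat the signs and permutation as placeholders.)

```python
import numpy as np

# Placeholder change of basis between the two camera conventions; the exact
# signs/permutation depend on how the pybullet and Instant-Ngp axes line up.
PYBULLET_TO_NGP_CAM = np.array([
    [1,  0,  0, 0],
    [0, -1,  0, 0],   # e.g. flip y if one convention is y-up and the other y-down
    [0,  0, -1, 0],   # e.g. flip z if one looks along +z and the other along -z
    [0,  0,  0, 1],
], dtype=np.float64)

def pybullet_pose_to_ngp(c2w_pybullet: np.ndarray) -> np.ndarray:
    # Right-multiplying changes the camera's own axis convention without
    # moving the camera in the world.
    return c2w_pybullet @ PYBULLET_TO_NGP_CAM
```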
Admittedly, my understanding of linear algebra is comically terrible. If somebody could at least guide me in the right direction, I would really appreciate it. I've read through every open/closed issue that discusses the weird coordinate system that Instant Ngp uses, but I haven't found any answers to be satisfactory.