Is there a demand on the precision of Transform? #5
I have not benchmarked the approach against transformation noise, however there will be a strong correlation between the estimated transformation accuracy and calibration accuracy. In this style of calibration the translation estimate usually suffers much more from noise than the rotation estimate as it is less easily observed. |
I obtained a point cloud from a VLP-32C and a sequence of 6-DOF poses from another sensor, but when I run this project it does not seem to produce the right result. Maybe I am not using it correctly. Can I send the data to you? |
Sure if you send me a link to the data I will have a look and see if I can find the issue. |
Did you get it working in the end? |
I will try it again, thank you. If I have any questions, I may ask you again. |
From the error message it seems like either no points were loaded or they had non-finite values. I just added a check that should prevent the system crashing; see #13 |
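The check described above can be sketched as follows. This is a minimal illustration in Python, not the project's actual C++ code: it drops any point containing a NaN or infinite coordinate before the cloud is handed to a KD-tree.

```python
import math

def filter_nonfinite(points):
    # Keep only points whose coordinates are all finite, so an
    # empty or corrupt cloud cannot crash downstream processing.
    return [p for p in points if all(math.isfinite(c) for c in p)]

cloud = [(1.0, 2.0, 3.0), (float("nan"), 0.0, 0.0), (4.0, float("inf"), 5.0)]
clean = filter_nonfinite(cloud)
# clean == [(1.0, 2.0, 3.0)]
```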
I just realized you said you were running on the bag I gave as an example previously. I am a little unsure how to debug now, as I just ran it on my system with the new version and it exited successfully with the following output
Could you give a similar full roslaunch output? |
I get this result. The bag is example.bag.

pw@pw-cowa:~/lidar-align$ roslaunch lidar_align lidar_align.launch
started roslaunch server http://pw-cowa:43353/

SUMMARY

PARAMETERS

NODES

ROS_MASTER_URI=http://localhost:11311

process[lidar_align-1]: started with pid [761] |
And when I run the version at 77cc771, the problem disappears. |
Hmm, a lot of changes have happened since that PR, but none that impact the point cloud processing. I will try to get some time to look at what it could be tomorrow. |
I have committed a bag file to GitHub; can you take a look and give me a result? Thank you. |
Tried the bag and it ran without issue on my PC, maybe try [...]. I turned the [...]. One big thing to note is that this is a motion-based calibration technique, so to get a 6-DOF transform out you need to give a 6-DOF transform in. In your bag the system is largely undergoing planar motion and the roll and pitch values of your transform are always zero, so it is really only a 3-DOF input. This makes the transformation between the sensors unobservable for some parameters, and so the optimizer finds a random minimum based only on noise. Because of this I think only the yaw value, and maybe x and y, are usable from the output. One other point of concern is that I had to ramp the [...]. The final calibration parameters I got are:
|
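The unobservability point above can be illustrated numerically. This sketch (mine, not the project's code) shows that under purely planar, yaw-only motion, shifting the lidar extrinsic along z moves every merged point by exactly the same amount for every pose, so an alignment-based cost function is flat in z and the optimizer cannot recover it.

```python
import math

def yaw_rot(yaw, p):
    # Rotate a point about the z axis only (planar motion).
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = p
    return (c * x - s * y, s * x + c * y, z)

def to_world(pose_yaw, pose_t, extr_t, pt):
    # world = R_pose * (pt + t_extrinsic) + t_pose, with a yaw-only R_pose
    lx = tuple(a + b for a, b in zip(pt, extr_t))
    rx = yaw_rot(pose_yaw, lx)
    return tuple(a + b for a, b in zip(rx, pose_t))

pt = (1.0, 2.0, 0.5)
poses = [(0.3, (1.0, 0.0, 0.0)), (1.1, (2.0, 1.0, 0.0))]
diffs = []
for yaw, t in poses:
    w0 = to_world(yaw, t, (0.1, 0.2, 0.0), pt)
    w1 = to_world(yaw, t, (0.1, 0.2, 0.7), pt)  # extrinsic z shifted by 0.7
    diffs.append(tuple(b - a for a, b in zip(w0, w1)))
# every diff is (0, 0, 0.7): the whole merged cloud shifts rigidly,
# so no alignment cost can distinguish the two extrinsics
```

With roll and pitch excursions in the trajectory, the z offset would instead be rotated differently at each pose and become observable.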
I tried two ways, "catkin build" and "catkin_make", to compile the package, then roslaunched the .launch file; both result in "[pcl::KdTreeFLANN::setInputCloud] Cannot create a KDTree with an empty input cloud!" |
My ROS version is "kinetic", yours is melodic; does it matter? |
Tried it on a kinetic version and it turns out that was the difference. There was a divide-by-zero error, and either PCL or Eigen had changed how they handled it in melodic, so I didn't see it. The bug has been fixed in #14; thanks for the help in tracking it down. |
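The class of bug described above is worth guarding against explicitly rather than relying on library behavior. A minimal sketch (my own, not the actual fix in #14): normalizing a vector with an explicit zero-length guard, so a degenerate input yields a well-defined result instead of NaNs whose handling varies across PCL/Eigen versions.

```python
import math

def safe_normalize(v, eps=1e-12):
    # Return the unit vector, or the zero vector if the input is
    # (numerically) zero-length, avoiding a divide-by-zero.
    n = math.sqrt(sum(c * c for c in v))
    if n < eps:
        return (0.0, 0.0, 0.0)
    return tuple(c / n for c in v)

safe_normalize((3.0, 0.0, 4.0))  # → (0.6, 0.0, 0.8)
safe_normalize((0.0, 0.0, 0.0))  # → (0.0, 0.0, 0.0), no NaNs
```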
Thank you very much. |
If I provide a relatively accurate translation x, y, z and give a minimum range, can the result be better? |
The main issue is you are giving a 3-dof transform. To get accurate results you need to give a full 6-dof pose taken while the system undergoes highly non-planar motion. High accuracy also requires accurate time-stamping. |
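On the time-stamping point above: poses and lidar scans come from different sensors with different clocks, so each scan timestamp needs a pose interpolated from the surrounding odometry samples, and clock error translates directly into pose error during motion. A hedged sketch of translation interpolation (function names are mine, not the project's API):

```python
import bisect

def interp_translation(times, translations, t_query):
    # Linearly interpolate the translation at t_query from a sorted
    # list of timestamped samples; clamp outside the covered range.
    i = bisect.bisect_left(times, t_query)
    if i == 0:
        return translations[0]
    if i == len(times):
        return translations[-1]
    t0, t1 = times[i - 1], times[i]
    a = (t_query - t0) / (t1 - t0)
    p0, p1 = translations[i - 1], translations[i]
    return tuple((1 - a) * x0 + a * x1 for x0, x1 in zip(p0, p1))

times = [0.0, 1.0, 2.0]
trans = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
mid = interp_translation(times, trans, 0.5)
# mid == (0.5, 0.0, 0.0)
```

Rotations would need spherical interpolation (slerp) rather than component-wise blending; the linear case is shown only for brevity.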
Hi Zachary, I have also tried your optimizer and was wondering how much movement is required in all directions. As some reference I was using a bag file that I have created from the KITTI dataset and used the output of the OxTs robot_localization as the odom source (world -> base_link). However, the results are not consistent with the transform from the dataset calibration (tf: base_link -> velo_link). Do you think moving in a car is suitable for a proper calibration or is it merely 3-DoF? Thanks anyways for the work. |
No translational motion is needed, however large changes in roll, pitch and yaw must be provided or the calculated translation will have large errors. Any system that moves exclusively along a planar surface cannot be accurately calibrated by this method. |
Hi Zach, thanks for the quick response. Okay, this seems reasonable. However, if I merely do a local optimization using the "correct" transformation from the tf as the initial guess, the results should be stable around this initial guess, right? Maybe I am doing something wrong? Is the rotation of the initial guess provided in radians or degrees? |
The cost function will always change by a small amount for different inputs due to the limited accuracy of floating point numbers. Because of this if an element is unobservable I think it will be pretty random where the local optimization ends up. The rotational part of the initial guess is generated by taking the rotation in angle-axis form and multiplying the normalized axis by the angle in radians. This is the angle representation used internally by the optimizer. |
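The angle-axis construction described above (normalized axis scaled by the angle in radians) can be sketched as a quaternion-to-rotation-vector conversion. This is an illustrative stand-alone implementation, not the project's code:

```python
import math

def quat_to_rotvec(qw, qx, qy, qz):
    # Convert a unit quaternion to the axis-angle vector: the
    # normalized rotation axis multiplied by the angle in radians.
    qw = max(-1.0, min(1.0, qw))   # clamp against tiny numerical drift
    angle = 2.0 * math.acos(qw)
    s = math.sqrt(max(0.0, 1.0 - qw * qw))
    if s < 1e-9:                   # angle ~ 0: axis is arbitrary
        return (0.0, 0.0, 0.0)
    return (qx / s * angle, qy / s * angle, qz / s * angle)

# 90-degree rotation about z: q = (cos 45°, 0, 0, sin 45°)
rv = quat_to_rotvec(math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
# rv ≈ (0, 0, pi/2)
```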
Hi Zachary, |
@miracle629 I think it's likely that some NaNs are still getting into the optimization and causing this. I will have a look and see if I can work out how. Which dataset are you using it on? |
Dear Zach, Cheers |
You can download a dataset I tested things on at https://drive.google.com/open?id=11fUwbVnvej4NZ_0Mntk7XJ2YrZ5Dk3Ub, however I think it is probably far from an optimal example as it's just me holding the lidar and randomly waving it around. Rovio was used to get the transforms from a VI sensor I had taped to the lidar's base. |
Hi Zachary, |
The output will just take the same units as the input; the only constraint is that the input pointclouds and poses are measured in the same units. It is possible that under extreme conditions the floating-point representation will fail to correctly transform the data, but I don't believe that would be happening for inputs of this magnitude. The error occurs if none of the points in the pointclouds are transformed to finite coordinates. I check for and remove NaNs and infs on the input, so the transformation class must have an issue with this data. |
Either that or the transformations are not being read correctly and so are all invalid; I just realized I don't do many checks that the loaded poses are valid.
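The missing pose checks mentioned above could look like this. A minimal sketch (names and tolerance are my own choices): reject a pose whose translation or quaternion contains non-finite values, or whose quaternion is badly non-unit.

```python
import math

def pose_is_valid(t, q, tol=1e-3):
    # t: (x, y, z) translation; q: (qx, qy, qz, qw) quaternion.
    # A valid pose has only finite components and a near-unit quaternion.
    vals = list(t) + list(q)
    if not all(math.isfinite(v) for v in vals):
        return False
    norm = math.sqrt(sum(c * c for c in q))
    return abs(norm - 1.0) < tol

pose_is_valid((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))           # → True
pose_is_valid((float("nan"), 0.0, 0.0), (0.0, 0.0, 0.0, 1.0))  # → False
pose_is_valid((0.0, 0.0, 0.0), (0.0, 0.0, 0.0, 0.5))           # → False
```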
Hi, I want to use this tool too. My IMU and VLP-16 both work, but how can I make a bag with odom data?
@ZacharyTaylor, hello author, I'm not sure whether you can see my question, but I hope so. I think maybe the difference comes from the platform, which makes the threads not work. If you have any advice, please help, thanks! |
Has this problem been solved?
@ZacharyTaylor, hello, can you provide me with a dataset for testing?