
Horizontal planes not detected #14

Open
IoannisDadiotis opened this issue Jan 10, 2024 · 5 comments
@IoannisDadiotis

Hi,
I have just installed and built the packages in my catkin workspace. I am now trying to run the anymal example with my own robot and point cloud as input, i.e. after changing the /plane_seg/point_cloud_in topic name and fixed_frame_ here to match my case, I just run the launch files:

mon launch plane_seg_ros anymal.launch
mon launch plane_seg_ros view_plane_seg.launch

As you can see in the photo below, the horizontal planes are not segmented. Why is this happening, or what am I doing wrong?
Thank you
[image: RViz screenshot showing the horizontal planes left unsegmented]

@mauricefallon
Contributor

This code is quite old, but my recollection is that it uses the sensor/robot pose to determine the 'forward direction'.
The coordinate frame is assumed to be robotics convention: x-forward, y-left and z-up.
It looks for the vector of the x-axis.

I think the problem is that your robot's coordinates here are in the optical/CV frame.

@IoannisDadiotis
Author

Thank you for the response.

  • Is there a newer version of the code available? I found the repo since it has been cited by very recent papers.
  • I would like to segment all planes in the point cloud regardless of their orientation (not just for locomotion purposes). My impression is that in this case identifying the forward direction may not be needed. What do you think?

@IoannisDadiotis
Author

IoannisDadiotis commented Jan 10, 2024

In the code I cannot really understand which direction lookDir is meant to represent, or how it is calculated below:

Eigen::Vector3f convertRobotPoseToSensorLookDir(Eigen::Isometry3d robot_pose){
  Eigen::Quaterniond quat = Eigen::Quaterniond( robot_pose.rotation() );
  double r, p, y;
  quat_to_euler(quat, r, p, y);
  //std::cout << r*180/M_PI << ", " << p*180/M_PI << ", " << y*180/M_PI << " rpy in Degrees\n";
  double yaw = y;
  double pitch = -p;
  double xDir = cos(yaw)*cos(pitch);
  double yDir = sin(yaw)*cos(pitch);
  double zDir = sin(pitch);
  return Eigen::Vector3f(xDir, yDir, zDir);
}

Then the z-axis of the look pose (published in /plane_seg/look_pose) is set to coincide with lookDir (why?)

Eigen::Vector3f rz = lookDir;

Below the frames I have in my use case, where D435_head* is the reference frame of the pointcloud, odometry_world is the fixed_frame_ and the frame without name is the published look_pose.
[image: TF frames in RViz showing D435_head*, odometry_world and the unnamed look_pose frame]

@mauricefallon
Contributor

The lookDirection is the 3D forward vector from the robot's head, extracted from the head's quaternion.

In the case of your robot it would be a rotation of the optical_frame so that x+ is pointing forward (mostly) and z+ is pointing up (mostly).

As shown in this image:
[image: sketch of the optical frame rotated so that x+ points forward and z+ points up]

@IoannisDadiotis
Author

Then why is the z-axis of the look pose (published on /plane_seg/look_pose) set to coincide with lookDir in the code below?

Eigen::Vector3f rz = lookDir;

Shouldn't it be rx = lookDir instead?
