PERCEPTION: Setting up sensors #252
Conversation
insanely late on this, but can you add pictures/documentation on what exactly this PR is doing?
uwrt_mars_rover_drivetrain/uwrt_mars_rover_drivetrain_description/config/sensor_parameters.yaml
uwrt_mars_rover_drivetrain/uwrt_mars_rover_drivetrain_description/urdf/sensors.macro.xacro
also, can you leave the robot with one stereo camera on the front, one stereo camera on the back facing backwards, as well as one lidar?
where should the lidar be?
```xml
<pointCloudCutoff>0.5</pointCloudCutoff>
<pointCloudCutoffMax>3.0</pointCloudCutoffMax>
<distortionK1>0</distortionK1>
<distortionK2>0</distortionK2>
<distortionK3>0</distortionK3>
<distortionT1>0</distortionT1>
<distortionT2>0</distortionT2>
<CxPrime>0</CxPrime>
<Cx>0</Cx>
<Cy>0</Cy>
<focalLength>0</focalLength>
<hackBaseline>0</hackBaseline>
```
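For context, tags like these normally sit inside the `<plugin>` element of a Gazebo Classic depth-camera sensor. A minimal sketch of that wrapper is below; the sensor name, link reference, frame name, and the exact plugin filename are illustrative assumptions, not values taken from this PR:

```xml
<gazebo reference="camera_link">  <!-- link name is an assumption -->
  <sensor type="depth" name="depth_camera">
    <update_rate>20</update_rate>
    <camera>
      <horizontal_fov>1.047</horizontal_fov>
      <image>
        <width>640</width>
        <height>480</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.05</near>
        <far>8.0</far>
      </clip>
    </camera>
    <plugin name="depth_camera_controller" filename="libgazebo_ros_openni_kinect.so">
      <!-- the pointCloudCutoff / distortion / intrinsic tags shown above go here -->
      <frameName>camera_depth_frame</frameName>  <!-- illustrative frame name -->
    </plugin>
  </sensor>
</gazebo>
```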
Some of these parameters might matter, but we can keep them like this for now. Let's just keep in mind that we kept a bunch of 0s for the intrinsic camera calibration matrix (including the focal length). No action needed on this comment.
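If a real calibration is added later, those zeros would be replaced by measured intrinsics. Purely illustrative (not measured) values for a 640x480 camera with a ~60° horizontal FOV might look like the following; every number here is an assumption for demonstration:

```xml
<!-- all values illustrative, not from a real calibration -->
<distortionK1>0.05</distortionK1>
<distortionK2>-0.12</distortionK2>
<distortionK3>0.0</distortionK3>
<distortionT1>0.001</distortionT1>
<distortionT2>-0.0007</distortionT2>
<CxPrime>320.5</CxPrime>
<Cx>320.5</Cx>
<Cy>240.5</Cy>
<!-- fx ≈ (width/2) / tan(hfov/2) = 320 / tan(0.5235) ≈ 554 -->
<focalLength>554.25</focalLength>
```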
Yea, looks good to me. As long as you are sure we aren't hitting any major errors with the sensors while testing, and that the front/back cameras look good (their general positioning looks good to me, and the back one has the 180 deg yaw I wanted to see), we can merge. Feel free to merge if you are confident. Good work!
In `drivetrain.urdf.xacro`, you can use the macro `default_depth_camera_frame` or `default_lidar_frame` to simulate either a depth camera or a lidar sensor in Gazebo. The sensor parameters are configured in `sensor_parameters.yaml`.
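As a sketch of how these macros might be invoked from `drivetrain.urdf.xacro` (the parent link, origins, and argument names are assumptions for illustration, not the macros' actual signatures):

```xml
<xacro:include filename="$(find uwrt_mars_rover_drivetrain_description)/urdf/sensors.macro.xacro"/>

<!-- front-facing depth camera (argument names illustrative) -->
<xacro:default_depth_camera_frame parent="base_link" name="front_camera">
  <origin xyz="0.3 0 0.2" rpy="0 0 0"/>
</xacro:default_depth_camera_frame>

<!-- rear depth camera, yawed 180 deg to face backwards -->
<xacro:default_depth_camera_frame parent="base_link" name="back_camera">
  <origin xyz="-0.3 0 0.2" rpy="0 0 3.14159"/>
</xacro:default_depth_camera_frame>

<!-- lidar on one corner of the drivetrain -->
<xacro:default_lidar_frame parent="base_link" name="lidar">
  <origin xyz="0.3 0.3 0.25" rpy="0 0 0"/>
</xacro:default_lidar_frame>
```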
Gazebo showing the lidar rays from the sensor on one corner of the drivetrain (note there is a cube camera in the center of the drivetrain)
In RViz, these topics are published by the sensors; adding them to RViz will bring up a visual. (Note: if you get a "no image found" error, try ros-visualization/rqt#187 (comment).)
Example camera image frame (we added some random objects to the world so there is something to "see")