PERCEPTION: Setting up sensors #252

Merged: 3 commits merged into comp-2024 from meshvad/comp-2024-sensorSim on Sep 29, 2023

Conversation

@meshvaD commented Jul 21, 2023

  • In drivetrain.urdf.xacro, you can use the macro default_depth_camera_frame or default_lidar_frame to simulate either a depth camera or a lidar sensor in Gazebo (see the sketch after this list).
  • All sensor parameters, such as dimensions and relative positions, are in sensor_parameters.yaml.
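A minimal sketch of how these macros might be invoked from drivetrain.urdf.xacro. The macro and file names are from this PR; the include path, the package name robot_description, the argument names (camera_name, parent_link, xyz, rpy), and the YAML keys are hypothetical placeholders for illustration, not the repo's exact signatures:

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="drivetrain">
  <!-- hypothetical path to where the sensor macros are defined -->
  <xacro:include filename="$(find robot_description)/urdf/sensors.urdf.xacro"/>

  <!-- hypothetical: pull dimensions and offsets out of sensor_parameters.yaml -->
  <xacro:property name="sensor_params"
      value="${xacro.load_yaml('$(find robot_description)/config/sensor_parameters.yaml')}"/>

  <!-- depth camera in the center of the drivetrain (argument names assumed) -->
  <xacro:default_depth_camera_frame
      camera_name="front_camera"
      parent_link="base_link"
      xyz="${sensor_params['front_camera']['xyz']}"
      rpy="0 0 0"/>

  <!-- lidar on one corner of the drivetrain (argument names assumed) -->
  <xacro:default_lidar_frame
      lidar_name="corner_lidar"
      parent_link="base_link"
      xyz="${sensor_params['lidar']['xyz']}"
      rpy="0 0 0"/>
</robot>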

Gazebo showing the lidar rays from the sensor on one corner of the drivetrain (note there is a cube camera in the center of the drivetrain):
[Screenshot from 2023-09-13 22-43-08]

In RViz, these topics are published by the sensors; adding them to RViz will bring up a visual. (Note: if you get a "no image found" error, try ros-visualization/rqt#187 (comment).)
[Screenshot from 2023-09-13 22-45-53]

For example, the image camera frame (we added some random objects to the world so there is something to "see"):
[Screenshot from 2023-09-13 23-00-27]

@meshvaD meshvaD linked an issue Jul 21, 2023 that may be closed by this pull request
@nico-palmar (Contributor)

Insanely late on this, but can you add pictures/documentation on what exactly this PR is doing?

@nico-palmar (Contributor)

Also, can you leave the robot with one stereo camera on the front, one stereo camera on the back facing backwards, as well as one lidar?

@meshvaD (Author) commented Sep 19, 2023

Also, can you leave the robot with one stereo camera on the front, one stereo camera on the back facing backwards, as well as one lidar?

where should the lidar be?

Comment on lines +62 to +73
<!-- min/max depth (m) for points kept in the generated point cloud -->
<pointCloudCutoff>0.5</pointCloudCutoff>
<pointCloudCutoffMax>3.0</pointCloudCutoffMax>
<!-- radial (K) and tangential (T) lens distortion coefficients; all 0 = no distortion -->
<distortionK1>0</distortionK1>
<distortionK2>0</distortionK2>
<distortionK3>0</distortionK3>
<distortionT1>0</distortionT1>
<distortionT2>0</distortionT2>
<!-- principal point and focal length overrides; 0 leaves them to the plugin defaults -->
<CxPrime>0</CxPrime>
<Cx>0</Cx>
<Cy>0</Cy>
<focalLength>0</focalLength>
<!-- stereo baseline offset applied to the right camera's projection matrix -->
<hackBaseline>0</hackBaseline>
Contributor

Some of these parameters might matter, but we can keep them like this for now. Let's just keep in mind that we kept a bunch of 0s for the intrinsic camera calibration matrix (including focal lengths). Don't need to do anything about this comment.
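For context, these tags feed the pinhole intrinsics the plugin publishes on camera_info, i.e. the matrix K = [fx 0 cx; 0 fy cy; 0 0 1]. If I recall the gazebo_ros camera plugin behavior correctly (worth verifying), a focalLength of 0 makes it derive the focal length from the image width and horizontal FOV, roughly fx = width / (2 * tan(hfov / 2)), so the zeros here fall back to computed defaults rather than a truly degenerate calibration.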

@nico-palmar (Contributor)

Yeah, looks good to me. As long as you are sure we aren't hitting any major errors with the sensors while testing, and that the front/back cameras look good (their general positioning looks good to me, and the back one has the 180 deg yaw I wanted to see), then we can merge. Feel free to merge if you are confident. Good work!
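For reference, a minimal sketch of what the backwards-facing camera could look like in the xacro; the macro name is from this PR, while the argument names and offsets are hypothetical placeholders:

<!-- back stereo camera: placeholder xyz offset behind the drivetrain center, -->
<!-- yawed by pi (xacro's built-in constant) so it faces backwards -->
<xacro:default_depth_camera_frame
    camera_name="back_camera"
    parent_link="base_link"
    xyz="-0.3 0 0.2"
    rpy="0 0 ${pi}"/>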

@meshvaD meshvaD merged commit d917392 into comp-2024 Sep 29, 2023
5 of 9 checks passed
@meshvaD meshvaD deleted the meshvad/comp-2024-sensorSim branch September 29, 2023 00:40
Successfully merging this pull request may close these issues.

PERCEPTION: Setting up sensors
2 participants