Workshop 11 ‐ Object counting
First, synchronise your fork with the main repository on GitHub (Sync fork), as some structural changes were made and new files were added. Then clone or pull the repository to your local PC and re-open it in the dev container using VSC. Rebuild the updated packages with `colcon build --symlink-install` and source the workspace with `source install/setup.bash`.
The repository contains a new node, `counter_3d.py`, demonstrating how to count detected objects in global coordinates using a simple filter that prevents double counting. The node subscribes to the `object_location` topic and keeps track of all detected objects. Each new detection is first checked for its distance to all objects counted so far; if it lies close to an existing object (distance below `detection_threshold`), it is ignored. This allows the robot to detect objects from multiple viewpoints without registering multiple counts.
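The following is a minimal sketch of this filtering logic, not the actual implementation in `counter_3d.py`. It assumes the `object_location` topic carries `geometry_msgs/PoseStamped` messages in the global frame and that `detection_threshold` is given in metres; the real node may use a different message type and frame.

```python
# Minimal sketch of the double-counting filter (assumptions: PoseStamped
# messages on object_location, positions in a global frame, threshold in metres).
import math

import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class SimpleCounter(Node):
    def __init__(self):
        super().__init__('simple_counter')
        self.detection_threshold = 0.2  # metres; tune for object size and robot speed
        self.counted = []               # (x, y, z) positions already registered
        self.create_subscription(PoseStamped, 'object_location', self.object_callback, 10)

    def object_callback(self, msg):
        p = msg.pose.position
        # Ignore the detection if it lies close to any object counted so far.
        for (x, y, z) in self.counted:
            if math.dist((p.x, p.y, p.z), (x, y, z)) < self.detection_threshold:
                return
        self.counted.append((p.x, p.y, p.z))
        self.get_logger().info(f'object count: {len(self.counted)}')


def main():
    rclpy.init()
    rclpy.spin(SimpleCounter())


if __name__ == '__main__':
    main()
```

A plain Euclidean-distance check like this works well when objects are well separated; if two real objects sit closer together than the threshold, they will be merged into a single count.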
- To see how the counter works, launch the simulator with the default environment: `ros2 launch limo_gazebosim limo_gazebo_diff.launch.py`.
- Insert a number of small red objects in Gazebo and place them around the robot so that not all of them are immediately visible. You might need to manually move the robot and the objects to a location with more free space around them. Remember that you can copy and paste one object to avoid editing multiple objects.
- Run the detector node by issuing `ros2 run rob2002_tutorial detector_3d`. This time, the detector does not visualise the image-processing steps, to reduce processing delays, but you can change that by setting `visualisation=True` if required.
- Run the counter node by issuing `ros2 run rob2002_tutorial counter_3d`. The node prints out a full list of all objects in the terminal.
- You can also visualise the counted objects in rviz. Run rviz, change the `Fixed Frame` to `odom`, add a `Robot Model` display with its `Description Topic` set to `/robot_description`, and then use `Add/By topic` to add `/object_count_array`. You should see arrows indicating the locations of individual objects.
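The arrows displayed in rviz suggest a pose-array style message on `/object_count_array`; the exact message type used by `counter_3d.py` is not shown here, so treat the following as an illustrative sketch assuming `geometry_msgs/PoseArray` published in the `odom` frame.

```python
# Illustrative sketch: publishing counted object positions for rviz, assuming
# a geometry_msgs/PoseArray on /object_count_array in the odom frame (the
# actual message type used by counter_3d.py may differ).
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Pose, PoseArray


class CountPublisher(Node):
    def __init__(self):
        super().__init__('count_publisher')
        self.pub = self.create_publisher(PoseArray, '/object_count_array', 10)
        self.counted = [(1.0, 0.5, 0.0), (2.0, -0.3, 0.0)]  # example positions
        self.create_timer(1.0, self.publish_counts)

    def publish_counts(self):
        msg = PoseArray()
        msg.header.frame_id = 'odom'  # must match the Fixed Frame selected in rviz
        msg.header.stamp = self.get_clock().now().to_msg()
        for (x, y, z) in self.counted:
            pose = Pose()
            pose.position.x, pose.position.y, pose.position.z = x, y, z
            msg.poses.append(pose)
        self.pub.publish(msg)


def main():
    rclpy.init()
    rclpy.spin(CountPublisher())


if __name__ == '__main__':
    main()
```

The `frame_id` must match the Fixed Frame selected in rviz, otherwise the arrows will not appear at the expected positions.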
The new counter allows the robot to register objects from different viewpoints.
- Run the keyboard teleoperation node `ros2 run teleop_twist_keyboard teleop_twist_keyboard` and slowly move the robot around so that you also register those objects which were outside of the robot's field of view.
- The counter's key parameter is `detection_threshold`. Set it to different values and note the counter's behaviour with different robot speeds and object sizes (see the parameter sketch after this list).
- Try the counter on the real robot with real colour objects. You will need to set the `real_robot` variable in the detector to `True` and adjust the ranges of the colour filters (see the colour-filter sketch after this list). Try it out with odometry first and then use map-based navigation.
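How `detection_threshold` is set depends on the implementation of `counter_3d.py`; if the value is hard-coded, simply edit it in the source. The hypothetical sketch below shows how a node could expose it as a ROS 2 parameter instead, which would allow overriding it at start-up with `ros2 run rob2002_tutorial counter_3d --ros-args -p detection_threshold:=0.3` (this only works if the node actually declares the parameter).

```python
# Hypothetical sketch: exposing detection_threshold as a ROS 2 parameter.
# counter_3d.py may instead hard-code the value, in which case edit the source.
import rclpy
from rclpy.node import Node


class TunableCounter(Node):
    def __init__(self):
        super().__init__('tunable_counter')
        # Declare the parameter with a default value (assumed to be in metres).
        self.declare_parameter('detection_threshold', 0.2)
        self.detection_threshold = self.get_parameter('detection_threshold').value
        self.get_logger().info(f'detection_threshold = {self.detection_threshold}')


def main():
    rclpy.init()
    rclpy.spin(TunableCounter())


if __name__ == '__main__':
    main()
```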
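The colour filters live in the detector's image-processing code; the function name and thresholds below are illustrative only, not the values used in `detector_3d`. If the detector filters in HSV space (an assumption), note that red wraps around the hue axis, so a robust filter usually combines two hue bands:

```python
# Illustrative HSV colour filter for red objects; thresholds are examples,
# not the values used in detector_3d, and will need tuning for real images.
import cv2
import numpy as np


def red_mask(bgr_image):
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine a low-hue and a high-hue band.
    lower = cv2.inRange(hsv, np.array([0, 100, 50]), np.array([10, 255, 255]))
    upper = cv2.inRange(hsv, np.array([170, 100, 50]), np.array([180, 255, 255]))
    return cv2.bitwise_or(lower, upper)
```

Tune the saturation and value bounds for the real camera and lighting conditions; simulated colours are typically much more uniform than real ones.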