The idea of this project is to implement object recognition on an Intel NUC10i5FNB computer that is part of a car-like robot (image below). The robot moves on four wheels: the two front wheels are steered by a servo and the two rear wheels are each driven by a motor with a nominal voltage of 12 V. The whole platform is powered by a 5000 mAh GensAce Bashing lithium-polymer battery, which can be charged by connecting an external power supply to the dashboard. At the very top of the chassis there is a HAMA camera for image recording. The board with the programmed controller receives signals from the computer and distributes them to the wheels.
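The exact interface between the computer and the controller board is not described here. The following is only a minimal sketch, assuming that driving commands arrive as geometry_msgs/Twist messages on the cmd_vel topic (the topic used by teleop_twist_keyboard further below) and are forwarded to the board over a hypothetical serial link; the port name, baud rate and the steering/throttle text format are illustrative assumptions, not the real firmware protocol.

#!/usr/bin/env python
# Sketch of a bridge node: forwards cmd_vel commands to the controller board.
# The serial port, baud rate and "steering,throttle" text format are assumed.
import rospy
import serial
from geometry_msgs.msg import Twist

port = serial.Serial('/dev/ttyUSB0', 115200, timeout=0.1)  # assumed port and baud rate

def on_cmd_vel(msg):
    throttle = max(-1.0, min(1.0, msg.linear.x))   # forward/backward speed
    steering = max(-1.0, min(1.0, msg.angular.z))  # servo steering command
    port.write(('%.2f,%.2f\n' % (steering, throttle)).encode())

rospy.init_node('cmd_vel_bridge')
rospy.Subscriber('cmd_vel', Twist, on_cmd_vel)
rospy.spin()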
In the workspace, build the packages and source the environment:
catkin_make
source devel/setup.bash
To start inference with the chosen model, run roslaunch darknet_ros darknet_ros.launch in the terminal.
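Once darknet_ros is running, its detections can be consumed by other nodes. Below is a minimal sketch of such a listener, assuming the default output topic /darknet_ros/bounding_boxes and the darknet_ros_msgs/BoundingBoxes message type shipped with the package; the topic name may differ if the launch configuration was changed.

#!/usr/bin/env python
# Minimal sketch: print every detection published by darknet_ros.
import rospy
from darknet_ros_msgs.msg import BoundingBoxes

def on_detections(msg):
    for box in msg.bounding_boxes:
        rospy.loginfo('%s %.2f at [%d, %d, %d, %d]',
                      box.Class, box.probability,
                      box.xmin, box.ymin, box.xmax, box.ymax)

rospy.init_node('detection_listener')
rospy.Subscriber('/darknet_ros/bounding_boxes', BoundingBoxes, on_detections)
rospy.spin()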
To drive the robot from the keyboard, run rosrun teleop_twist_keyboard teleop_twist_keyboard.py in a separate terminal (a programmatic equivalent of these commands is sketched after the key list below).
The keys correspond to the following maneuvers:
• the u key drives forward with the front wheels turned to the right,
• the i key drives forward with the front wheels straight ahead,
• the o key drives forward with the front wheels turned to the left,
• the j key turns the front wheels to the right,
• the k key stops the movement,
• the l key turns the front wheels to the left,
• the m key drives backwards with the front wheels turned to the right,
• the , key drives backwards with the front wheels straight ahead,
• the . key drives backwards with the front wheels turned to the left.
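teleop_twist_keyboard translates these keys into geometry_msgs/Twist messages published on the cmd_vel topic. The sketch below publishes the equivalent of holding the i key (drive forward, wheels straight) for about two seconds and then stops; the speed value is only an illustration and must be adapted to the robot.

#!/usr/bin/env python
# Sketch: publish the kind of command teleop_twist_keyboard sends for the "i" key.
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('drive_forward_example')
pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
rospy.sleep(1.0)  # give the publisher time to connect

cmd = Twist()
cmd.linear.x = 0.5   # forward speed (illustrative value)
cmd.angular.z = 0.0  # no steering, front wheels straight

rate = rospy.Rate(10)
for _ in range(20):  # send the command for about two seconds
    pub.publish(cmd)
    rate.sleep()

pub.publish(Twist())  # zero command, equivalent to the k key (stop)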
Running rosrun rqt_graph rqt_graph should display the node graph shown below.
[1] Arguedas M., et al.: ROS OpenCV camera driver – https://github.com/OTL/cv_camera.
[2] Baltovski T., et al.: teleop_twist_keyboard – https://github.com/ros-teleop/teleop_twist_keyboard.
[3] Bjelonic M.: YOLO ROS: Real-Time Object Detection for ROS – https://github.com/leggedrobotics/darknet_ros.