- Introduction
- Project Goal
- Tech Stack
- Setup
- Key Challenges
- Hardware Development
- Control System Design
- ROS2-Teleoperation
- Localization and Odometry
- Motion Planning
- SLAM
- Future Development
Field exploration robots have gained significant attention in recent years due to their potential applications in domains such as agriculture, environmental monitoring, and autonomous driving research. These robots are designed to operate autonomously in diverse environments, collecting data and performing tasks that would otherwise be labor-intensive or hazardous for humans. This project focuses on developing a ROS-powered robotic vehicle designed for field exploration. The vehicle is capable of traversing varied ground surfaces autonomously and can be used for tasks such as autonomous field mapping, agricultural plant monitoring, and testing autonomous driving algorithms.
The primary goal of this project is to develop a ROS2-enabled four-wheel-drive vehicle with Ackermann steering, capable of waypoint navigation and obstacle avoidance. The vehicle carries various sensors and an onboard computer to perform Simultaneous Localization and Mapping (SLAM). The project encompasses hardware development, sensor integration, firmware and algorithm development, and extensive testing and validation to ensure the vehicle's performance and reliability in real-world scenarios.
- ROS 2 Jazzy: Middleware for robot software development.
- Raspberry Pi 5: Onboard computer for processing and communication.
- ESP32: Microcontroller for low-level motor and servo control, programmed using the Arduino IDE.
- Nav2: Navigation stack for ROS 2.
- RViz2: Visualization tool for ROS 2.
- Operating System: Ubuntu 24.04 LTS (Noble) or later
- ROS 2 Distribution: Jazzy Jalisco
- Hardware: Raspberry Pi 5, BMX160 IMU, YDLidar Tmini-Pro, ESP32, 7V LiPo battery
- Software: Arduino IDE, Python 3.12
- Install ROS 2 Jazzy:

  After installing, run the following once so every new shell sources the Jazzy setup file (the original snippet pointed at Humble by mistake):

  ```bash
  source /opt/ros/jazzy/setup.bash
  echo "source /opt/ros/jazzy/setup.bash" >> ~/.bashrc
  source ~/.bashrc
  ```
- Create a workspace, build it, and clone the repository:

  ```bash
  mkdir -p ~/ros2_ws/src   # create a workspace
  cd ~/ros2_ws
  colcon build --symlink-install
  source install/local_setup.bash
  cd ~/ros2_ws/src         # clone the repo inside the src folder
  git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
  ```

  To avoid sourcing the workspace every time you open a new terminal, run this once:

  ```bash
  echo "source ~/ros2_ws/install/setup.bash" >> ~/.bashrc
  source ~/.bashrc   # reload .bashrc to apply the change
  ```
- Flash SD Card
  - Download Ubuntu 24.04: Download the Ubuntu 24.04 image for Raspberry Pi from the official Ubuntu website.
  - Flash the SD Card: Use a tool such as Balena Etcher or Raspberry Pi Imager to flash the Ubuntu image to a 32 GB or 64 GB SD card.
- Install ROS 2 Jazzy
  - Follow the official ROS 2 Jazzy installation instructions in the ROS 2 documentation.
- Workspace Setup

  Create the ROS 2 workspace:

  ```bash
  mkdir -p ~/pi_ws/src
  cd ~/pi_ws
  colcon build --symlink-install
  source install/local_setup.bash
  echo "source ~/pi_ws/install/setup.bash" >> ~/.bashrc
  source ~/.bashrc   # reload .bashrc to apply the change
  ```

  Clone the repository:

  ```bash
  cd ~/pi_ws/src
  git clone https://github.com/TahsinOP/eYSIP-24_Field_Exploration_ROS_Vehicles.git
  ```
- USB serial permissions: add your user to the `dialout` group, then log out and back in for the change to take effect:

  ```bash
  sudo usermod -aG dialout $USER
  ```
- Install smbus:

  ```bash
  sudo apt update
  sudo apt install python3-smbus
  ```

- Launch the IMU node:

  ```bash
  ros2 launch imu_pkg imu_rviz_robot.launch.py
  ```

- Check the IMU data:

  ```bash
  ros2 topic echo /imu_raw
  ```
- Ensure the YDLidar is connected to a USB port and powered on.

- Launch the YDLidar node with the following command:

  ```bash
  ros2 launch ydlidar_ros2_driver ydlidar_launch.py
  ```

- Visualize the LiDAR scan data:

  ```bash
  ros2 launch rcar_viz display.launch.py
  ```

  This launches RViz2 with the robot model and the fixed frame set to `odom`.

- Optional: run IMU pose estimation to see the odometry data:

  ```bash
  ros2 run imu_pose_estimation estimator
  ```

  Alternatively, set the fixed frame to `base_link` in RViz to see the laser data on the `/scan` topic.
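As an optional sanity check on the IMU stream, the short node below converts the incoming orientation quaternion to a yaw angle and prints it. This is an illustrative sketch, not part of the repository; it assumes the driver publishes `sensor_msgs/Imu` on `/imu_data` (swap in `/imu_raw` if that is the topic your launch file uses).

```python
# imu_yaw_check.py -- illustrative sketch, not part of the repository.
# Assumes the IMU driver publishes sensor_msgs/Imu on /imu_data.
import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu


class ImuYawCheck(Node):
    def __init__(self):
        super().__init__('imu_yaw_check')
        self.create_subscription(Imu, '/imu_data', self.callback, 10)

    def callback(self, msg: Imu):
        q = msg.orientation
        # Yaw (Z-axis rotation) from the quaternion, standard conversion.
        yaw = math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                         1.0 - 2.0 * (q.y * q.y + q.z * q.z))
        self.get_logger().info(f'yaw: {math.degrees(yaw):.1f} deg')


def main():
    rclpy.init()
    rclpy.spin(ImuYawCheck())


if __name__ == '__main__':
    main()
```

If the yaw drifts while the robot sits still, that is the IMU drift discussed under Key Challenges below.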
- Low-level Control: The primary challenge we faced in low-level control was interfacing a custom servo with five pins for which no pin diagrams or references were available online. Achieving accurate control at this level required extensive experimentation and fine-tuning.
- Localization and Odometry: We relied solely on an IMU to estimate the robot’s pose for localization and odometry. The IMU’s orientation measurements exhibited significant drift over time. Additionally, the robot’s suspension system continuously affected the accelerometer's bias.
- Motion Planning: In motion planning, traditional control algorithms used for differential or omnidirectional drives were ineffective. This required the development of an approach tailored to the dynamics of an Ackermann steering system.
- Environment Mapping: For environment mapping, we had to adapt mapping techniques to work with Ackermann kinematics. This included ensuring accurate data collection from LIDAR and integrating these readings into a coherent map.

Figure 2: Vicon Odometry vs IMU Odometry
- The motor actuation and steering system features a `five-pin servo` controlled with `PID` for precise steering and a DC motor for throttle control.
- A custom `PCB` was developed to ensure optimal power supply to the `ESP32`, `motor drivers`, and `Raspberry Pi`.
- The mechanical design involved creating laser-cut CAD models in acrylic for a `two-layered` chassis with mounts for all hardware components, including the `LIDAR`.
- The `BMX160` IMU was interfaced and calibrated to publish accurate orientation and acceleration data on the ROS 2 network.
- The `Ydlidar Tmini-Pro` was interfaced with the `Raspberry Pi 5` on `ROS 2 Jazzy` to publish laser data on the `/scan` topic.
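For intuition, the snippet below sketches the kind of PID position loop used for the steering servo. It is a hedged illustration in Python, not the project's firmware (which runs in C++ on the ESP32); the gains, update rate, and readings are placeholder values, not the project's tuning.

```python
# pid_steering_sketch.py -- illustrative PID position loop for the steering
# servo. The real loop runs on the ESP32 in Arduino C++; gains, rate, and
# readings here are placeholder values, not the project's tuning.

class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target: float, measured: float) -> float:
        """One control step: return the drive effort toward `target`."""
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: one 100 Hz step toward a 20-degree steering target,
# with the servo feedback currently reading 12.5 degrees.
pid = PID(kp=2.0, ki=0.1, kd=0.05, dt=0.01)   # placeholder gains
effort = pid.update(target=20.0, measured=12.5)
```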
Figure 1: Prototype Vehicle
Figure 2: Control System Design
- Low-Level Control:
  - ESP32:
    - Role: Acts as the intermediary between the motor drivers and the Raspberry Pi 5.
    - Communication: Receives `angle` and `PWM` values via `UART Serial` from the Raspberry Pi 5.
    - Outputs: Sends control signals to the motor drivers.
  - Motor Drivers:
    - Role: Control the steering and throttle motors based on signals received from the ESP32.
  - Servo Motor (Steering):
    - Role: Adjusts the `steering angle` of the vehicle.
    - Control: Receives signals from the motor driver.
  - DC Motor (Throttle):
    - Role: Controls the `throttle` of the vehicle.
    - Control: Receives signals from the motor driver.
- Onboard Raspberry Pi 5 (Operating System: Ubuntu 24.04 LTS):
  - Optimized Pure Pursuit Algorithm:
    - Role: Calculates the desired steering `angle` based on the current `position` and the `goal`.
    - Output: Angle and PWM values sent to the ESP32 for motor control.
  - BMX160 IMU:
    - Role: Provides IMU data for odometry.
    - Connection: Connected via `GPIO` pins on the Raspberry Pi 5.
    - Data Published: `/imu_data` topic.
  - YDLidar Tmini Pro:
    - Role: Provides `LiDAR` data for SLAM.
    - Connection: Connected via `USB`.
    - Data Published: `/scan` topic.
  - Odometry (Modified Kalman Filter):
    - Role: Fuses `IMU` data with `Vicon`-derived car velocity (corresponding to PWM) to provide an accurate estimate of the vehicle's `position` and `velocity`.
    - Output: Publishes odometry data to the `/odom` topic.
- Remote PC (running ROS 2 Jazzy):
  - SLAM (Simultaneous Localization and Mapping):
    - Components: `EKF (Extended Kalman Filter)` and `AMCL (Adaptive Monte Carlo Localization)`.
    - Role: Uses LiDAR and odometry data to build and update a `map` of the environment.
    - Inputs: Receives the `/scan` and `/odom` topics.
  - Navigation Stack (Nav2):
    - Role: Handles `path planning` and `navigation`.
    - Inputs: Receives odometry data and processed map information.
    - Outputs: Publishes path data to the `/plan` topic.
  - Goal Node:
    - Role: Defines the navigation `goals` for the vehicle.
    - Output: Publishes `waypoints` to the `/goal_topic`.
- The Low-Level Control segment, managed by the ESP32, translates commands from the Raspberry Pi into physical movements of the vehicle.
- The Onboard Raspberry Pi 5 is the central processing unit for sensor data collection and preliminary processing (IMU, LiDAR), and it runs the Pure Pursuit algorithm for immediate steering adjustments.
- The Remote PC is responsible for advanced processing tasks, including SLAM for mapping and localization, and the Nav2 stack for path planning and navigation, ensuring the vehicle reaches its defined goals efficiently.
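To make the Pi-to-ESP32 UART link above concrete, here is one plausible way to frame and send the angle and PWM values from the Pi in Python with pyserial. The packet format, port name, and baud rate are assumptions for illustration; the repository's actual protocol may differ.

```python
# serial_command_sketch.py -- illustrative only; the framing, port name,
# and baud rate are assumptions, not the repository's actual protocol.
import serial  # pyserial

# /dev/ttyUSB0 and 115200 baud are assumed values for this sketch.
esp32 = serial.Serial('/dev/ttyUSB0', baudrate=115200, timeout=0.1)

def send_command(angle_deg: int, pwm: int) -> None:
    """Send one steering-angle / throttle-PWM pair as a newline-terminated
    ASCII record, e.g. "A:15,P:40\n", which the firmware would parse."""
    packet = f"A:{angle_deg},P:{pwm}\n"
    esp32.write(packet.encode('ascii'))

# Example: steer 15 degrees from center at a conservative 40 PWM throttle.
send_command(15, 40)
```

A simple newline-terminated ASCII frame like this is easy to parse on the ESP32 side and easy to debug with a serial monitor, which is why it is a common choice for this kind of link.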
Achieving complete low-level control of the car through keyboard commands using ROS2 for communication over a shared network is a significant milestone. This setup allows for real-time teleoperation, enabling users to control the car remotely using simple keyboard inputs. By implementing teleoperation nodes in ROS 2, commands can be sent to the robot, providing a responsive and reliable control system.
Implementation Steps:
- Install necessary ROS 2 packages for teleoperation.
- Write a ROS 2 node to capture keyboard inputs and publish velocity commands (a minimal sketch follows this list).
- Configure the robot to subscribe to these velocity commands and actuate the motors accordingly.
- Test the teleoperation setup to ensure reliable and responsive control.
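As a rough sketch of steps 2 and 3, the node below captures single keypresses and publishes velocity commands. It is illustrative only: it assumes a `geometry_msgs/Twist` message on `/cmd_vel`, which may differ from the topic and message type the `teleop_bot` package actually uses.

```python
# teleop_pub_sketch.py -- illustrative keyboard teleop publisher. The real
# node lives in the repository's teleop_bot package; this sketch assumes a
# geometry_msgs/Twist command on /cmd_vel, which may differ from the
# package's actual topic and message type.
import sys
import termios
import tty

import rclpy
from geometry_msgs.msg import Twist
from rclpy.node import Node

KEY_BINDINGS = {
    'w': (1.0, 0.0),   # forward
    's': (-1.0, 0.0),  # backward
    'a': (0.0, 1.0),   # turn left
    'd': (0.0, -1.0),  # turn right
    'h': (0.0, 0.0),   # halt
}

def get_key() -> str:
    """Read one keypress from stdin without waiting for Enter."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

def main():
    rclpy.init()
    node = Node('teleop_pub_sketch')
    pub = node.create_publisher(Twist, '/cmd_vel', 10)
    while (key := get_key()) != 'q':  # 'q' quits this sketch
        if key in KEY_BINDINGS:
            msg = Twist()
            msg.linear.x, msg.angular.z = KEY_BINDINGS[key]
            pub.publish(msg)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

Reading raw keypresses via `termios` avoids waiting for Enter, which keeps teleoperation responsive.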
How to Run the Teleop Node:

- SSH into the Raspberry Pi:

  ```bash
  ssh arms@192.168.0.171   # change the IP address and Pi username accordingly
  ```

- Run the subscriber node on the Pi or the remote PC:

  ```bash
  ros2 run teleop_bot sub
  ```

- Run the publisher node on the Pi (if the Pi is short on compute, run the publisher on the remote PC with the same command):

  ```bash
  ros2 run teleop_bot pub
  ```

You can now control the car using the `WASD` keys:
- `W` - Move Forward
- `A` - Turn Left
- `S` - Move Backward
- `D` - Turn Right
- `H` - Halt
Note: The `throttle` value has been kept constant for the time being; you can change it in the subscriber script in the `teleop_bot` package. Also, keep the car in the `25-50` PWM range for safe teleoperation.
- Modified Kalman Filter: Fused car velocity data (calculated from Vicon) with IMU accelerometer readings to get a more accurate state estimate (position and velocity) by reducing drift (a minimal sketch of the idea follows this list).
- Optimized Pure Pursuit Algorithm: Implemented an optimized version of this path-tracking algorithm to ensure appropriate steering angles using the lookahead-distance concept (see the second sketch below).
- SLAM: Mapped the environment using LiDAR scan data and fused it with odometry data. This, along with the Adaptive Monte Carlo Localization (AMCL) algorithm, provided an accurate pose (position and orientation) estimate of the car within the map.
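To make the fusion idea concrete, here is a hedged 1-D sketch of the predict/correct cycle: the IMU acceleration drives the prediction and a velocity measurement (standing in for the Vicon/PWM-derived speed) corrects it. All matrices and noise values are made-up placeholders, not the project's tuned filter.

```python
# kf_fusion_sketch.py -- 1-D illustration of the fusion idea, not the
# project's tuned filter; all noise values here are made-up assumptions.
import numpy as np

# State x = [position, velocity]^T with an acceleration input.
dt = 0.01                                   # 100 Hz IMU rate (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
B = np.array([[0.5 * dt**2], [dt]])         # IMU acceleration input
H = np.array([[0.0, 1.0]])                  # we measure velocity only
Q = np.eye(2) * 1e-4                        # process noise (assumed)
R = np.array([[1e-2]])                      # measurement noise (assumed)

x = np.zeros((2, 1))                        # initial [position; velocity]
P = np.eye(2)                               # initial covariance

def kf_step(accel: float, vel_meas: float):
    """One predict/update cycle: predict with IMU accel, correct with the
    velocity measurement (Vicon/PWM-derived speed in the real system)."""
    global x, P
    # Predict
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Update
    y = np.array([[vel_meas]]) - H @ x      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x.ravel()                        # [position, velocity]
```

Correcting with measured velocity bounds the growth of integration error, which is exactly the drift reduction described above.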
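The pure-pursuit steering law itself is compact enough to sketch: choose the first path point at least one lookahead distance away, then apply the standard curvature formula delta = atan(2 L sin(alpha) / Ld) for wheelbase L. The wheelbase and lookahead values below are illustrative placeholders, not the repository's tuning.

```python
# pure_pursuit_sketch.py -- textbook pure-pursuit steering law; parameter
# values are illustrative, not the repository's tuned implementation.
import math

WHEELBASE = 0.3    # Ackermann wheelbase L in metres (assumed)
LOOKAHEAD = 0.5    # lookahead distance Ld in metres (assumed)

def steering_angle(x, y, yaw, path):
    """Return the steering angle (rad) toward the first path point at
    least one lookahead distance away from the current pose."""
    # Find the target point on the path.
    target = path[-1]
    for px, py in path:
        if math.hypot(px - x, py - y) >= LOOKAHEAD:
            target = (px, py)
            break
    # Angle of the target relative to the vehicle heading.
    alpha = math.atan2(target[1] - y, target[0] - x) - yaw
    # Pure-pursuit curvature law: delta = atan(2 L sin(alpha) / Ld).
    return math.atan2(2.0 * WHEELBASE * math.sin(alpha), LOOKAHEAD)

# Example: straight path along +x, vehicle at the origin facing +x.
path = [(i * 0.1, 0.0) for i in range(50)]
print(steering_angle(0.0, 0.0, 0.0, path))  # ~0 rad on a straight path
```

The lookahead distance trades responsiveness against stability: shorter values track the path tightly but can oscillate, while longer values cut corners smoothly.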