# DRL-robot-navigation

**The Melodic version is deprecated and will not be updated in the future.**

Deep Reinforcement Learning for mobile robot navigation in the ROS Gazebo simulator. Using a Twin Delayed Deep Deterministic Policy Gradient (TD3) neural network, a robot learns to navigate to a random goal point in a simulated environment while avoiding obstacles. Obstacles are detected from laser readings, and the goal is given to the robot in polar coordinates. Training is carried out in the ROS Gazebo simulator with PyTorch. Tested with ROS Melodic on Ubuntu 18.04 with Python 3.6.9 and PyTorch 1.10. To use the package with ROS Noetic, refer to the Noetic branch.

Training example:

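As a rough, illustrative sketch of the state representation described above (not the exact code from this repository), the snippet below bins a laser scan into per-sector minimum distances and appends the goal expressed in polar coordinates relative to the robot. The bin count, maximum range, and all names here are assumptions made for the example.

```python
import numpy as np

def build_state(laser_ranges, robot_xy, robot_yaw, goal_xy,
                num_bins=20, max_range=10.0):
    """Illustrative state: binned laser distances + goal in polar coordinates."""
    # Replace NaN/inf readings and clip to the sensor's usable range, then
    # keep the closest obstacle distance in each angular sector.
    ranges = np.clip(np.nan_to_num(np.asarray(laser_ranges), nan=max_range,
                                   posinf=max_range), 0.0, max_range)
    laser_state = np.array([b.min() for b in np.array_split(ranges, num_bins)])

    # Goal in polar coordinates: distance to goal and heading error, wrapped to [-pi, pi].
    dx, dy = goal_xy[0] - robot_xy[0], goal_xy[1] - robot_xy[1]
    distance = np.hypot(dx, dy)
    angle = np.arctan2(dy, dx) - robot_yaw
    angle = np.arctan2(np.sin(angle), np.cos(angle))

    return np.concatenate([laser_state, [distance, angle]])
```

The environment wrapper in the repository may include additional terms in the state (for example the previous action), so treat this only as an illustration of the laser and polar-goal components mentioned above.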
**ICRA 2022 and IEEE RA-L paper**

More information about the implementation is available [here](https://ieeexplore.ieee.org/document/9645287?source=authoralert). Please cite as:
```
@ARTICLE{9645287,
  author={Cimurs, Reinis and Suh, Il Hong and Lee, Jin Han},
  journal={IEEE Robotics and Automation Letters},
  title={Goal-Driven Autonomous Exploration Through Deep Reinforcement Learning},
  year={2022},
  volume={7},
  number={2},
  pages={730-737},
  doi={10.1109/LRA.2021.3133591}}
```

Main dependencies:

* [ROS Melodic](http://wiki.ros.org/melodic/Installation)
* [PyTorch](https://pytorch.org/get-started/locally/)

Clone the repository:
```shell
$ cd ~
### Clone this repo
$ git clone https://github.com/reiniscimurs/DRL-robot-navigation
```

The network can be run with a standard 2D laser, but this implementation uses a simulated [3D Velodyne sensor](https://github.com/lmark1/velodyne_simulator).

Compile the workspace:
```shell
$ cd ~/DRL-robot-navigation/catkin_ws
### Compile
$ catkin_make_isolated
```

Open a terminal and set up sources:
```shell
$ export ROS_HOSTNAME=localhost
$ export ROS_MASTER_URI=http://localhost:11311
$ export ROS_PORT_SIM=11311
$ export GAZEBO_RESOURCE_PATH=~/DRL-robot-navigation/catkin_ws/src/multi_robot_scenario/launch
$ source ~/.bashrc
$ cd ~/DRL-robot-navigation/catkin_ws
$ source devel_isolated/setup.bash

### Run the training
$ cd ~/DRL-robot-navigation/TD3
$ python3 velodyne_td3.py
```

To kill the training process:
```shell
$ killall -9 rosout roslaunch rosmaster gzserver nodelet robot_state_publisher gzclient python python3
```

Gazebo environment:

Rviz:
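
For readers new to the method, the following is a minimal, hypothetical sketch of a TD3-style actor that maps the state described at the top of this README to a linear and an angular velocity command. The layer sizes, activation choices, and rescaling are assumptions for illustration and are not taken from `velodyne_td3.py`.

```python
import torch
import torch.nn as nn

class Actor(nn.Module):
    """Illustrative TD3 actor: state -> (linear velocity, angular velocity)."""
    def __init__(self, state_dim, action_dim=2, hidden=512):
        super().__init__()
        # Two hidden layers with a tanh output so actions land in [-1, 1];
        # the widths here are assumptions for the sketch.
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, action_dim), nn.Tanh(),
        )

    def forward(self, state):
        return self.net(state)

# Example: a 22-dimensional state (20 laser bins + goal distance and angle).
actor = Actor(state_dim=22)
action = actor(torch.randn(1, 22))
# The normalized action would then be rescaled to the robot's velocity limits
# before being published as a velocity command.
```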