# OpenGS-SLAM
**Repository Path**: freecode01n/OpenGS-SLAM
## Basic Information
- **Project Name**: OpenGS-SLAM
- **Description**: AI_Robot_SLAM
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 1
- **Forks**: 0
- **Created**: 2025-04-01
- **Last Updated**: 2025-04-14
## README
**RGB-Only Gaussian Splatting SLAM for Unbounded Outdoor Scenes**

Sicheng Yu\* · Chong Cheng\* · Yifan Zhou · Xiaojun Yang · Hao Wang✉

The Hong Kong University of Science and Technology (Guangzhou)

(\* Equal Contribution)

ICRA 2025

[[Project page](https://3dagentworld.github.io/opengs-slam/)] [[arXiv](https://arxiv.org/abs/2502.15633)]
# Getting Started
## Installation
1. Clone OpenGS-SLAM.
```bash
git clone https://github.com/3DAgentWorld/OpenGS-SLAM.git --recursive
cd OpenGS-SLAM
```
2. Setup the environment.
```bash
conda env create -f environment.yml
conda activate opengs-slam
```
3. Compile the submodules for Gaussian splatting.
```bash
pip install submodules/simple-knn
pip install submodules/diff-gaussian-rasterization
```
4. Compile the cuda kernels for RoPE (as in CroCo v2 and DUSt3R).
```bash
cd croco/models/curope/
python setup.py build_ext --inplace
cd ../../../
```
Our test setup was:
- Ubuntu 20.04: `pytorch==2.1.0 torchvision==0.16.0 torchaudio==2.1.0 cudatoolkit=11.8`
- NVIDIA RTX A6000
## Checkpoints
Download the `DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth` checkpoint from the [DUSt3R](https://github.com/naver/dust3r) repository and save it to the `checkpoints` folder. Alternatively, download it directly:
```bash
mkdir -p checkpoints/
wget https://download.europe.naverlabs.com/ComputerVision/DUSt3R/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth -P checkpoints/
```
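An interrupted `wget` can leave a truncated file that only fails much later, at model-load time. The helper below is a hypothetical convenience for a quick sanity check, not part of the repository; the 1 MB size floor is an arbitrary assumption, not the checkpoint's real size.

```python
from pathlib import Path

def checkpoint_ready(path="checkpoints/DUSt3R_ViTLarge_BaseDecoder_512_dpt.pth",
                     min_bytes=1_000_000):
    """Return True if the checkpoint exists and is plausibly complete.

    A failed download often leaves an empty or tiny file (e.g. an HTML
    error page); a minimum-size floor catches that cheaply.
    """
    p = Path(path)
    return p.is_file() and p.stat().st_size >= min_bytes

if not checkpoint_ready():
    print("Checkpoint missing or truncated; re-run the wget command above.")
```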
Note that use of this checkpoint is subject to the DUSt3R license.
## Downloading Datasets
The processed data for the 9 Waymo segments can be downloaded via [Baidu](https://pan.baidu.com/s/1I1rnB6B8k2d4wzcRMT6gjA?pwd=omcg) or [Google Drive](https://drive.google.com/drive/folders/1xUyNuNzUtsvZIV_q5Qz9zIXMGoMbLuCr?usp=sharing).
Save data under the `datasets/waymo` directory.
## Run
```bash
# Run a single scene, taking segment 100613 as an example
CUDA_VISIBLE_DEVICES=0 python slam.py --config configs/mono/waymo/100613.yaml
# Run all 9 Waymo scenes
bash run_waymo.sh
```
## Demo
- To view the real-time interactive SLAM window, set `Results-use_gui` to `True` in `base_config.yaml`.
- When running on Ubuntu, a GUI window will pop up.
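If the `Results-use_gui` flag denotes a nested YAML key, the change would look roughly like this; the exact layout of `base_config.yaml` is an assumption, so check your copy for the actual nesting:

```yaml
# base_config.yaml (sketch -- the real file contains many more keys)
Results:
  use_gui: True   # enable the real-time interactive SLAM window
```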
## Running on other datasets
- Organize your data in the expected format and adapt the loading code in `utils/dataset.py`.
- A depth-map input interface is retained in the code, although the SLAM pipeline itself does not use depth.
# Acknowledgement
- This work builds on [3DGS](https://github.com/graphdeco-inria/gaussian-splatting), [MonoGS](https://github.com/muskie82/MonoGS), and [DUSt3R](https://github.com/naver/dust3r); we thank the authors for these great works.
- For more details about the demo, please refer to [MonoGS](https://github.com/muskie82/MonoGS), whose visualization code we use.
# Citation
If you find this code/work useful in your own research, please consider citing:
```bibtex
@article{yu2025rgb,
  title={{RGB}-only {Gaussian} {Splatting} {SLAM} for Unbounded Outdoor Scenes},
  author={Yu, Sicheng and Cheng, Chong and Zhou, Yifan and Yang, Xiaojun and Wang, Hao},
  journal={arXiv preprint arXiv:2502.15633},
  year={2025}
}
```