diff --git a/README.md b/README.md
index fa5811c0a5d5930568fce4c6437ebb737f0a0b63..63138ca320a28cf660c21f5defd40040deca25d9 100644
--- a/README.md
+++ b/README.md
@@ -8,6 +8,9 @@ https://user-images.githubusercontent.com/18318646/197722969-0aaf8670-0783-48bb-
 
 The HD video is available on [Bilibili](https://www.bilibili.com/video/BV1Ze41137kx/?vd_source=78d041dc03a4aac231b5cac62feffc70).
 
+github: [https://github.com/PengYu-Team/S3E](https://github.com/PengYu-Team/S3E)
+gitee: [https://gitee.com/PengYu-Team/S3E](https://gitee.com/PengYu-Team/S3E)
+
 ## Abstract:
 
 With the growing demand to employ a team of robots to perform tasks collaboratively, the research community has become increasingly interested in collaborative simultaneous localization and mapping (SLAM). Unfortunately, existing datasets are limited in the scale and variation of the collaborative trajectories they capture, even though generalization across the trajectories of different agents is crucial to the overall viability of collaborative tasks. To help align the research community's contributions with real-world multi-agent coordinated SLAM problems, we introduce S3E, a novel large-scale multimodal dataset captured by a fleet of unmanned ground vehicles along four designed collaborative trajectory paradigms. S3E consists of 7 outdoor and 5 indoor scenes, each exceeding 200 seconds, comprising well-synchronized and calibrated high-quality stereo camera, LiDAR, and high-frequency IMU data. Crucially, our effort exceeds previous attempts regarding dataset size, scene variability, and complexity. It has 4x as much average recording time as the pioneering EuRoC dataset. We also provide careful dataset analysis as well as baselines for collaborative SLAM and its single-robot counterparts.