S3E: A Multi-Robot Multimodal Dataset for Collaborative SLAM

Dapeng Feng1, Yuhua Qi1, Shipeng Zhong1, Zhiqiang Chen1,

Qiming Chen2, Hongbo Chen1, Jin Wu3, and Jun Ma4

1Sun Yat-sen University

2South China Agricultural University

3The Hong Kong University of Science and Technology

4The Hong Kong University of Science and Technology (Guangzhou)

Accepted by IEEE Robotics and Automation Letters (RA-L)

Abstract

The burgeoning demand for collaborative robotic systems to execute complex tasks collectively has intensified the research community's focus on advancing simultaneous localization and mapping (SLAM) in a cooperative context. Despite this interest, the scalability and diversity of existing datasets for collaborative trajectories remain limited, especially in scenarios with constrained perspectives where the generalization capabilities of Collaborative SLAM (C-SLAM) are critical for the feasibility of multi-agent missions. Addressing this gap, we introduce S3E, an expansive multimodal dataset. Captured by a fleet of unmanned ground vehicles traversing four distinct collaborative trajectory paradigms, S3E encompasses 13 outdoor and 5 indoor sequences. These sequences feature meticulously synchronized and spatially calibrated data streams, including 360-degree LiDAR point clouds, high-resolution stereo imagery, high-frequency inertial measurement unit (IMU) readings, and Ultra-wideband (UWB) relative observations. Our dataset not only surpasses previous efforts in scale, scene diversity, and data intricacy but also provides a thorough analysis and benchmarks for both collaborative and individual SLAM methodologies. For access to the dataset and the latest information, please visit our repository at https://pengyu-team.github.io/S3E.


Teaser: Dormitory 1 sequence.



System Overview

The S3E dataset is an extensive collection of multi-robot, multimodal data. It captures a diverse array of cooperative trajectory patterns across both outdoor and indoor environments, supporting in-depth analysis of robotic interactions and collaborative behaviors in different operational contexts.

sensor layout
Mobile Platform Sensor Layout and Coordinate Systems.

The S3E dataset was compiled with a focus on high temporal precision when synchronizing data from a diverse array of sensors. Each sensor is calibrated to a unified timescale, so the multi-sensory data can be integrated seamlessly. This precision is achieved through an advanced time synchronization mechanism, which is crucial for accurate co-registration and integration of data across modalities.
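
As a concrete illustration of what this synchronization enables downstream, the sketch below pairs messages from two sensor streams by nearest header timestamp. This is a minimal example rather than part of the official tooling; the stream names, rates, and the 5 ms tolerance are all illustrative assumptions.

import numpy as np

def associate(stamps_a, stamps_b, max_dt=0.005):
    """Pair each stamp in A with its nearest stamp in B within max_dt seconds.

    stamps_a, stamps_b: sorted 1-D arrays of header stamps in seconds.
    Returns a list of (index_in_a, index_in_b) pairs.
    """
    idx = np.searchsorted(stamps_b, stamps_a)        # insertion points in B
    idx = np.clip(idx, 1, len(stamps_b) - 1)         # keep both neighbours valid
    left, right = stamps_b[idx - 1], stamps_b[idx]
    nearest = np.where(stamps_a - left < right - stamps_a, idx - 1, idx)
    dt = np.abs(stamps_b[nearest] - stamps_a)
    return [(i, int(j)) for i, (j, d) in enumerate(zip(nearest, dt)) if d <= max_dt]

# Example: 10 Hz camera frames against LiDAR sweeps with a 2 ms offset.
cam_stamps = np.arange(0.0, 1.0, 0.1)
lidar_stamps = cam_stamps + 0.002
pairs = associate(cam_stamps, lidar_stamps)          # all ten frames matched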

sensor layout
Payload Sensor and Ground Truth Device Specifications.

Our mobile platforms are available in two specialized configurations, each designed for specific operational requirements and offering the flexibility needed for diverse research and application scenarios.

  • S3Ev1.0: Customized for indoor operations, this version features a compact design for high maneuverability, allowing it to traverse tight indoor spaces with ease.
  • S3Ev2.0: Tailored for outdoor missions, this version is equipped with a robust chassis and rugged wheels for reliable performance on challenging terrain.

Dataset

The S3E dataset has three key attributes:

  • Multi-robot Multimodal Dataset: S3E is a pioneering C-SLAM dataset collected by three ground robots, each equipped with a 16-beam 3D laser scanner, two high-resolution color cameras, a 9-axis IMU, a UWB receiver, and a dual-antenna RTK receiver. Notably, it is the first C-SLAM dataset to include UWB relative distance measurements, offering a novel dimension for research.
  • Trajectory Paradigms: S3E covers a broad spectrum of trajectory paradigms that simulate the collaborative scenarios agents are likely to encounter during SLAM operations. This makes the dataset instrumental for studying and improving multi-agent systems in diverse and complex environments.
  • Diverse Challenging Environments: S3E spans a diverse set of challenging environments that closely mirror real-world C-SLAM conditions, including dynamic scenes with moving objects, prolonged operational durations, perceptual aliasing, indoor settings, and aggressive motion. This diversity lets researchers assess not only performance metrics but also the adaptability and overall robustness of C-SLAM algorithms across a variety of operational contexts.

Trajectory Paradigms

paradigms

Collaborative Trajectory Paradigms

Note

  • (a) The Concentric Circles paradigm is ideal for missions demanding extensive coverage and detailed exploration of a defined area. It supports tight collaboration and data fusion among multiple unmanned platforms, which operate in a coordinated formation.
  • (b) The Intersecting Circles paradigm is tailored for distributed search and rescue operations. It allows unmanned platforms to expand their search area while enhancing mission efficiency through the exchange of perceptual information at points of intersection.
  • (c) The Intersection Curve paradigm is designed for large-scale distributed exploration, patrol, and mapping. It minimizes cumulative mapping errors by facilitating regular encounters and loop closures, which help to refine and align the collective map.
  • (d) The Rays paradigm is suited for scenarios where individual unmanned platforms conduct independent exploration and mapping over a vast area. Each platform operates autonomously, leveraging its own capabilities without reliance on continuous communication or data exchange with others. A toy geometric sketch of all four paradigms follows this list.
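
The sketch below makes the geometry of the four paradigms concrete by generating 2D waypoints for each pattern. The radii, lengths, and robot counts are arbitrary illustrative choices; the actual S3E trajectories were driven by real UGVs, not generated from these formulas.

import numpy as np

def concentric_circles(n_robots=3, base_r=5.0, n_pts=200):
    """(a) Circles of different radii around a shared centre."""
    t = np.linspace(0.0, 2.0 * np.pi, n_pts)
    return [np.c_[r * np.cos(t), r * np.sin(t)]
            for r in base_r * np.arange(1, n_robots + 1)]

def intersecting_circles(n_robots=3, r=5.0, n_pts=200):
    """(b) Equal circles with offset centres so the loops overlap."""
    t = np.linspace(0.0, 2.0 * np.pi, n_pts)
    return [np.c_[r * np.cos(t) + i * r, r * np.sin(t)] for i in range(n_robots)]

def intersecting_curves(n_robots=3, length=40.0, n_pts=400):
    """(c) Shared forward progress with periodic crossings (loop closures)."""
    x = np.linspace(0.0, length, n_pts)
    return [np.c_[x, 2.0 * np.sin(x / 4.0 + 2.0 * np.pi * i / n_robots)]
            for i in range(n_robots)]

def rays(n_robots=3, length=20.0, n_pts=200):
    """(d) Independent straight runs radiating from a common start."""
    s = np.linspace(0.0, length, n_pts)
    return [np.c_[s * np.cos(a), s * np.sin(a)]
            for a in np.linspace(0.0, 2.0 * np.pi, n_robots, endpoint=False)]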

Scenarios

Data Sample

Outdoor Data Sample

Data Sample

Indoor Data Sample

Note

  • Dormitory: Characterized by high pedestrian traffic and the presence of dynamic objects such as pedestrians and bicycles, dormitory areas challenge the perception and tracking abilities of unmanned platforms. Their regular architectural layouts offer an ideal setting for evaluating the precision and consistency of C-SLAM algorithms across multiple platforms.
  • Campus Road: Serving as critical connectors, campus roads are marked by long distances and expansive views. They test the endurance and large-scale exploration capabilities of unmanned platforms. Extensive data collection along these roads, including long-distance and multi-cycle datasets, provides a robust foundation for assessing the stability, accuracy, and efficiency of C-SLAM algorithms over extended operations.
  • Playground: As an open area with few obstructions, the playground challenges feature extraction, registration, and optimization. Data collected at different times of day, including day and night, evaluates the adaptability of C-SLAM algorithms to varying lighting conditions, while rapid-motion runs assess their performance under aggressive motion.
  • Laboratory: Indoor environments like laboratories, with their confined spaces, complex layouts, and rich semantic content, challenge navigation, obstacle avoidance, and semantic mapping. The presence of various instruments and furniture makes laboratories suitable for testing the advanced scene understanding and mapping capabilities of C-SLAM algorithms.
  • Teaching Building and Tunnel: These areas, with their severe perceptual aliasing, pose significant challenges due to poor lighting in tunnels and similar geometric structures in corridors, which can lead to errors in data association. They test the robustness of C-SLAM algorithms in maintaining accurate positioning and mapping.

Data format

The S3E dataset is distributed as ROS2 bag files, a format that allows efficient data management and playback, enabling users to replay and analyze the dataset in various software environments. To simplify replay and give seamless access to the synchronized sensor streams, we merge the sensor data collected by the Alpha, Bob, and Carol robots during the same operational sequence into a single ROS2 bag file.
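
The sketch below shows one way to iterate over a single robot's messages in such a merged bag using the standard rosbag2_py API (a sourced ROS2 environment is required). The '/Alpha/' topic prefix, the bag path, and the sqlite3 storage backend are assumptions for illustration; consult the topic table below for the actual names.

import rosbag2_py
from rclpy.serialization import deserialize_message
from rosidl_runtime_py.utilities import get_message

def read_robot_messages(bag_path, prefix='/Alpha/'):
    """Yield (topic, stamp_ns, message) for one robot from a merged bag."""
    reader = rosbag2_py.SequentialReader()
    reader.open(
        rosbag2_py.StorageOptions(uri=bag_path, storage_id='sqlite3'),
        rosbag2_py.ConverterOptions(input_serialization_format='cdr',
                                    output_serialization_format='cdr'),
    )
    # Map each topic name to its message type so messages can be deserialized.
    type_map = {t.name: t.type for t in reader.get_all_topics_and_types()}
    while reader.has_next():
        topic, raw, stamp_ns = reader.read_next()
        if topic.startswith(prefix):
            yield topic, stamp_ns, deserialize_message(raw, get_message(type_map[topic]))

# Example: list Alpha's messages from a (hypothetically named) sequence.
for topic, stamp_ns, msg in read_robot_messages('Playground_1'):
    print(topic, stamp_ns)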

The ROS2 topics, along with an explanation of each message, are listed below:

rostopic

Note

The table below provides a detailed breakdown of the individual data fields contained in each Ultra-Wideband (UWB) message. Each field is described to clarify what it represents and how it is intended to be used in analyses.

uwb
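
As a simple example of how these fields can be used, the sketch below compares one UWB range reading against the inter-robot distance implied by the RTK ground truth. The function and variable names here are hypothetical; map them onto the actual message fields listed in the table above before use.

import numpy as np

def range_residual(p_i, p_j, uwb_range):
    """Difference between a UWB range reading and the ground-truth distance.

    p_i, p_j : 3-D ground-truth positions of the two robots (same frame, metres).
    uwb_range: distance reported by the UWB receiver (metres).
    """
    return uwb_range - float(np.linalg.norm(np.asarray(p_i) - np.asarray(p_j)))

# Made-up numbers: robots roughly 10 m apart, UWB reads 10.07 m.
res = range_residual([0.0, 0.0, 0.5], [10.0, 0.2, 0.5], 10.07)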

Download

Important

The dataset consists of ROS2 bag files and their corresponding ground truth pose data. These files are available for download from Dataset.

Tip

For users who are not familiar with ROS2, we offer a development toolkit designed to facilitate the extraction of data from ROS2 bag files. This toolkit can be accessed and downloaded from rosbag_extractor.
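
If installing ROS2 at all is undesirable, the third-party rosbags Python package (pip install rosbags) can also read ROS2 bags directly. The snippet below is an independent sketch, not part of the rosbag_extractor toolkit, and the bag path and IMU topic suffix are assumed placeholders.

from pathlib import Path
from rosbags.highlevel import AnyReader

# Open a bag directory and stream all IMU messages without a ROS installation.
with AnyReader([Path('Playground_1')]) as reader:
    imu_conns = [c for c in reader.connections if c.topic.endswith('imu/data')]
    for connection, stamp_ns, rawdata in reader.messages(connections=imu_conns):
        msg = reader.deserialize(rawdata, connection.msgtype)
        print(stamp_ns, msg.angular_velocity.x)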

Qualitative results

Known Issues

While the S3E dataset represents a significant contribution to the field of C-SLAM, it is not without its limitations. Here, we acknowledge and elaborate on some of the known issues associated with the dataset:

1. Scalability Concerns

Although the S3E dataset is large-scale, it may not fully encapsulate the scalability challenges that C-SLAM systems could face in environments with a significantly higher number of agents or within more expansive operational spaces. The current dataset size and structure may not reflect the complexities of very large-scale deployments.

2. Limited Robot Platform Diversity

The dataset has been captured using a specific type of UGV. This could potentially limit the generalizability of the findings to other robot platforms with different kinematic and dynamic properties. The performance of C-SLAM algorithms may vary across diverse robot morphologies and sensing capabilities.

3. Environmental Coverage

The S3E dataset provides a broad range of environments, but it may not cover all possible real-world scenarios. Particularly, it might lack representation of environments with unique conditions or extreme weather situations that could affect the performance of C-SLAM systems. The absence of such conditions could limit the robustness of algorithms tested with this dataset in all possible operational environments.

4. Sensor Synchronization and Calibration

Despite the meticulous synchronization and calibration processes, maintaining perfect synchronization across all sensors in every dynamic scenario can be challenging. There might be minor discrepancies that could affect the accuracy of the fused dataset, especially in highly dynamic or rapidly changing environments.

Acknowledgments

This research greatly benefited from the guidance and expertise of many contributors. We extend our profound gratitude to our colleagues, Yizhen Yin and Haoxin Zhang from Sun Yat-sen University, for significantly improving our dataset's quality and applicability, especially in data collection. Special acknowledgment goes to Prof. Tao Jiang and his student Yudu Jiao from Chongqing University for evaluating the single-agent SLAM algorithms on our dataset.

Citation

@ARTICLE{feng2024s3e,
    author={Feng, Dapeng and Qi, Yuhua and Zhong, Shipeng and Chen, Zhiqiang and Chen, Qiming and Chen, Hongbo and Wu, Jin and Ma, Jun},
    journal={IEEE Robotics and Automation Letters}, 
    title={S3E: A Multi-Robot Multimodal Dataset for Collaborative SLAM}, 
    year={2024},
    volume={},
    number={},
    pages={1-8},
    keywords={Multi-Robot SLAM;Data Sets for SLAM;SLAM},
    doi={10.1109/LRA.2024.3490402}
}