Datasets

BASEPROD: The Bardenas Semi-Desert Planetary Rover Dataset

MaRTA rover at the Bardenas test campaign

The dataset was acquired in Bardenas Reales, Northern Spain, in July 2023 and released in 2024. Planetary rover datasets usually include information from cameras, LiDARs, and inertial sensors. The Planetary Robotics Lab (PRL) of ESA’s Automation and Robotics Section and the Space Robotics Lab (SRL) of the University of Malaga (UMA) went further, recording rover sensor data that also includes thermal information, wheel force/torque measurements, and Laser-Induced Breakdown Spectroscopy (LIBS) samples of rocks along the rover traverse.

ARCHES Mount Etna Dataset (AMEDS)

Interact roving on the slopes of Mount Etna

This dataset was acquired during a test campaign performed on the slopes of Mount Etna in the summer of 2022. The test campaign was organised by DLR as part of the ARCHES project (Autonomous Robotic Networks to Help Modern Societies), in collaboration with ESOC and ESA’s Human Robot Interaction Lab (HRI). The HRI’s Interact rover was used to perform a series of 9 traverses and 2 rock-picking experiments, during which the data from the onboard sensors was recorded and converted to produce a multi-disciplinary robotic dataset from a lunar analogue environment. The dataset contains raw sensor data, processed data, and ground-truth models, so that anyone can develop, tune, and test algorithms for localisation, visual odometry, laser odometry, object detection, and 3D mapping.
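Because the dataset ships with ground-truth models, estimated trajectories can be scored against them. As a minimal sketch (the dataset does not prescribe a particular metric; the function name and the toy trajectories below are illustrative), one common choice is the absolute trajectory error, the RMSE of per-pose position error after the two trajectories have been associated by timestamp:

```python
import numpy as np

def absolute_trajectory_error(estimated: np.ndarray, ground_truth: np.ndarray) -> float:
    """RMSE of per-pose position error between an estimated trajectory and
    ground truth. Both inputs are Nx3 arrays of xyz positions, assumed to be
    already timestamp-associated and expressed in the same reference frame."""
    diffs = estimated - ground_truth
    return float(np.sqrt(np.mean(np.sum(diffs ** 2, axis=1))))

# Toy example: straight-line ground truth, estimate with a constant 0.1 m lateral drift.
gt = np.column_stack([np.linspace(0.0, 9.0, 10), np.zeros(10), np.zeros(10)])
est = gt + np.array([0.0, 0.1, 0.0])
print(absolute_trajectory_error(est, gt))  # 0.1
```

A real evaluation would first align the estimate to the ground-truth frame (e.g. with a rigid-body fit over the matched poses) before computing the error.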

Katwijk Beach Planetary Rover Dataset

HDPR roving along the beach of Katwijk

This data collection corresponds to a field test performed in the beach area of Katwijk in The Netherlands (52°12’N 4°24’E). Two separate test runs were performed and the data has been divided for these two runs. The first, which consists of a ~1.5km-long traverse, focuses on global localisation by matching correspondences between features detectable from “orbital” (aerial) images and those seen by the rover during its traverse. The second run is a shorter traverse focusing on enahnced visual odometry with LiDAR sensing. Both tests data consist of  rover proprioceptive (wheel odometry, IMU) and exteroceptive sensing (Stereo LocCam, Stereo PanCam, ToF Cameras and LiDAR) plus DGPS groundtruth data and the ortomosaic and DEM maps generated from aerial images taken by a drone. The data should be suitable for developments in the research areas of global and relative localisation, SLAM or subtopics of those.