Katwijk Beach Planetary Rover Dataset

Quick Facts

This data collection corresponds to a field test performed on the beach of Katwijk in The Netherlands (52°12’N 4°24’E). Two separate test runs were performed, and the data has been divided accordingly. The first, a ~2 km-long traverse, focuses on global localisation by matching correspondences between features detectable in “orbital” (aerial) images and those seen by the rover during its traverse. The second run is a shorter traverse focusing on visual odometry enhanced with LiDAR sensing. The data of both tests consist of rover proprioceptive sensing (wheel odometry, IMU) and exteroceptive sensing (stereo LocCam, stereo PanCam, ToF cameras and LiDAR), plus DGPS ground-truth data and the orthomosaic and DEM maps generated from aerial images taken by a drone. The data should be suitable for research in global and relative localisation, SLAM, and their subtopics, in Global Navigation Satellite System (GNSS)-denied environments, especially with respect to planetary rovers or similar operational scenarios.

Hardware Setup

[Figure: the Heavy Duty Planetary Rover (HDPR)]

The Heavy Duty Planetary Rover (HDPR) used to collect this dataset is a fast roving platform capable of carrying a comprehensive set of payload instruments.

From the proprioceptive standpoint, the rover includes a STIM300 5g IMU and encoder readings from all motor and passive joints of this rocker-bogie platform. As for the exteroceptive sensors, it comprises a fixed-mounted “LocCam” stereo bench (PointGrey Bumblebee2) and an additional “PanCam” stereo bench (two monocular PointGrey Grasshopper2 cameras) on top of a Pan-Tilt Unit (encoder values from the PTU joints are also provided). On top of the Bumblebee2 camera there are an outdoor-rated Time-of-Flight (ToF) camera (MESA SwissRanger 4500) and a 3D scanning LiDAR (Velodyne VLP-16).

The ground-truth data is provided by two Trimble GNSS antenna/receiver pairs. One pair is set up at the base station area and provides the rover receiver, through a long-range WiFi link, with the RTK corrections needed for precise computation of the rover antenna position (mm-level accuracy is guaranteed for over 95% of the dataset's traverses).
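A note for users of the ground truth: the DGPS fixes are geodetic (latitude/longitude), so a common first step is to map them into a local metric frame before comparing them with odometry estimates. Below is a minimal Matlab sketch using a flat-earth approximation, which is adequate over a site of this size; the function name and the choice of anchoring at the first fix are illustrative and not part of the dataset tools.

    % Convert geodetic DGPS fixes [deg] to a local metric tangent frame,
    % using a flat-earth (equirectangular) approximation that is adequate
    % over a ~2 km site. Illustrative sketch only, not a dataset tool.
    function [east, north] = gps_to_local(lat, lon)
        R_earth = 6378137.0;               % WGS-84 equatorial radius [m]
        lat0 = lat(1); lon0 = lon(1);      % anchor the frame at the first fix
        east  = deg2rad(lon - lon0) .* R_earth .* cosd(lat0);
        north = deg2rad(lat - lat0) .* R_earth;
    end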

The table below summarizes the sensor suite used in this test campaign:

[Table: sensor suite summary]

[Figure: senseFly eBee UAV]

Finally, a senseFly eBee UAV is flown before each traverse, capturing georeferenced images with a resolution of approximately 2 cm/pixel. Digital elevation maps are also created, with an accuracy in the elevation direction of approximately 20 cm. This is done to simulate images taken by the HiRISE camera onboard the Mars Reconnaissance Orbiter (MRO).
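Since the orthomosaic and DEM are georeferenced rasters, converting between map pixels and world coordinates is a simple affine mapping. The Matlab sketch below illustrates the idea; x0, y0 and res are placeholder georeferencing parameters that must be taken from the metadata accompanying the map files.

    % Map a (row, col) pixel index of a georeferenced raster to world
    % coordinates. x0, y0: world position of the top-left pixel; res:
    % metres per pixel (~0.02 m/px for the orthomosaic here). Assumes the
    % common convention that x grows with columns and y shrinks with rows.
    function [x, y] = pixel_to_world(row, col, x0, y0, res)
        x = x0 + (col - 1) * res;
        y = y0 - (row - 1) * res;
    end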

Overview of traverses

The dataset is divided into two main runs. The first comprises two long traverses of >1 km each, in which the rover drove through a “boulder field” made up of 212 custom-made artificial boulders, carefully distributed to resemble a typical boulder field seen in MRO images. The second traverse contains data from the return journey to the starting location. These two traverses are split because of sunlight conditions: the first traverse has the sun behind the rover, while the second has the sun facing it. The image below shows the overlay of the two traverses of the first run:

[Figure: overlay of the two traverses of the first run]

For the second run, consisting of a single traverse of ~200 m, artificial boulders were placed at approximately twice the density of the previous traverses, and the rover was driven at a slower speed. This produced images with less motion blur and more targets to track for algorithms such as visual odometry. The image below corresponds to the second run:

[Figure: overlay of the second run traverse]

Downloads Available

To make the data files more manageable, the traverses have been subdivided into chunks corresponding to approximately 5 minutes of experiment run-time.


Traverse 1

Track #            Sensors Data
1                  [link]
2                  [link]
3                  [link]
4                  [link]
5                  [link]
6                  [link]
7                  [link]
8                  [link]

And the acquired drone files and generated maps:

Raw Camera Images                         DEM                         Georeferenced Orthomosaic


Traverse 2

Track #            Sensors Data
1                  [link]
2                  [link]
3                  [link]
4                  [link]
5                  [link]
6                  [link]

And the acquired drone files and generated maps (same as traverse 1):

Raw Camera Images                         DEM                         Georeferenced Orthomosaic


Traverse 3

Track #            Sensors Data
1                  [link]
2                  [link]
3                  [link]
4                  [link]
5                  [link]

And the acquired drone files and generated maps:

Raw Camera Images                         DEM                         Georeferenced Orthomosaic


If needed, the coordinates of the artificial boulders used in both runs can be obtained at the following link.

The following summary table explains the contents of the data log files:

[Table: data log file contents]

Similarly for the UAV flight data:

[Table: UAV flight data files]

Description of sensor frames and transformations

[Figure: rover coordinate frames]

This figure shows all relevant coordinate frames used for each sensor on the rover. All reference frames are explained in the table below:

[Table: reference frame definitions]

For consistency, we adopt the transformation representations described by Furgale et al. in [REF]. Thus, rotations are described by an axis-angle representation, with a unit-vector axis, a, and an angle, θ, that can be used to construct a rotation matrix by

\[
\mathbf{R} = \cos\theta\,\mathbf{I} + (1 - \cos\theta)\,\mathbf{a}\mathbf{a}^{T} - \sin\theta\,\mathbf{a}^{\times}
\]

where \(\mathbf{a}^{T}\) is the transpose of \(\mathbf{a}\) and \(\mathbf{a}^{\times}\) is the skew-symmetric matrix operator:

\[
\mathbf{a}^{\times} = \begin{bmatrix} 0 & -a_{3} & a_{2} \\ a_{3} & 0 & -a_{1} \\ -a_{2} & a_{1} & 0 \end{bmatrix}
\]

Transformations from a reference frame a to a frame b are expressed as a 4×4 matrix formed from the rotation matrix and translation vector from a to b:

\[
\mathbf{T}_{b,a} = \begin{bmatrix} \mathbf{R}_{b,a} & \mathbf{t}_{b,a} \\ \mathbf{0}^{T} & 1 \end{bmatrix}
\]

The ground-truth pose of a given sensor is determined by transforming the GPS measurements and IMU orientation estimates into the sensor frame. For instance, a point, \(\mathbf{p}_{pr}\), expressed in the right PanCam frame would be expressed in the GPS frame by:

\[
\mathbf{p}_{gps} = \mathbf{T}_{gps,pr}\,\mathbf{p}_{pr}
\]

The table below gives the translation and rotation parameters that form the 4×4 transformation matrices between several reference frames:

[Table: translation and rotation parameters between reference frames]

Note that the PanCam sensors are mounted on a Pan-Tilt Unit that periodically rotated the stereo bench during the traverse to cover a wider field of view. The transformation that describes this sensor therefore depends on the Pan-Tilt Unit's measurements at each image's time stamp, and is computed as:

\[
\mathbf{T}_{ptu} = \begin{bmatrix} c_{\theta_p} & -s_{\theta_p} & 0 & 0 \\ s_{\theta_p} & c_{\theta_p} & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} c_{\theta_t} & 0 & s_{\theta_t} & 0 \\ 0 & 1 & 0 & 0 \\ -s_{\theta_t} & 0 & c_{\theta_t} & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}
\]

where \(s_{\theta}\) and \(c_{\theta}\) are shorthand for \(\sin\theta\) and \(\cos\theta\) respectively, and \(\theta_{p}\) and \(\theta_{t}\) are the measured pan and tilt angles respectively (here the pan rotation is taken about the z-axis and the tilt rotation about the y-axis).
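To make these conventions concrete, the following Matlab sketch builds a rotation matrix from the axis-angle parameters defined above and assembles the corresponding 4×4 transformation. The function names are illustrative and are not those of the provided scripts.

    % Rotation matrix from axis-angle, matching
    % R = cos(th)*I + (1 - cos(th))*a*a' - sin(th)*a^x
    function R = axis_angle_to_rot(a, theta)
        a  = a(:) / norm(a);                        % unit column vector
        ax = [    0  -a(3)   a(2);
               a(3)      0  -a(1);
              -a(2)   a(1)      0];                 % skew-symmetric operator a^x
        R  = cos(theta)*eye(3) + (1 - cos(theta))*(a*a') - sin(theta)*ax;
    end

    % Homogeneous 4x4 transformation from rotation R and translation t
    function T = make_transform(R, t)
        T = [R, t(:); 0 0 0 1];
    end

Frames are then chained by matrix multiplication; for instance, a point given in the right PanCam frame is expressed in the GPS frame as p_gps = T_gps_pr * [p_pr; 1].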

Helpful tools

The following Matlab files are provided to assist with processing the data.

Camera calibration data:
Panoramic camera
Localization camera
Time-of-Flight camera

Scripts:
Frame Transformations
Image2Point (illustrated by the sketch below)
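For orientation, the sketch below shows the kind of operation an Image2Point-style script performs: back-projecting a pixel to a 3D point through a pinhole model, given a depth value (e.g. from stereo disparity or the ToF camera). This is a generic model, not the provided script itself; fx, fy, cx and cy are placeholders to be replaced by the calibration data above.

    % Back-project pixel (u, v) with depth z [m] through a pinhole model.
    % fx, fy, cx, cy come from the camera calibration files (placeholders).
    function p = pixel_to_point(u, v, z, fx, fy, cx, cy)
        x = (u - cx) * z / fx;
        y = (v - cy) * z / fy;
        p = [x; y; z];                  % 3D point in the camera frame [m]
    end

    % For a rectified stereo pair, depth follows from disparity d [px]:
    %   z = fx * B / d, with B the stereo baseline [m].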

Credits

The rover development and setup for this dataset field test campaign was led by Robert A. Hewitt and Evangelos Boukas, both PhD candidates in ESA's Networking Partnering Initiative (NPI) programme, who worked at the Robotics Section (TEC-MMA) of ESTEC throughout 2015.

[Photo: Rob Hewitt]

Rob Hewitt is a Ph.D. candidate in the Mining Systems Laboratory at Queen’s University, Canada. His PhD research focuses on using the intensity measurements of active sensors such as LiDAR and Time-of-Flight cameras to improve robot localization. His expertise includes state estimation, machine learning, computer vision, and path planning.

[Photo: Evangelos Boukas]

Evangelos Boukas is an Assistant Professor in Robotics at Aalborg University Copenhagen (AAU-CPH), affiliated with the Robotics, Vision & Machine Intelligence (RVMI) laboratory in the Department of Mechanical and Manufacturing Engineering. His expertise includes robot vision, mobile robotics, state estimation, machine learning and autonomous robotics. He was awarded his PhD by the Democritus University of Thrace, Greece, in 2016.

If you use the data provided by this website in your work, please use the following citation:

R. A. Hewitt, E. Boukas, et al., “The Katwijk beach planetary rover dataset,” International Journal of Robotics Research, 2016. Manuscript #IJR-10-1177, accepted on July 4, 2016.

Acknowledgements

This work would not have been possible without the help of our colleagues at the TEC-MMA Section of ESTEC. In particular, the authors would like to thank Marco Pagnamenta, Robin Nelen, Martin Azkarate, Martin Zwick, Jakub Tomasek, Simon Wyss and Jorge Chamorro for daring to suffer along with this project, and especially Gianfranco Visentin for his invaluable guidance.