
In this challenge, participants need to estimate the trajectory of a robot given an input sequence of frames. The DISCOMAN dataset used for this contest presents novel challenges for SLAM and visual odometry methods: it contains many low-texture surfaces (e.g. white walls) and is composed of realistic robot-like motion patterns with fast rotations. To allow comparing the effect of different modalities on trajectory estimation, we generate RGB-D and IMU data.

We ask participants to indicate which modalities (RGB, depth, IMU) were used in their submission.

The data is split into train, validation, and test parts. The train and validation parts contain ground-truth positions and orientations of the camera; the test part is used for evaluating the methods.

Evaluation protocol

All contest entries will be evaluated using ATE (absolute trajectory error), which directly measures the difference between corresponding points of the true and the estimated trajectories. First, the two trajectories are rigidly aligned using singular value decomposition. Then, we compute the difference between each pair of poses and report the mean, median, and standard deviation of these differences. The metric is computed only for frames whose indices are multiples of 5, which are the frames that have corresponding images.
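The protocol above can be sketched as follows. This is a minimal illustration, not the official evaluation script: it assumes the trajectories are given as NumPy arrays of corresponding 3D points and uses the standard SVD-based rigid alignment (Kabsch/Umeyama) before computing per-pose position errors.

```python
import numpy as np

def align_trajectories(est, gt):
    """Rigidly align estimated points to ground truth via SVD (Kabsch).

    est, gt: (N, 3) arrays of corresponding trajectory points.
    Returns R, t such that (R @ est.T).T + t approximates gt.
    """
    mu_est = est.mean(axis=0)
    mu_gt = gt.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (est - mu_est).T @ (gt - mu_gt)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    S = np.eye(3)
    S[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ S @ U.T
    t = mu_gt - R @ mu_est
    return R, t

def ate(est, gt):
    """Absolute trajectory error statistics after rigid alignment."""
    R, t = align_trajectories(est, gt)
    aligned = (R @ est.T).T + t
    errors = np.linalg.norm(aligned - gt, axis=1)
    return errors.mean(), np.median(errors), errors.std()
```

If the estimated trajectory is a rigid transform of the ground truth, the ATE after alignment is zero up to floating-point precision.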

Result format

The evaluation script expects the following result format. The results for the test trajectories should be packed into a zip archive containing files XXXXXX.csv, where XXXXXX is the id of a sequence (e.g. 000200.csv). Each csv file contains the following fields:

  • id - the frame index (every 5th frame)

  • position.x, position.y, position.z - coordinates of the camera in any right-handed coordinate system with the Y-axis up

  • quaternion.w, quaternion.x, quaternion.y, quaternion.z - orientation of the camera as a quaternion
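A helper for writing one sequence's result in this format might look like the sketch below. The column names come from the field list above; whether the csv includes a header row is an assumption here, so check against the official scoring script once it is released.

```python
import csv

# Column order as listed in the result-format description.
FIELDS = ["id",
          "position.x", "position.y", "position.z",
          "quaternion.w", "quaternion.x", "quaternion.y", "quaternion.z"]

def write_result_csv(path, poses):
    """Write one test-sequence result file, e.g. 000200.csv.

    poses: iterable of (frame_id, (px, py, pz), (qw, qx, qy, qz)) tuples,
    where frame_id is a multiple of 5 (the frames that have images).
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)  # header row: assumed, not confirmed by the spec
        for frame_id, (px, py, pz), (qw, qx, qy, qz) in poses:
            writer.writerow([frame_id, px, py, pz, qw, qx, qy, qz])
```

The per-sequence files would then be packed into a single zip archive for submission.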

The dataset will be available after July 15, 2019 [LINK]

The toolkit will be available after June 15, 2019 [LINK]

Scoring will be available after June 15, 2019 [LINK]