Simultaneous Localization and Mapping

SLAM with Plenoptic Cameras

Research and industry continue to develop autonomous cars, self-navigating unmanned aerial vehicles, and intelligent mobile robots. These tasks require systems that reliably map the 3D surroundings and are able to self-localize within the created maps. Such systems and methods are summarized under the term simultaneous localization and mapping (SLAM).
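The localization half of the problem can be illustrated with a toy sketch, independent of any concrete SLAM method: the robot chains relative motion estimates (odometry) into a global 2D pose. Real systems additionally build a map and correct the accumulated drift against it. All names below are illustrative.

```python
import math

def compose(pose, motion):
    """Chain a global 2-D pose (x, y, heading) with a relative motion
    (dx, dy, dheading) expressed in the robot's local frame."""
    x, y, th = pose
    dx, dy, dth = motion
    return (x + math.cos(th) * dx - math.sin(th) * dy,
            y + math.sin(th) * dx + math.cos(th) * dy,
            th + dth)

# Localization by dead reckoning: accumulate relative motion estimates.
pose = (0.0, 0.0, 0.0)
for motion in 4 * [(1.0, 0.0, math.pi / 2)]:  # drive a 1 m square
    pose = compose(pose, motion)

# The robot ends up back near the start (up to floating-point error);
# with noisy real-world odometry the pose would drift, which is why
# SLAM corrects the trajectory against the map it builds.
print(pose)
```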

Meanwhile, over the last decade plenoptic cameras (or light field cameras) have become available as commercial products. This technology opens up a large variety of possible applications, particularly where classical stereo camera systems require too much space.

To the best of our knowledge, we developed the first SLAM algorithm that performs tracking and mapping for plenoptic cameras directly on the recorded micro images. The results were published at ECCV 2018 [1].
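The core idea of such a direct method can be illustrated in one dimension: instead of matching extracted features, the pose (here a single shift t) is estimated by minimizing the photometric error on raw intensities. The sketch below shows generic direct alignment via Gauss-Newton only; it is not the published plenoptic algorithm, which operates on the 2-D micro images and estimates a full 6-DoF pose.

```python
import numpy as np

# Direct alignment in 1-D: find the shift t that minimizes the
# photometric error  sum_i (I_cur(x_i + t) - I_ref(x_i))^2
# with Gauss-Newton, using raw intensities instead of features.
x = np.linspace(0.0, 10.0, 200)
i_ref = np.exp(-(x - 4.0) ** 2)   # reference intensities
i_cur = np.exp(-(x - 4.7) ** 2)   # same signal, shifted by 0.7

t = 0.0
for _ in range(20):
    warped = np.interp(x + t, x, i_cur)   # sample I_cur at x + t
    jac = np.gradient(warped, x)          # d(residual)/dt = image gradient
    res = warped - i_ref                  # photometric residuals
    t -= (jac @ res) / (jac @ jac)        # Gauss-Newton update
# t now approximates the true shift of 0.7
```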

Visual Odometry Dataset

We present a visual odometry dataset for the evaluation and comparison of plenoptic, monocular, and stereo camera based visual odometry and SLAM algorithms. The dataset contains 11 sequences recorded by a hand-held platform consisting of a plenoptic camera and a pair of stereo cameras. The sequences comprise different indoor and outdoor scenes, with trajectory lengths ranging from 25 meters up to several hundred meters. The recorded sequences include moving objects as well as changing lighting conditions.
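When working with such a dataset, a sequence's ground-truth trajectory is typically stored as a list of timestamped poses. The sketch below is a hypothetical loader: the CSV layout (one `timestamp,x,y,z` row per pose) is an assumption for illustration, not the dataset's actual file format.

```python
import csv

def load_trajectory(csv_path):
    """Read a trajectory file with one 'timestamp,x,y,z' row per pose.
    (Hypothetical layout; check the dataset documentation for the real one.)"""
    rows = []
    with open(csv_path, newline="") as f:
        for ts, x, y, z in csv.reader(f):
            rows.append((float(ts), float(x), float(y), float(z)))
    return rows

def trajectory_length(rows):
    """Accumulated Euclidean distance along the trajectory, in meters."""
    total = 0.0
    for (_, x0, y0, z0), (_, x1, y1, z1) in zip(rows, rows[1:]):
        total += ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
    return total
```

A metric like this is what the stated trajectory lengths (25 meters up to several hundred meters) refer to.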

Future Work

In the project Komo3D, the data of a plenoptic camera is fused with that of a Time-of-Flight depth camera. In this way, the complementary qualities of both sensors are combined. In addition, objects in the 3D environment are recognized using deep learning techniques. With this additional information, a semantic interpretation of the environment becomes possible.
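One generic way to combine two depth estimates of the same scene point is inverse-variance weighting, where each sensor contributes in proportion to its reliability. The sketch below illustrates that idea only; it is an assumption for illustration, not the fusion scheme actually used in Komo3D.

```python
# Inverse-variance fusion of two depth measurements of the same point.
# Illustrative sketch; not the Komo3D fusion scheme.

def fuse_depth(d_a, var_a, d_b, var_b):
    """Fuse two depth measurements (m) with their variances (m^2)."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * d_a + w_b * d_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)  # fused variance never exceeds either input

# Example: a noisy plenoptic depth and a more precise ToF depth.
depth, var = fuse_depth(2.0, 0.04, 2.2, 0.01)  # result lies closer to the ToF value
```

The fused estimate automatically leans toward whichever sensor is more reliable at a given range, which is how the different qualities of the two sensors can complement each other.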


[1] N. Zeller, F. Quint, U. Stilla: Scale-Awareness of Light Field Camera based Visual Odometry. Computer Vision - ECCV 2018, Springer Lecture Notes in Computer Science LNCS 11212, pp. 732-747, 2018.

[2] N. Zeller, F. Quint, U. Stilla: A Synchronized Stereo and Plenoptic Visual Odometry Dataset. arXiv:1807.09372, 2018.

[3] M. Ziebarth, N. Zeller, M. Heizmann, F. Quint: Modeling the unified measurement uncertainty of deflectometric and plenoptic 3-D sensors. J. Sens. Sens. Syst., 7, pp. 517-533, 2018.

[4] N. Zeller, F. Quint, U. Stilla: From the Calibration of a Light-Field Camera to Direct Plenoptic Odometry. IEEE Journal of Selected Topics in Signal Processing, Vol. 11, No. 7, pp. 1004-1019, 2017.

Previous publications on the topic can be found on the website of our Visual Odometry Dataset.