X-maps: Direct Depth Lookup for Event-based Structured Light Systems

1 Fraunhofer Heinrich Hertz Institute, HHI
2 Humboldt University of Berlin
The paper has been accepted at the CVPR 2023 Workshop on Event-based Vision 🥳!

Live demonstration of the depth estimation

This method estimates depth with a stereo system consisting of a small laser projector and an event camera. From the timestamps of incoming events, we determine the projector scanline currently being drawn, which we use to compute the disparity to the recorded event in the camera. By applying spatio-temporal X-maps, we achieve real-time performance on a laptop CPU. Since depth is estimated from the timestamps alone, the projected content can be chosen freely, as long as it is bright enough relative to the ambient light to produce events.
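The pipeline above (event timestamp → projector scanline → disparity → depth) can be sketched as follows. All constants here are hypothetical placeholders, not values from the paper, and the linear scan model is an idealization:

```python
# Minimal sketch of timestamp-based depth estimation for a rectified
# projector-camera pair. All parameters are illustrative assumptions.
FRAME_PERIOD_US = 16_667   # assumed 60 Hz projector frame period (microseconds)
NUM_SCANLINES = 720        # assumed number of scanlines per projector frame
FOCAL_PX = 1000.0          # assumed rectified focal length in pixels
BASELINE_M = 0.05          # assumed projector-camera baseline in meters


def scanline_from_timestamp(t_us: int, frame_start_us: int) -> int:
    """Map an event timestamp to the projector scanline being drawn,
    assuming an ideal linear vertical scan within each frame."""
    phase = (t_us - frame_start_us) % FRAME_PERIOD_US
    return int(phase / FRAME_PERIOD_US * NUM_SCANLINES)


def depth_from_disparity(x_cam: float, x_proj: float):
    """Standard rectified-stereo depth: Z = f * b / disparity."""
    disparity = x_cam - x_proj
    if disparity <= 0:
        return None  # invalid correspondence, no depth
    return FOCAL_PX * BASELINE_M / disparity
```

With the assumed focal length and baseline, a 10-pixel disparity yields a depth of 5 m; real systems would calibrate these values per setup.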

Abstract

We present a new approach to direct depth estimation for Spatial Augmented Reality (SAR) applications using event cameras. These dynamic vision sensors are a great fit to be paired with laser projectors for depth estimation in a structured light approach. Our key contribution is the conversion of the projector time map into a rectified X-map, capturing x-axis correspondences for incoming events and enabling direct disparity lookup without any additional search. Compared to previous implementations, this significantly simplifies depth estimation and makes it more efficient, while maintaining accuracy similar to the time-map-based process. Moreover, we compensate for the non-linear temporal behavior of cheap laser projectors with a simple time map calibration, resulting in improved performance and increased depth estimation accuracy. Since depth estimation requires only two lookups, it runs almost instantly (less than 3 ms per frame with a Python implementation) on incoming events. This allows for real-time interactivity and responsiveness, which makes our approach especially suitable for SAR experiences where low latency, high frame rates and direct feedback are crucial. We present valuable insights gained from transforming data into X-maps and evaluate our depth-from-disparity estimation against state-of-the-art time-map-based results.
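To illustrate the core idea of the abstract, the sketch below inverts a per-pixel projector time map into an X-map indexed by (row, time bin), so that an event's row and timestamp resolve the projector x-coordinate with a single array access. The array sizes, the normalized time map, and the binning scheme are all illustrative assumptions, not the paper's actual data layout:

```python
import numpy as np

# Illustrative sizes (not from the paper): projector resolution H x W,
# and T discrete time bins per frame.
H, W, T = 480, 640, 1024


def build_x_map(time_map: np.ndarray, num_bins: int = T) -> np.ndarray:
    """Invert a time map (time_map[y, x] = normalized illumination time of
    projector pixel (x, y), in [0, 1]) into an X-map: for each (row, time
    bin), store the projector x-coordinate lit at that time (-1 = none)."""
    x_map = np.full((time_map.shape[0], num_bins), -1, dtype=np.int32)
    t_bins = np.clip((time_map * (num_bins - 1)).astype(np.int32), 0, num_bins - 1)
    for y in range(time_map.shape[0]):
        # Fancy-indexing assignment: each time bin receives the x it maps to.
        x_map[y, t_bins[y]] = np.arange(time_map.shape[1])
    return x_map


# Depth for an event (x_ev, y_ev, t_ev) then reduces to two lookups plus a
# division: x_proj = x_map[y_ev, t_bin(t_ev)]; depth = f * b / (x_ev - x_proj).
```

Precomputing the X-map once moves all searching out of the event loop, which is what makes the per-event cost constant.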

BibTeX

@inproceedings{morgenstern2023cvpr,
  title        = {X-maps: Direct Depth Lookup for Event-based Structured Light Systems},
  author       = {Morgenstern, Wieland and Gard, Niklas and Baumann, Simon and Hilsmann, Anna and Eisert, Peter},
  year         = 2023,
  booktitle    = {2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)},
  pages        = {4007--4015},
  doi          = {10.1109/CVPRW59228.2023.00418},
  keywords     = {Computer vision;Codes;Conferences;Lasers;Spatial augmented reality;Estimation;Vision sensors}
}