A Combined RGB and Depth Descriptor for SLAM with Humanoids
Publication Authors
R. Sheikh;
S. Oßwald;
M. Bennewitz
Published in
IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Year of Publication
2018
Abstract
In this paper, we present a visual simultaneous localization and mapping (SLAM) system for humanoid robots. We introduce a new binary descriptor called DLab that exploits the combined information of color, depth, and intensity to achieve robustness with respect to uniqueness, reproducibility, and stability. We use DLab within ORB-SLAM, where we replace the place recognition module with a modification of FAB-MAP that works with newly built codebooks based on our binary descriptor. In experiments carried out in simulation and with a real Nao humanoid equipped with an RGB-D camera, we show that DLab achieves superior performance in comparison to other descriptors. The application to feature tracking and place recognition reveals that the new descriptor reliably tracks features even in sequences with severely blurred images and yields a higher percentage of correctly identified similar images. As a result, our new visual SLAM system has a lower absolute trajectory error than ORB-SLAM.
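The paper defines the actual DLab construction; as a rough, hypothetical illustration of the general idea of a binary descriptor that fuses intensity and depth comparisons (in the spirit of BRIEF/ORB), the sketch below compares random pixel pairs in a patch around a keypoint on both channels and packs the results into a bit string. The function name binary_rgbd_descriptor, the sampling pattern, the bit layout, and all parameters are assumptions for illustration and do not reproduce the DLab design from the paper.

```python
import numpy as np

def binary_rgbd_descriptor(intensity, depth, keypoint,
                           num_bits=256, patch_radius=15, seed=0):
    """Toy BRIEF-style binary descriptor combining intensity and depth comparisons.

    intensity, depth: 2D float arrays of the same shape (registered RGB-D frame).
    keypoint: (row, col) pixel coordinates, assumed at least patch_radius
              pixels away from the image border.
    Returns a packed bit array of num_bits bits
    (first half from intensity comparisons, second half from depth comparisons).
    Illustration only, not the DLab construction from the paper.
    """
    rng = np.random.default_rng(seed)
    r, c = keypoint
    # Random pairs of offsets inside the patch; the fixed seed keeps the
    # sampling pattern identical for every keypoint.
    offsets = rng.integers(-patch_radius, patch_radius + 1, size=(num_bits, 2, 2))
    bits = np.zeros(num_bits, dtype=np.uint8)
    for i, ((dr1, dc1), (dr2, dc2)) in enumerate(offsets):
        # First half of the bits compares intensities, second half compares depths.
        channel = intensity if i < num_bits // 2 else depth
        bits[i] = channel[r + dr1, c + dc1] < channel[r + dr2, c + dc2]
    return np.packbits(bits)

def hamming_distance(d1, d2):
    # Binary descriptors are matched with the Hamming distance,
    # i.e. the number of differing bits (XOR followed by a bit count).
    return int(np.unpackbits(d1 ^ d2).sum())
```

Such binary descriptors are typically matched with the Hamming distance, which counts differing bits and can be evaluated very efficiently, which is one reason binary descriptors are attractive for real-time SLAM on resource-constrained robots.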