Spatiotemporal Attention Enhances Lidar-Based Robot Navigation in Dynamic Environments
Publication Authors
J. de Heuvel;
X. Zeng;
W. Shi;
T. Sethuraman;
M. Bennewitz
Published in
IEEE Robotics and Automation Letters (RA-L), presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Year of Publication
2024
Abstract
Foresighted robot navigation in dynamic indoor environments with cost-efficient hardware necessitates a lightweight yet dependable controller. Inferring the scene dynamics from sensor readings without explicit object tracking is therefore a pivotal aspect of foresighted navigation among pedestrians. In this paper, we introduce a spatiotemporal attention pipeline for enhanced navigation based on 2D lidar sensor readings. The pipeline is complemented by a novel lidar-state representation that emphasizes dynamic obstacles over static ones. The attention mechanism then enables selective scene perception across both space and time, improving overall navigation performance in dynamic scenarios. We thoroughly evaluated the approach in different scenarios and simulators and found good generalization to unseen environments. The results demonstrate outstanding performance compared to state-of-the-art methods, enabling seamless deployment of the learned controller on a real robot.
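
The abstract does not spell out the network details, but the general pattern of attention over a short history of lidar scans (temporal) combined with attention over angular regions of the current scan (spatial) can be illustrated with a brief PyTorch sketch. All choices below, including the beam count, history length, sector split, and layer sizes, are assumptions made for illustration; this is not the architecture or lidar-state representation described in the paper.

# Minimal sketch of spatiotemporal attention over stacked 2D lidar scans.
# Layer sizes, beam count, and the sector split are illustrative assumptions.
import torch
import torch.nn as nn


class SpatioTemporalLidarAttention(nn.Module):
    def __init__(self, num_beams: int = 360, num_sectors: int = 8, embed_dim: int = 64):
        super().__init__()
        # Embed each full scan (one vector of range readings) into a latent token.
        self.scan_embed = nn.Linear(num_beams, embed_dim)
        # Temporal attention across the scan history captures scene dynamics.
        self.temporal_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        # Spatial attention across angular sectors of the newest scan.
        self.num_sectors = num_sectors
        self.sector_embed = nn.Linear(num_beams // num_sectors, embed_dim)
        self.spatial_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.fuse = nn.Linear(2 * embed_dim, embed_dim)

    def forward(self, scans: torch.Tensor) -> torch.Tensor:
        # scans: (batch, history, num_beams) range readings, newest scan last.
        b, t, n = scans.shape
        tokens = self.scan_embed(scans)                        # (b, t, d)
        temporal, _ = self.temporal_attn(tokens, tokens, tokens)
        temporal_feat = temporal[:, -1]                        # feature attending over time

        # Split the newest scan into angular sectors and attend across them.
        sectors = scans[:, -1].reshape(b, self.num_sectors, n // self.num_sectors)
        sector_tokens = self.sector_embed(sectors)             # (b, sectors, d)
        spatial, _ = self.spatial_attn(sector_tokens, sector_tokens, sector_tokens)
        spatial_feat = spatial.mean(dim=1)                     # feature attending over space

        return self.fuse(torch.cat([temporal_feat, spatial_feat], dim=-1))


# Usage: produce a compact state feature for a downstream navigation policy.
scans = torch.rand(2, 4, 360)   # batch of 2, history of 4 scans, 360 beams each
features = SpatioTemporalLidarAttention()(scans)
print(features.shape)           # torch.Size([2, 64])

The resulting feature vector would typically be concatenated with goal and velocity information and fed to a learned controller, e.g., a reinforcement-learning policy; how the paper combines these inputs is not specified here.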