GPU-Accelerated Next-Best-View Exploration of Articulated Scenes

Publication Authors S. Oßwald; M. Bennewitz
Published in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Year of Publication 2018
Abstract

Next-best-view algorithms are commonly used for covering known scenes, for example in search, maintenance, and mapping tasks. In this paper, we consider the problem of planning a strategy for covering articulated environments where the robot also has to manipulate objects to inspect obstructed areas. This problem is particularly challenging due to the many degrees of freedom resulting from the articulation. We propose to exploit graphics processing units present in many embedded devices to parallelize the computations of a greedy next-best-view approach. We implemented algorithms for costmap computation, path planning, as well as simulation and evaluation of viewpoint candidates in OpenGL for Embedded Systems and benchmarked the implementations on multiple device classes ranging from smartphones to multi-GPU servers. We introduce a heuristic for estimating a utility map from images rendered with strategically placed spherical cameras and show in simulation experiments that robots can successfully explore complex articulated scenes with our system.
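The greedy next-best-view strategy mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation (which evaluates viewpoint candidates on the GPU via OpenGL ES renderings); it is a simplified CPU version under assumed data structures: each viewpoint candidate is given a precomputed set of visible scene cells and a travel cost, and the planner repeatedly picks the candidate with the best information gain per unit cost. All names (`greedy_nbv`, `candidates`, `min_gain`) are hypothetical.

```python
def greedy_nbv(candidates, total_cells, min_gain=1):
    """Greedy next-best-view selection (illustrative sketch).

    candidates  : dict mapping a view id to (visible_cells: set, cost: float),
                  e.g. visibility precomputed by rendering from each candidate
    total_cells : number of scene cells that should be covered
    min_gain    : minimum number of newly seen cells to accept a view
    """
    covered = set()   # cells observed so far
    plan = []         # sequence of selected viewpoints
    while len(covered) < total_cells:
        best_view, best_score = None, 0.0
        for view, (cells, cost) in candidates.items():
            gain = len(cells - covered)          # newly visible cells
            score = gain / cost                  # utility per unit travel cost
            if gain >= min_gain and score > best_score:
                best_view, best_score = view, score
        if best_view is None:
            break  # no remaining candidate adds information
        plan.append(best_view)
        covered |= candidates[best_view][0]
    return plan, covered

# Toy example: three candidate viewpoints over a 6-cell scene.
candidates = {
    "A": ({1, 2, 3}, 1.0),
    "B": ({3, 4}, 1.0),
    "C": ({4, 5, 6}, 2.0),
}
plan, covered = greedy_nbv(candidates, total_cells=6)
```

In the articulated setting of the paper, the candidate set would also include manipulation actions that open occluded areas, and the visibility sets would come from rendered spherical-camera images rather than precomputed lookups.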

Type of Publication Conference Proceeding