GPU-Accelerated Next-Best-View Exploration of Articulated Scenes




Authors:

S. Oßwald, M. Bennewitz

Type:

Conference Proceeding

Published in:

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)

Year:

2018

Links:

PDF File, Video, Code


Abstract:

Next-best-view algorithms are commonly used for covering known scenes, for example in search, maintenance, and mapping tasks. In this paper, we consider the problem of planning a strategy for covering articulated environments where the robot also has to manipulate objects to inspect obstructed areas. This problem is particularly challenging due to the many degrees of freedom resulting from the articulation. We propose to exploit graphics processing units present in many embedded devices to parallelize the computations of a greedy next-best-view approach. We implemented algorithms for costmap computation, path planning, as well as simulation and evaluation of viewpoint candidates in OpenGL for Embedded Systems and benchmarked the implementations on multiple device classes ranging from smartphones to multi-GPU servers. We introduce a heuristic for estimating a utility map from images rendered with strategically placed spherical cameras and show in simulation experiments that robots can successfully explore complex articulated scenes with our system.
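
The sketch below illustrates the greedy next-best-view idea mentioned in the abstract: each candidate viewpoint is scored by its expected coverage gain minus the cost of reaching it, and the best-scoring candidate is selected at every step. All names and the scoring formula are illustrative assumptions for exposition; the paper's actual implementation evaluates candidates in parallel on the GPU via OpenGL for Embedded Systems.

```python
def greedy_next_best_view(candidates, covered, visible_patches, path_cost, weight=1.0):
    """Pick the viewpoint with the highest utility-minus-cost score.

    candidates      -- iterable of candidate viewpoint poses (hypothetical)
    covered         -- set of surface patches already observed (hypothetical)
    visible_patches -- callable: viewpoint -> set of patches it would observe (assumed)
    path_cost       -- callable: viewpoint -> cost of reaching that viewpoint (assumed)
    weight          -- trade-off factor between gain and travel cost (assumed)
    """
    best_view, best_score = None, float("-inf")
    for view in candidates:
        # Utility: number of patches this viewpoint would newly cover.
        gain = len(visible_patches(view) - covered)
        # Simple linear trade-off between expected gain and travel cost.
        score = gain - weight * path_cost(view)
        if score > best_score:
            best_view, best_score = view, score
    return best_view
```

In the GPU-parallel setting described in the paper, the per-candidate evaluation inside the loop is the part that is offloaded to shaders, so that many viewpoints can be simulated and scored simultaneously.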