Coupled Active Perception for Mobile Manipulation in Unknown Environments


Degree: Master
Start date: 16 January 2024

Background:

With service robots being increasingly deployed in unstructured environments such as households, users typically only specify "get me the coffee cup" or "the TV remote", while the object's exact location and pose are unknown. For example, there may be a table and a cupboard where the coffee cup could be located. The table is cluttered and needs to be viewed from a different direction to check whether it contains the cup; the cupboard is not cluttered. Based on the relative distance between them, the mobile manipulator (MoMa) needs to decide whether to map the table or move to the cupboard. Furthermore, if the cup is found on a cupboard shelf, the MoMa has to decide whether the current perception is sufficient for reliable grasping.
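
To make this trade-off concrete, the Python snippet below sketches one possible way to score candidate locations by expected search time. It is an illustration only, not part of the proposed method; the Candidate fields, the numbers, and the expected_cost heuristic are all assumptions made for this example.

    # Illustrative only: score each candidate location by the expected time
    # spent per unit probability of finding the object there. The fields,
    # numbers, and the heuristic itself are assumptions for this example.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        travel_time: float   # seconds to drive to a viewpoint of this location
        viewing_time: float  # seconds to scan it (larger when cluttered)
        p_contains: float    # prior belief that the target object is here

    def expected_cost(c: Candidate) -> float:
        """Time invested per unit probability of finding the object."""
        return (c.travel_time + c.viewing_time) / c.p_contains

    table = Candidate("table", travel_time=3.0, viewing_time=20.0, p_contains=0.5)
    cupboard = Candidate("cupboard", travel_time=12.0, viewing_time=5.0, p_contains=0.5)

    best = min((table, cupboard), key=expected_cost)
    print(f"map the {best.name} first")  # here the uncluttered cupboard wins,
                                         # despite the longer drive
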

Research Gap:

Current approaches assume the destination is known and require separate phases: navigation to move to the destination, mapping to localize the cup (target pose estimation), and manipulation to pick up the target object [1,2]. Optimal base placement considering navigation and manipulation costs has been investigated for time-efficient mobile manipulation [3]. Recent approaches have also demonstrated the effectiveness of reactive base control for picking objects on the move [4]. However, in this case, the approximate pose of the target object is assumed to be known beforehand.

Objective:

We propose a new approach in which the actively controllable cameras on the MoMa perceive the objects and make an initial guess of the environment. Using this information, the MoMa decides whether to perform an exhaustive search at the current location, move to the next location for initial mapping, or plan the steps for grasping. We hypothesize that such coupled active perception reduces mapping time and also helps the system grasp reliably.
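
As a rough sketch of the intended behaviour (not the proposed implementation), the three-way decision could look as follows. The Guess fields and the thresholds are hypothetical placeholders; only the choice between exhaustive search, moving on, and grasp planning comes from the objective above.

    # A minimal sketch of the coupled decision described above. The Guess
    # fields and thresholds are hypothetical placeholders; only the three-way
    # branching is taken from the proposal.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Action(Enum):
        EXHAUSTIVE_SEARCH = auto()  # keep viewing the current location
        MOVE_TO_NEXT = auto()       # drive to the next candidate for mapping
        PLAN_GRASP = auto()         # perception suffices; plan the grasp

    @dataclass
    class Guess:
        target_confidence: float  # belief that the target was detected here
        pose_quality: float       # quality of the current pose estimate

    def decide(guess: Guess) -> Action:
        if guess.target_confidence > 0.9 and guess.pose_quality > 0.8:
            return Action.PLAN_GRASP         # reliable grasping is likely
        if guess.target_confidence > 0.3:
            return Action.EXHAUSTIVE_SEARCH  # promising, but needs more views
        return Action.MOVE_TO_NEXT           # unlikely here; map elsewhere

    print(decide(Guess(target_confidence=0.95, pose_quality=0.9)))
    # -> Action.PLAN_GRASP
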

Scope of the thesis:

  • Implement a coupled active perception method for the mobile manipulator system
  • Optimize the MoMa for grasping success and task completion time
  • Quantitatively evaluate the coupled active perception approach against baseline approaches, in simulation and on real robotic systems

Available Resources:

  • Multiple mobile manipulator platforms with a ROS framework for navigation and manipulation control, and Gazebo simulation for initial testing
  • Baseline approaches for mapping and picking
  • Active perception approaches for semantic mapping of objects [5,6]

Requirements:

  • Enrolled in a computer science or related MSc program in or around Bonn/Cologne
  • Familiarity with mobile manipulation and 3D perception
  • Excellent academic record and a strong background in probability theory, linear algebra, and optimization
  • Programming experience with C++, Python, and ROS (Robot Operating System)
  • Experience with reinforcement learning and computer vision is a plus
  • Enthusiasm for real-world robot deployment and scientific publishing of results
  • Ability to work independently as well as collaborate in a team