PUGS: Perceptual Uncertainty for Grasp Selection

¹University of Michigan

²University of Washington Applied Physics Lab

We present PUGS, an occupancy uncertainty estimation method used for underwater grasp selection.


Abstract

When navigating and interacting in challenging environments where sensory information is imperfect and incomplete, robots must make decisions that account for these shortcomings. We propose a novel method for quantifying and representing such perceptual uncertainty in 3D reconstruction through occupancy uncertainty estimation, and we develop a framework to incorporate it into grasp selection for autonomous manipulation in underwater environments. Instead of treating each measurement equally when deciding where to grasp, our framework propagates the uncertainty inherent in the multi-view reconstruction process into grasp selection. We evaluate our method on both simulated and real-world data, showing that by accounting for uncertainty, grasp selection becomes robust to partial and noisy measurements.

Overview

We focus on modeling how the uncertainty inherent in multi-view stereo can be leveraged to quantify uncertainty in 3D reconstruction, specifically in representing occupancy in 3D space. These uncertainties over occupied regions can then be used to improve existing grasp selection methods and guide them toward more reliable and robust grasps. We propose the construction of a fused occupancy field (FOF) informed by the uncertainty in measurements and pose estimates. We then develop a novel method to quantify the predictive uncertainty associated with occupancy in 3D space using probabilistic regression, along with a fusion mechanism that combines measurement and predictive uncertainty to model occupancy uncertainty. Finally, we provide an experimental evaluation in both simulated and real-world underwater environments to validate the proposed methods.
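To make the pipeline concrete, the sketch below illustrates the general idea of uncertainty-aware grasp selection: combine per-voxel measurement and predictive uncertainty into a single field, then penalize grasp candidates whose contact regions cover highly uncertain voxels. This is a minimal illustration, not the paper's implementation; the function names (fuse_occupancy_uncertainty, score_grasp, select_grasp), the convex-combination fusion, the exponential penalty with weight alpha, and all data shapes are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): fuse per-voxel measurement
# and predictive occupancy uncertainty, then rank grasp candidates so that
# grasps touching high-uncertainty regions are penalized. All shapes, weights,
# and the scoring rule are illustrative assumptions.

def fuse_occupancy_uncertainty(meas_var, pred_var, w_meas=0.5):
    """Combine measurement and predictive variance into one per-voxel
    uncertainty field (here: a simple convex combination)."""
    return w_meas * meas_var + (1.0 - w_meas) * pred_var

def score_grasp(contact_voxels, base_quality, uncertainty, alpha=1.0):
    """Down-weight a grasp's geometric quality by the mean occupancy
    uncertainty of the voxels its contact region covers."""
    u = uncertainty[tuple(contact_voxels.T)].mean()
    return base_quality * np.exp(-alpha * u)

def select_grasp(candidates, qualities, meas_var, pred_var):
    """Return the index of the best grasp under uncertainty-aware scoring."""
    uncertainty = fuse_occupancy_uncertainty(meas_var, pred_var)
    scores = [score_grasp(v, q, uncertainty) for v, q in zip(candidates, qualities)]
    return int(np.argmax(scores))

if __name__ == "__main__":
    grid = (32, 32, 32)
    rng = np.random.default_rng(0)
    meas_var = rng.uniform(0.0, 0.2, grid)   # e.g. from stereo depth noise
    pred_var = rng.uniform(0.0, 0.2, grid)   # e.g. from a probabilistic regressor
    # Two toy grasp candidates, each given as voxel indices of its contact region.
    candidates = [rng.integers(0, 32, (10, 3)), rng.integers(0, 32, (10, 3))]
    qualities = [0.9, 0.85]                   # geometry-only grasp scores
    print("selected grasp:", select_grasp(candidates, qualities, meas_var, pred_var))
```

Under this kind of scoring, a grasp with slightly lower geometric quality can still be preferred if its contact region lies in well-observed, low-uncertainty parts of the reconstruction.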

Figure: PUGS system overview.