RT Journal Article
SR Electronic
T1 Efficient nonparametric belief propagation for pose estimation and manipulation of articulated objects
JF Science Robotics
JO Sci. Robotics
FD American Association for the Advancement of Science
SP eaaw4523
DO 10.1126/scirobotics.aaw4523
VO 4
IS 30
A1 Desingh, Karthik
A1 Lu, Shiyang
A1 Opipari, Anthony
A1 Jenkins, Odest Chadwicke
YR 2019
UL http://robotics.sciencemag.org/content/4/30/eaaw4523.abstract
AB Robots working in human environments often encounter a wide range of articulated objects, such as tools, cabinets, and other jointed objects. Such articulated objects can take an infinite number of possible poses, each corresponding to a point in a potentially high-dimensional continuous space. A robot must perceive this continuous pose to manipulate the object to a desired pose. This problem of perception and manipulation of articulated objects remains challenging due to its high dimensionality and multimodal uncertainty. Here, we describe a factored approach to estimating the poses of articulated objects using an efficient method for nonparametric belief propagation. We consider as inputs geometrical models with articulation constraints and observed RGBD (red, green, blue, and depth) sensor data. The described framework produces object-part pose beliefs iteratively. The problem is formulated as a pairwise Markov random field (MRF), where each hidden node (a continuous pose variable) represents the pose of an object part and the edges denote the articulation constraints between the parts. We describe articulated pose estimation by a “pull” message passing algorithm for nonparametric belief propagation (PMPNBP) and evaluate its convergence properties over scenes with articulated objects. Robot experiments are provided to demonstrate the necessity of maintaining beliefs to perform goal-driven manipulation tasks.