A Long Horizon Planning Framework for Manipulating Rigid Pointcloud Objects

Paper PDF


Anthony Simeonov (Massachusetts Institute of Technology)*; Yilun Du (MIT); Beomjoon Kim (MIT); Francois Hogan (MIT); Joshua Tenenbaum (MIT); Pulkit Agrawal (UC Berkeley); Alberto Rodriguez (MIT)

Interactive Session

2020-11-17, 12:30 - 13:00 PST | PheedLoop Session


We present a framework for solving long-horizon planning problems involving manipulation of rigid objects that operates directly from a point-cloud observation. Our method plans in the space of object subgoals and frees the planner from reasoning about robot-object interaction dynamics. We show that, for rigid bodies, this abstraction can be realized using low-level manipulation skills that maintain sticking contact with the object, with subgoals represented as 3D transformations. To enable generalization to unseen objects and improve planning performance, we propose a novel way of representing subgoals for rigid-body manipulation and a graph-attention based neural network architecture for processing point-cloud inputs. We experimentally validate these choices using simulated and real-world experiments on the YuMi robot. Results demonstrate that our method can successfully manipulate new objects into target configurations requiring long-horizon planning. Overall, our framework realizes the best of both worlds: task-and-motion planning (TAMP) and learning-based approaches. Project website:
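To make the subgoal abstraction concrete: representing a rigid-body subgoal as a 3D transformation means the planner only needs to specify a rotation and translation of the observed point cloud, not a full robot-object interaction trajectory. The sketch below illustrates this idea with numpy; the function name and toy values are illustrative, not taken from the paper's code.

```python
import numpy as np

def apply_subgoal_transform(points, R, t):
    """Apply a rigid-body subgoal, given as an SE(3) transform
    (3x3 rotation R, 3-vector translation t), to an N x 3 point cloud."""
    return points @ R.T + t

# Toy example: a subgoal that rotates the object 90 degrees about
# the z-axis and shifts it 0.1 m along x.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([0.1, 0.0, 0.0])

cloud = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
goal_cloud = apply_subgoal_transform(cloud, R, t)
```

Because every point of a rigid object undergoes the same transform, a sequence of such subgoals fully specifies the object's motion while leaving the choice of contacts and robot motions to the low-level skills.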


Reviews and Rebuttal

Conference on Robot Learning 2020