Learning to Reorient Objects with Stable Placements Afforded by Extrinsic Supports

05/14/2022
by   Peng Xu, et al.

Reorienting objects using extrinsic supporting items on the working platform is a meaningful yet challenging manipulation task, given the complex geometry of objects and the constraints of the robot's motion planning. In this work, the robot learns to reorient objects through sequential pick-and-place operations based on observations from an RGB-D camera. We propose generative models that predict objects' stable placements afforded by supporting items from observed point clouds. We then build manipulation graphs whose edges encode shared grasp configurations connecting objects' stable placements, enabling pose transformation. Experiments show that our method is effective and efficient. Simulation experiments demonstrate that the models generalize to previously unseen pairs of objects starting from random poses on the table, and that the computed manipulation graphs yield collision-free motions for reorienting objects. In real-world experiments, a robot performs the sequential pick-and-place operations, indicating that our method can transform objects' placement poses in real scenes.
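The abstract describes manipulation graphs in which stable placements are connected whenever they share a grasp configuration, so a reorientation is a path of pick-and-place steps. The following is a minimal sketch of that idea, not the paper's implementation: placement and grasp identifiers, the set-intersection edge test, and the breadth-first search are all illustrative assumptions.

```python
from collections import defaultdict, deque

def build_manipulation_graph(placements, grasps):
    """Connect two stable placements when they share at least one grasp.

    `placements` is a list of placement ids; `grasps` maps each placement
    id to the set of feasible grasp ids in that placement (a hypothetical
    representation, standing in for the paper's grasp configurations).
    """
    graph = defaultdict(set)
    for a in placements:
        for b in placements:
            # A shared grasp means the robot can pick the object in
            # placement `a` and place it down in placement `b`.
            if a != b and grasps[a] & grasps[b]:
                graph[a].add(b)
    return graph

def reorientation_sequence(graph, start, goal):
    """Breadth-first search for a shortest pick-and-place sequence
    transforming the object from `start` to `goal` placement."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # goal placement unreachable

# Toy example: no grasp works in both P0 and P2, so the object must
# pass through the intermediate placement P1.
placements = ["P0", "P1", "P2"]
grasps = {"P0": {"g1"}, "P1": {"g1", "g2"}, "P2": {"g2"}}
graph = build_manipulation_graph(placements, grasps)
print(reorientation_sequence(graph, "P0", "P2"))  # ['P0', 'P1', 'P2']
```

In the paper's setting the edge test would additionally require the shared grasp to admit collision-free motions; here the set intersection is a stand-in for that check.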
