Date of Original Version

May 2013

Type

Conference Proceeding

Abstract or Description

We present an interactive perceptual skill for segmenting, tracking, and modeling the kinematic structure of 3D articulated objects. This skill is a prerequisite for general manipulation in unstructured environments. Robot-environment interactions are used to move an unknown object, creating a perceptual signal that reveals the object's kinematic properties. The resulting perceptual information can then inform and facilitate further manipulation. The algorithm is computationally efficient, handles partial occlusions, and needs little object motion; the only requirement is sufficient texture for visual feature tracking. We conducted experiments with everyday objects on a robotic manipulation platform equipped with an RGB-D sensor. The results demonstrate the robustness of the proposed method to variations in lighting, object appearance, size, structure, and configuration.
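The abstract describes the approach only at a high level. As a rough, hypothetical sketch of one way such a perceptual signal could be exploited, the Python snippet below clusters tracked 3D feature trajectories into rigid parts using pairwise-distance invariance and fits the relative rigid-body motion with the Kabsch algorithm. This is not the authors' implementation: the function names, the (N, T, 3) trajectory layout, the tolerance tol, and the synthetic "door" demo are all assumptions.

```python
import numpy as np


def segment_rigid_parts(tracks, tol=0.01):
    """Cluster tracked features into rigid parts.

    tracks: (N, T, 3) array of N feature positions over T frames.
    Two features are rigidly linked iff their mutual distance stays
    (nearly) constant over time; parts are the connected components
    of the resulting graph.
    """
    diffs = tracks[:, None, :, :] - tracks[None, :, :, :]  # (N, N, T, 3)
    dists = np.linalg.norm(diffs, axis=-1)                 # (N, N, T)
    rigid = dists.std(axis=-1) < tol                       # (N, N) bool

    labels = np.full(tracks.shape[0], -1, dtype=int)
    part = 0
    for seed in range(len(labels)):
        if labels[seed] >= 0:
            continue
        labels[seed] = part
        stack = [seed]
        while stack:  # flood-fill one connected component
            i = stack.pop()
            for j in np.flatnonzero(rigid[i] & (labels < 0)):
                labels[j] = part
                stack.append(j)
        part += 1
    return labels


def fit_rigid_transform(P, Q):
    """Least-squares R, t with Q ~ P @ R.T + t (Kabsch algorithm)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP


if __name__ == "__main__":
    # Synthetic scene: 20 static "frame" features and 20 "door" features
    # swinging 90 degrees about the z-axis (a revolute joint).
    rng = np.random.default_rng(0)
    T = 10

    def Rz(a):
        return np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a), np.cos(a), 0.0],
                         [0.0, 0.0, 1.0]])

    frame = rng.uniform(-1, 1, size=(20, 3)) + np.array([-3.0, 0.0, 0.0])
    door = rng.uniform(-1, 1, size=(20, 3)) + np.array([2.0, 0.0, 0.0])
    angles = np.linspace(0.0, np.pi / 2, T)
    tracks = np.concatenate([
        np.stack([frame] * T, axis=1),                       # static part
        np.stack([door @ Rz(a).T for a in angles], axis=1),  # moving part
    ])

    labels = segment_rigid_parts(tracks)
    mask = labels == labels[-1]                              # the moving part
    R, t = fit_rigid_transform(tracks[mask, 0], tracks[mask, -1])
    angle = np.degrees(np.arccos((np.trace(R) - 1.0) / 2.0))
    print(f"parts: {labels.max() + 1}, rotation between first/last frame: "
          f"{angle:.1f} deg")
```

Under this sketch, a revolute joint would show up as a significant rotation whose axis is the eigenvector of R with eigenvalue 1, while a prismatic joint would yield R close to the identity and a nonzero translation t.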

Included in

Robotics Commons

Published In

Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2013.