Abstract
Gesture-Based Programming is a new paradigm intended to ease the burden of programming robots. By tapping into the user’s wealth of experience with contact transitions, compliance, uncertainty, and operations sequencing, we hope to provide a more intuitive programming environment for complex, real-world tasks, one based on the expressiveness of non-verbal communication. A prerequisite for accomplishing this is the ability to interpret gestures and infer the intentions behind them. As a first step toward this goal, this paper presents an application of distributed perception for inferring a user’s intentions by observing tactile gestures. These gestures consist of sparse, inexact, physical “nudges” applied to the robot’s end effector for the purpose of modifying its trajectory in free space. A set of independent agents, each with its own local, fuzzified, heuristic model of a particular trajectory parameter, observes data from a wrist force/torque sensor to evaluate the gestures. Each agent independently determines the confidence of its findings, and distributed arbitration resolves the interpretation through voting.