Date of Original Version

5-2015

Type

Conference Proceeding

Rights Management

© 2015 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Abstract or Description

Modeling the effects of actions based on the state of the world enables robots to make intelligent decisions in different situations. However, it is often infeasible to have globally accurate models. Task performance is often hindered by discrepancies between models and the real world, since the true outcome of executing a plan may be significantly worse than the expected outcome used during planning. Furthermore, expectations about the world are often stochastic in robotics, making the discovery of model-world discrepancies non-trivial. We present an execution monitoring framework capable of finding statistically significant discrepancies, determining the situations in which they occur, and making simple corrections to the world model to improve performance. In our approach, plans are initially based on a model of the world that is only as faithful as computational and algorithmic limitations allow. Through experience, the monitor discovers previously unmodeled modes of the world, defined as regions of a feature space in which the experienced outcome of a plan deviates significantly from the predicted outcome. The monitor may then suggest changes to the model so that it matches the real world more accurately. We demonstrate this approach in the adversarial domain of robot soccer: we monitor the pass interception performance of potentially unknown opponents to discover unforeseen modes of behavior that affect that performance.
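The core idea in the abstract can be illustrated concretely. The following is a minimal sketch, not the authors' actual method: it assumes binary (success/failure) plan outcomes, a single feature partitioned into bins as stand-ins for "regions of a feature space," and a simple two-sided z-test against the model's predicted success probability. All function names and thresholds here are hypothetical.

```python
import math

def discrepancy_z(successes, trials, p_model):
    """Z-score comparing the observed success rate in a region to the
    model's predicted success probability p_model."""
    p_hat = successes / trials
    se = math.sqrt(p_model * (1 - p_model) / trials)
    return (p_hat - p_model) / se

def find_discrepant_regions(samples, p_model, n_bins=4, z_crit=2.58):
    """Flag feature-space regions where outcomes deviate significantly
    from the model. `samples` is a list of (feature in [0, 1), outcome
    in {0, 1}) pairs; z_crit=2.58 corresponds to roughly p < 0.01."""
    bins = [[0, 0] for _ in range(n_bins)]  # [successes, trials] per region
    for x, outcome in samples:
        b = min(int(x * n_bins), n_bins - 1)
        bins[b][0] += outcome
        bins[b][1] += 1
    flagged = []
    for b, (s, n) in enumerate(bins):
        if n == 0:
            continue  # no experience in this region yet
        z = discrepancy_z(s, n, p_model)
        if abs(z) > z_crit:
            flagged.append((b, s / n, z))  # region, empirical rate, z-score
    return flagged
```

A flagged region is a candidate "unmodeled mode": the monitor could then suggest replacing `p_model` with the empirical rate within that region, leaving the model unchanged elsewhere.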

DOI

10.1109/ICRA.2015.7139646


Published In

Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2015, pp. 3247-3252.