Date of Original Version

11-2013

Type

Conference Proceeding

Rights Management

© 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

Abstract or Description

We present a vision-based mapping and localization system for operations in pipes such as those found in Liquefied Natural Gas (LNG) production. A forward-facing fisheye camera mounted on a prototype robot collects imagery as it is teleoperated through a pipe network. The images are processed offline to estimate camera pose and sparse scene structure, and the results can be used to generate 3D renderings of the pipe surface. The method extends state-of-the-art visual odometry and mapping for fisheye systems by incorporating geometric constraints, based on prior knowledge of the pipe components, into a Sparse Bundle Adjustment framework. These constraints significantly reduce inaccuracies resulting from the limited spatial resolution of the fisheye imagery, limited image texture, and visual aliasing. Preliminary results for datasets collected in our fiberglass pipe network demonstrate the validity of the approach.
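The paper itself does not include code; the sketch below is only an illustration of the general idea described in the abstract, namely folding prior knowledge of the pipe geometry into a bundle-adjustment-style cost as extra residual terms. A toy pinhole model stands in for the fisheye camera, the pipe is assumed to be a straight cylinder of known radius aligned with the optical axis, and all constants, variable names, and the choice of scipy's least_squares optimizer are assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative constants (assumptions, not values from the paper).
PIPE_RADIUS = 0.2                # metres; pipe axis assumed aligned with the camera z-axis
F, CX, CY = 300.0, 320.0, 240.0  # toy pinhole intrinsics standing in for the fisheye model
PRIOR_WEIGHT = 5.0               # how strongly points are pulled onto the pipe surface

def project(p):
    """Pinhole projection of a 3D point given in camera coordinates."""
    return np.array([F * p[0] / p[2] + CX, F * p[1] / p[2] + CY])

def residuals(params, obs0, obs1, n_pts):
    """Reprojection residuals for two views plus a cylinder-surface prior.

    params = [t1 (3,), points (n_pts*3,)]; camera 0 is held fixed at the origin.
    """
    t1 = params[:3]
    pts = params[3:].reshape(n_pts, 3)
    res = []
    for i, uv in obs0:                      # camera 0 at the origin
        res.extend(project(pts[i]) - uv)
    for i, uv in obs1:                      # camera 1 translated by t1
        res.extend(project(pts[i] - t1) - uv)
    for p in pts:                           # known-radius pipe constraint
        res.append(PRIOR_WEIGHT * (np.hypot(p[0], p[1]) - PIPE_RADIUS))
    return np.asarray(res)

# Synthetic data: four points on the pipe wall, second camera 0.3 m down the pipe.
true_pts = np.array([[ PIPE_RADIUS, 0.0, 1.0],
                     [-PIPE_RADIUS, 0.0, 1.2],
                     [0.0,  PIPE_RADIUS, 1.4],
                     [0.0, -PIPE_RADIUS, 1.6]])
true_t1 = np.array([0.0, 0.0, 0.3])
obs0 = [(i, project(p)) for i, p in enumerate(true_pts)]
obs1 = [(i, project(p - true_t1)) for i, p in enumerate(true_pts)]

# Perturbed initial guess; least_squares jointly refines motion and structure.
x0 = np.hstack([true_t1 + 0.05, (true_pts + 0.03).ravel()])
sol = least_squares(residuals, x0, args=(obs0, obs1, len(true_pts)))
print("refined inter-frame translation:", np.round(sol.x[:3], 4))
```

Without the radius term, a two-view reconstruction like this has a free overall scale; adding the known-radius residual pins the scale, which is one way a geometric pipe prior can reduce drift in the estimated trajectory.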

DOI

10.1109/IROS.2013.6697105

Included in

Robotics Commons

Published In

Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2013, 5180-5185.