Abstract
In this paper, we propose a data-driven approach to first-person vision. We introduce a novel image matching algorithm, named Re-Search, designed to cope with the self-repetitive structures and confusing patterns common in indoor environments. The algorithm builds on state-of-the-art image search techniques and matches a query image with a two-pass strategy. In the first pass, a conventional image search algorithm retrieves a small number of images that are most similar to the query image. In the second pass, the retrieval results from the first pass are used to discover features that are more distinctive in the local context. We demonstrate and evaluate the Re-Search algorithm in the context of indoor localization, and illustrate potential applications in object pop-out and data-driven zoom-in.
IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR) Workshop on Egocentric Vision.
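The two-pass strategy summarized in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the paper's implementation: it assumes images are represented as bags of quantized visual words, uses tf-idf with cosine similarity as the stand-in for the conventional first-pass search, and models the second pass by recomputing idf weights over only the shortlisted candidates, so that features shared by the confusing, repetitive candidates are down-weighted and locally distinctive features dominate the re-ranking. All function names and the scoring scheme are assumptions for illustration.

```python
# Hedged sketch of a two-pass "Re-Search"-style matcher over bag-of-visual-words
# image representations. The tf-idf/cosine scoring and the local re-weighting
# are illustrative assumptions, not the paper's actual search engine.
import math
from collections import Counter

def cosine(a, b):
    # Cosine similarity between two sparse weight vectors (dicts word -> weight).
    num = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return num / (na * nb) if na and nb else 0.0

def idf_weights(bags):
    # Inverse document frequency computed over a collection of word bags.
    n = len(bags)
    df = Counter(w for bag in bags for w in set(bag))
    return {w: math.log(n / df[w]) for w in df}

def weight(bag, idf):
    # tf-idf weight vector for one image's bag of visual words.
    return {w: c * idf.get(w, 0.0) for w, c in Counter(bag).items()}

def re_search(query, database, k=3):
    # Pass 1: conventional search -- rank all images by global tf-idf similarity.
    idf = idf_weights(database)
    q = weight(query, idf)
    ranked = sorted(range(len(database)),
                    key=lambda i: cosine(q, weight(database[i], idf)),
                    reverse=True)
    shortlist = ranked[:k]
    # Pass 2: recompute idf over the shortlist only, so words common to the
    # confusing candidates (e.g. repeated indoor structures) get weight ~0 and
    # locally distinctive words drive the final ranking.
    local_idf = idf_weights([database[i] for i in shortlist])
    q2 = weight(query, local_idf)
    return sorted(shortlist,
                  key=lambda i: cosine(q2, weight(database[i], local_idf)),
                  reverse=True)
```

For example, if several database images share the words "door" and "window", pass 2 zeroes those shared words out within the shortlist, and a rarer word such as "sign" decides the final match.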