Date of Original Version

7-7-2012

Type

Article

Journal Title

Machine Learning

Volume

90

Issue

2

First Page

161

Last Page

189

Rights Management

The final publication is available at Springer via http://dx.doi.org/10.1007/s10994-012-5310-y

Abstract or Description

We explore a transfer learning setting in which a finite sequence of target concepts is sampled independently according to an unknown distribution over a known family. We study the total number of labeled examples required to learn all targets to an arbitrary specified expected accuracy, focusing on the asymptotics in the number of tasks and in the desired accuracy. Our primary interest is in formally understanding the fundamental benefits of transfer learning compared to learning each target independently of the others. Our approach to the transfer problem is general, in the sense that it can be used with a variety of learning protocols. As a particularly interesting application, we study in detail the benefits of transfer for self-verifying active learning; in this setting, we find that the number of labeled examples required for learning with transfer is often significantly smaller than that required for learning each target independently.
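
The setting described above can be sketched roughly as follows; the notation here is illustrative only and is not taken from the paper itself.

% Illustrative formalization of the abstract's transfer learning setting.
% The symbols (pi, Theta, er_t, epsilon, S_T) are chosen here for exposition
% and are assumptions of this sketch, not the paper's notation.
\[
  \theta_1, \theta_2, \dots, \theta_T \ \text{drawn i.i.d. from an unknown prior } \pi
  \ \text{over a known family } \Theta .
\]
For each task $t$, the learner must output a hypothesis $\hat{h}_t$ with expected error
\[
  \mathbb{E}\!\left[ \operatorname{er}_t(\hat{h}_t) \right] \le \varepsilon ,
\]
and the quantity of interest is the total number of label requests $S_T(\varepsilon)$ across all $T$ tasks,
studied asymptotically as $T \to \infty$ and $\varepsilon \to 0$, compared with
$T$ times the label complexity of learning a single task in isolation.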

DOI

10.1007/s10994-012-5310-y

Published In

Machine Learning, 90(2), 161-189.