This paper presents a rigorous statistical analysis characterizing regimes in which active learning significantly outperforms classical passive learning. Active learning algorithms can make queries or select sample locations in an online fashion, with each choice depending on the results of previous queries. In some regimes, this extra flexibility leads to significantly faster rates of error decay than are possible in classical passive learning settings. The nature of these regimes is explored by studying fundamental performance limits of active and passive learning in two illustrative nonparametric function classes. In addition to examining the theoretical potential of active learning, this paper describes a practical algorithm capable of exploiting the extra flexibility of the active setting and provably improving upon classical passive techniques. Our active learning theory and methods show promise in a number of applications, including field estimation using wireless sensor networks and fault line detection.
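The contrast between adaptive and fixed sampling can be made concrete with a toy one-dimensional example. The sketch below is our illustration, not the paper's algorithm: it estimates the location t* of a step function on [0, 1] from noiseless queries, comparing a passive uniform grid (all sample locations fixed in advance) against active bisection (each query location depends on earlier answers).

```python
# Hypothetical illustration (not the paper's algorithm): locate the
# change point t* of f(x) = 1{x >= t*} on [0, 1] from point queries.

def passive_estimate(f, n):
    """Passive: query a uniform grid chosen before seeing any labels."""
    grid = [(i + 0.5) / n for i in range(n)]
    # Estimate t* as the first grid point labeled 1 (or 1.0 if none).
    for x in grid:
        if f(x) == 1:
            return x
    return 1.0

def active_estimate(f, n):
    """Active: bisection -- each query location depends on past answers."""
    lo, hi = 0.0, 1.0
    for _ in range(n):
        mid = (lo + hi) / 2
        if f(mid) == 1:
            hi = mid   # step is to the left of mid
        else:
            lo = mid   # step is to the right of mid
    return (lo + hi) / 2

t_star = 0.3141
f = lambda x: 1 if x >= t_star else 0
# With the same query budget n, bisection's error shrinks like 2^(-n),
# while the fixed grid's error shrinks only like 1/n.
print(abs(active_estimate(f, 20) - t_star))   # on the order of 1e-6
print(abs(passive_estimate(f, 20) - t_star))  # on the order of 1e-2
```

In the noiseless case the gap is exponential versus polynomial; the paper's analysis characterizes when (and how much of) this kind of advantage survives under noise in nonparametric settings.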
|Title of host publication||Advances in Neural Information Processing Systems 18 (NIPS 2005), Vancouver BC, Canada, December 5-8, 2005|
|Publication status||Published - 2005|