Faster rates in regression via active learning

R.M. Castro, R. Willett, R. Nowak

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Abstract

    This paper presents a rigorous statistical analysis characterizing regimes in which active learning significantly outperforms classical passive learning. Active learning algorithms are able to make queries or select sample locations in an online fashion, depending on the results of the previous queries. In some regimes, this extra flexibility leads to significantly faster rates of error decay than those possible in classical passive learning settings. The nature of these regimes is explored by studying fundamental performance limits of active and passive learning in two illustrative nonparametric function classes. In addition to examining the theoretical potential of active learning, this paper describes a practical algorithm capable of exploiting the extra flexibility of the active setting and provably improving upon the classical passive techniques. Our active learning theory and methods show promise in a number of applications, including field estimation using wireless sensor networks and fault line detection.
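    The abstract's central contrast — active learners choosing each sample location based on previous responses, versus passive learners sampling on a fixed design — can be loosely illustrated with a toy change-point problem (related in spirit to the fault line detection application, but not the paper's algorithm). The sketch below is a hypothetical, noiseless example: the function, the jump location, and both estimators are assumptions for illustration only.

    ```python
    # Hypothetical illustration (not the paper's method): locating a jump in a
    # piecewise-constant function on [0, 1]. Passive learning samples a fixed
    # uniform grid; active learning bisects, choosing each query based on the
    # previous response. Noise is omitted to keep the contrast in rates clear.

    def f(x, jump=0.3721):
        """Step function with a discontinuity at `jump` (noiseless toy example)."""
        return 0.0 if x < jump else 1.0

    def passive_estimate(n, jump=0.3721):
        """Fixed uniform grid of n+1 points: localizes the jump only to ~1/n."""
        grid = [i / n for i in range(n + 1)]
        # Report the first grid point at which f has already jumped.
        return next(x for x in grid if f(x, jump) == 1.0)

    def active_estimate(n, jump=0.3721):
        """Bisection: each of the n queries halves the interval of uncertainty."""
        lo, hi = 0.0, 1.0
        for _ in range(n):
            mid = (lo + hi) / 2
            if f(mid, jump) == 1.0:
                hi = mid  # jump lies to the left of mid
            else:
                lo = mid  # jump lies to the right of mid
        return (lo + hi) / 2

    n = 20
    print(abs(passive_estimate(n) - 0.3721))  # error on the order of 1/n
    print(abs(active_estimate(n) - 0.3721))   # error on the order of 2**-n
    ```

    With the same budget of n queries, the passive error decays like 1/n while the active error decays like 2^-n — a (simplified, noise-free) instance of the exponential-versus-polynomial gap the abstract alludes to.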
    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 18 (NIPS 2005), Vancouver BC, Canada, December 5-8, 2005
    Publication status: Published - 2005
    Event: conference; NIPS 2005
    Duration: 1 Jan 2005 → …
