Sparse least trimmed squares regression for analyzing high-dimensional large data sets

A. Alfons, C. Croux, S.E.C. Gelper

Research output: Contribution to journal › Article › Academic › peer-review

150 Citations (Scopus)
119 Downloads (Pure)

Abstract

Sparse model estimation is a topic of high importance in modern data analysis due to the increasing availability of data sets with a large number of variables. Another common problem in applied statistics is the presence of outliers in the data. This paper combines robust regression and sparse model estimation. A robust and sparse estimator is introduced by adding an L1 penalty on the coefficient estimates to the well-known least trimmed squares (LTS) estimator. The breakdown point of this sparse LTS estimator is derived, and a fast algorithm for its computation is proposed. In addition, the sparse LTS is applied to protein and gene expression data of the NCI-60 cancer cell panel. Both a simulation study and the real data application show that the sparse LTS has better prediction performance than its competitors in the presence of leverage points.
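The idea described in the abstract is that the sparse LTS minimizes the sum of the h smallest squared residuals plus an L1 (lasso-type) penalty on the coefficients, which suggests an iteration that alternates a lasso fit on a subset with re-selection of the h best-fitting observations. The sketch below is a simplified illustration of that idea under stated assumptions, not the fast algorithm proposed in the paper (which uses multiple starting subsets and further refinements); the function name `sparse_lts_sketch` and the parameters `alpha`, `lam`, and `seed` are hypothetical, and scikit-learn's `Lasso` stands in for the penalized least-squares step. A reference implementation by the authors is, to the best of my knowledge, available as `sparseLTS` in the R package robustHD.

```python
# Illustrative sparse-LTS-style iteration (a sketch, not the paper's algorithm):
# alternate between an L1-penalised least-squares fit on the current subset and
# re-selecting the h observations with the smallest squared residuals.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_lts_sketch(X, y, alpha=0.75, lam=0.1, max_iter=100, seed=0):
    n = X.shape[0]
    h = int(alpha * n)                            # trimmed subset size (roughly alpha * n)
    rng = np.random.default_rng(seed)
    subset = rng.choice(n, size=h, replace=False) # random starting subset

    model = Lasso(alpha=lam)
    for _ in range(max_iter):
        model.fit(X[subset], y[subset])           # lasso fit on the current subset
        resid = y - model.predict(X)              # residuals on all observations
        new_subset = np.argsort(resid ** 2)[:h]   # keep the h smallest squared residuals
        if np.array_equal(np.sort(new_subset), np.sort(subset)):
            break                                 # subset unchanged: iteration has converged
        subset = new_subset
    return model, subset

# Usage example with a few gross outliers in the response
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))
y = X[:, 0] - 2 * X[:, 1] + 0.5 * rng.normal(size=100)
y[:5] += 20                                       # contaminate five observations
fit, kept = sparse_lts_sketch(X, y)
```

Because each lasso fit only uses the h observations currently deemed well-fitting, gross outliers are eventually excluded from the fitting subset, while the L1 penalty keeps the coefficient vector sparse even when the number of variables exceeds n.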
Original language: English
Pages (from-to): 226-248
Number of pages: 23
Journal: The Annals of Applied Statistics
Volume: 7
Issue number: 1
DOIs
Publication status: Published - 2013
Externally published: Yes

