Fast training of object detection using stochastic gradient descent

R.G.J. Wijnhoven, P.H.N. de With

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

42 Citations (Scopus)
243 Downloads (Pure)

Abstract

Training datasets for object detection problems are typically very large, and Support Vector Machine (SVM) implementations are computationally complex. In contrast to these complex techniques, we use Stochastic Gradient Descent (SGD) algorithms that use only a single new training sample in each iteration and process samples in a stream-like fashion. We have incorporated SGD optimization in an object detection framework. The object detection problem is typically highly asymmetric because of the limited variation in object appearance compared to the background. Incorporating SGD speeds up the optimization process significantly, requiring only a single iteration over the training set to obtain results comparable to state-of-the-art SVM techniques. SGD optimization is linearly scalable in time, and the obtained speedup in computation time is two to three orders of magnitude. We show that by considering only a part of the total training set, SGD converges quickly to the overall optimum.
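The abstract describes a single-pass, sample-at-a-time training scheme for a linear classifier. As a rough illustration only, the Python sketch below implements a Pegasos-style SGD update for a linear SVM with hinge loss; the function name, the regularization constant lam, and the toy Gaussian "object vs. background" features are assumptions for illustration, not the authors' implementation.

import numpy as np

def sgd_linear_svm(samples, labels, lam=1e-4, epochs=1):
    # Pegasos-style SGD for a linear SVM with hinge loss.
    # One (sample, label) pair is processed per iteration, so a single
    # pass over the training set (epochs=1) already yields a usable
    # weight vector, matching the stream-like setting in the abstract.
    n, d = samples.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in np.random.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying learning rate
            x, y = samples[i], labels[i]   # y in {-1, +1}
            if y * np.dot(w, x) < 1.0:     # margin violated: hinge subgradient step
                w = (1 - eta * lam) * w + eta * y * x
            else:                          # margin satisfied: shrink (regularize) only
                w = (1 - eta * lam) * w
    return w

# Toy usage: two Gaussian blobs stand in for object/background feature vectors.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 1.0, (100, 16)), rng.normal(-1.0, 1.0, (100, 16))])
y = np.hstack([np.ones(100), -np.ones(100)])
w = sgd_linear_svm(X, y)
print((np.sign(X @ w) == y).mean())  # training accuracy of the linear scorer

Each update costs O(d), so total training time grows linearly with the number of samples processed, which is the linear time scalability the abstract refers to.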
Original language: English
Title of host publication: Proceedings of the 20th International Conference on Pattern Recognition (ICPR), 23-26 August 2010, Istanbul, Turkey
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Pages: 424-427
ISBN (Print): 978-1-4244-7542-1
DOIs
Publication status: Published - 2010

