Online Optimization with Costly and Noisy Measurements Using Random Fourier Expansions

Laurens Bliek, Hans R.G.W. Verstraete, Michel Verhaegen, Sander Wahls

Research output: Contribution to journal › Article › Academic › peer-review

33 Citations (Scopus)

Abstract

This paper analyzes the data-based online nonlinear extremum-seeker (DONE), an online optimization algorithm that iteratively minimizes an unknown function based on costly and noisy measurements. The algorithm maintains a surrogate of the unknown function in the form of a random Fourier expansion. The surrogate is updated whenever a new measurement is available and then used to determine the next measurement point. The algorithm is comparable to Bayesian optimization algorithms, but its computational complexity per iteration does not depend on the number of measurements. We derive several theoretical results that provide insight into how the hyperparameters of the algorithm should be chosen. The algorithm is compared with a Bayesian optimization algorithm on an analytic benchmark problem and three applications, namely, optical coherence tomography, optical beam-forming network tuning, and robot arm control. It is found that the DONE algorithm is significantly faster than Bayesian optimization on the discussed problems while achieving similar or better performance.
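The core idea described in the abstract, a random Fourier expansion surrogate updated online with constant per-measurement cost, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimension, bandwidth, and regularization values are arbitrary assumptions, and the update shown is a standard recursive least-squares step, which is what keeps the per-iteration cost independent of the number of measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma, lam = 2, 200, 1.0, 1e-3  # input dim, num. features, bandwidth, regularization (assumed values)

# Random Fourier features: cos(W x + b) with Gaussian-sampled W approximates
# the feature map of a Gaussian kernel (Rahimi-Recht construction).
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

# Recursive least squares: only the D x D matrix P and the weight vector c
# are stored, so each update costs O(D^2) regardless of how many
# measurements have been seen.
c = np.zeros(D)
P = np.eye(D) / lam

def update(x, y):
    """Incorporate one noisy measurement y = f(x) + noise into the surrogate."""
    global c, P
    a = features(x)
    Pa = P @ a
    gamma = 1.0 + a @ Pa
    c = c + Pa * (y - a @ c) / gamma
    P = P - np.outer(Pa, Pa) / gamma

def surrogate(x):
    """Current surrogate estimate of the unknown function at x."""
    return features(x) @ c
```

In the full algorithm the next measurement point would be chosen by (approximately) minimizing `surrogate`, with some added exploration; that step is omitted here for brevity.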

Original language: English
Article number: 7728083
Pages (from-to): 167-182
Number of pages: 16
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 1
DOIs
Publication status: Published - Jan 2018
Externally published: Yes

Keywords

  • Adaptive optics
  • Bayesian optimization
  • derivative-free optimization (DFO)
  • learning systems
  • surrogate model
