Abstract
We propose CDONE, a convex version of the DONE algorithm. DONE is a derivative-free online optimization algorithm that uses surrogate modeling with noisy measurements to find a minimum of objective functions that are expensive to evaluate. Inspired by their success in deep learning, CDONE makes use of rectified linear units (ReLUs), together with a nonnegativity constraint, to enforce convexity of the surrogate model. This yields a sparse, cheap-to-evaluate surrogate model of the unknown optimization objective that remains accurate and can be minimized with convex optimization algorithms. The CDONE algorithm is demonstrated on a toy example and on hyper-parameter optimization for a deep learning example on handwritten digit classification.
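The convexity mechanism described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): a surrogate built as a nonnegative combination of ReLU basis functions is automatically convex, since it is a nonnegative sum of convex functions. Here the basis directions, offsets, and the quadratic test objective are all hypothetical choices for illustration, and the nonnegativity constraint is enforced with nonnegative least squares rather than the online updates used by CDONE.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Noisy measurements of an "expensive" objective (here a simple quadratic).
x = np.linspace(-2.0, 2.0, 60)
y = x**2 + 0.05 * rng.standard_normal(x.size)

# Random ReLU features max(0, w_k * x + b_k); hypothetical basis choices.
n_basis = 30
w = rng.standard_normal(n_basis)
b = rng.uniform(-2.0, 2.0, n_basis)
Phi = np.maximum(0.0, np.outer(x, w) + b)  # design matrix, shape (60, 30)

# Nonnegative least squares keeps the combination weights c >= 0,
# so the fitted surrogate is a nonnegative sum of convex ReLUs: convex.
c, residual = nnls(Phi, y)

def surrogate(t):
    """Evaluate the convex piecewise-linear surrogate at points t."""
    return np.maximum(0.0, np.outer(np.atleast_1d(t), w) + b) @ c
```

Because the surrogate is convex and cheap to evaluate, its minimum can then be found with any standard convex optimization routine, which is the property CDONE exploits.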
Original language | English |
---|---|
Title of host publication | 2017 IEEE International Workshop on Machine Learning for Signal Processing (MLSP) |
Editors | Naonori Ueda, Jen-Tzung Chien, Tomoko Matsui, Jan Larsen, Shinji Watanabe |
Publisher | Institute of Electrical and Electronics Engineers |
Pages | 1-6 |
Number of pages | 6 |
ISBN (Electronic) | 978-1-5090-6341-3 |
DOIs | |
Publication status | Published - 7 Dec 2017 |
Externally published | Yes |
Event | 27th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2017, Tokyo, Japan; Duration: 25 Sept 2017 → 28 Sept 2017; Conference number: 27 |
Conference
Conference | 27th IEEE International Workshop on Machine Learning for Signal Processing, MLSP 2017 |
---|---|
Abbreviated title | MLSP 2017 |
Country/Territory | Japan |
City | Tokyo |
Period | 25/09/17 → 28/09/17 |
Keywords
- Bayesian optimization
- Deep learning
- Derivative-free optimization
- Surrogate modeling