Accelerating optimization over the space of probability measures

Shi Chen, Qin Li, Oliver Tse, Stephen J. Wright

Research output: Working paper › Preprint › Academic


Abstract

Acceleration of gradient-based optimization methods is an issue of significant practical and theoretical interest, particularly in machine learning applications. Most research has focused on optimization over Euclidean spaces, but given the need to optimize over spaces of probability measures in many machine learning problems, it is of interest to investigate accelerated gradient methods in this setting as well. To this end, we introduce a Hamiltonian-flow approach that is analogous to momentum-based approaches in Euclidean space. We demonstrate that algorithms based on this approach can achieve convergence rates of arbitrarily high order. Numerical examples illustrate our claims.
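The abstract draws an analogy with momentum-based acceleration in Euclidean space. The sketch below is not the authors' Hamiltonian-flow scheme over probability measures; it only illustrates the Euclidean analogy it refers to, comparing plain gradient descent with heavy-ball momentum on an ill-conditioned quadratic, where momentum yields markedly faster convergence. The step size and momentum coefficient are hand-picked for this toy problem.

```python
import numpy as np

def grad(x, A):
    # Gradient of the quadratic f(x) = 0.5 * x^T A x
    return A @ x

A = np.diag([1.0, 100.0])       # ill-conditioned quadratic, L = 100
x_gd = np.array([1.0, 1.0])     # plain gradient descent iterate
x_hb = np.array([1.0, 1.0])     # heavy-ball (momentum) iterate
v = np.zeros(2)                 # momentum/velocity variable
step = 1.0 / 100.0              # step size 1/L
beta = 0.8                      # momentum coefficient (hand-tuned)

for _ in range(200):
    # Vanilla gradient descent
    x_gd = x_gd - step * grad(x_gd, A)
    # Heavy-ball: velocity accumulates past gradients
    v = beta * v - step * grad(x_hb, A)
    x_hb = x_hb + v

print("gradient descent:", np.linalg.norm(x_gd))
print("heavy ball:      ", np.linalg.norm(x_hb))
```

After 200 iterations the momentum iterate is several orders of magnitude closer to the minimizer at the origin, which is the kind of speed-up the paper seeks to reproduce (and extend to arbitrarily high order) for gradient flows over probability measures.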
Original language: English
Publisher: arXiv.org
Pages: 1-35
Number of pages: 35
Volume: 2310.04006
DOIs
Publication status: Published - 9 Oct 2023

Keywords

  • math.OC
  • cs.LG

