Abstract
Acceleration of gradient-based optimization methods is an issue of significant practical and theoretical interest, particularly in machine learning applications. Most research has focused on optimization over Euclidean spaces, but given the need to optimize over spaces of probability measures in many machine learning problems, it is of interest to investigate accelerated gradient methods in this context too. To this end, we introduce a Hamiltonian-flow approach that is analogous to momentum-based approaches in Euclidean space. We demonstrate that algorithms based on this approach can achieve convergence rates of arbitrarily high order. Numerical examples illustrate our claims.
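For readers unfamiliar with the Euclidean analogue the abstract refers to, the sketch below illustrates a standard momentum-based (heavy-ball) accelerated gradient iteration on a toy quadratic. This is purely illustrative background and not the paper's Hamiltonian-flow method over probability measures; the objective, step size, and momentum coefficient are arbitrary choices.

```python
# Illustrative sketch of momentum-based acceleration in Euclidean space
# (background only; not the method proposed in the paper).
import numpy as np

def f(x):
    """A simple quadratic test objective (illustrative only)."""
    return 0.5 * x @ x

def grad_f(x):
    """Gradient of the quadratic objective."""
    return x

def heavy_ball(x0, lr=0.1, momentum=0.9, steps=100):
    """Polyak heavy-ball iteration:
    x_{k+1} = x_k - lr * grad f(x_k) + momentum * (x_k - x_{k-1})."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(steps):
        x_next = x - lr * grad_f(x) + momentum * (x - x_prev)
        x_prev, x = x, x_next
    return x

if __name__ == "__main__":
    x_final = heavy_ball(np.ones(5))
    print(f"f(x) after 100 steps: {f(x_final):.2e}")
```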
Original language | English
---|---
Publisher | arXiv.org
Pages | 1-35
Number of pages | 35
Volume | 2310.04006
DOIs |
Status | Published - 9 Oct 2023