On the connections between optimization algorithms, Lyapunov functions, and differential equations: theory and insights

Paul Dobson, Jesus Maria Sanz-Serna, Konstantinos Zygalakis

Research output: Working paper › Preprint


Abstract

We study connections between differential equations and optimization algorithms for $m$-strongly convex and $L$-smooth functions through the use of Lyapunov functions, by generalizing the Linear Matrix Inequality framework developed by Fazlyab et al. in 2018. Using the new framework, we derive analytically a new (discrete) Lyapunov function for a two-parameter family of Nesterov optimization methods and characterize their convergence rate. This allows us to prove a convergence rate that improves substantially on the previously proven rate of Nesterov's method for the standard choice of coefficients, as well as to characterize the choice of coefficients that yields the optimal rate. We obtain a new Lyapunov function for the Polyak ODE and revisit the connection between this ODE and Nesterov's algorithms. In addition, we discuss a new interpretation of Nesterov's method as an additive Runge-Kutta discretization and explain the structural conditions that discretizations of the Polyak equation should satisfy in order to lead to accelerated optimization algorithms.
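As a concrete reference point for the family of methods the abstract discusses, the following is a minimal sketch of the standard two-step Nesterov recursion for an $m$-strongly convex, $L$-smooth function, with the usual constant momentum coefficient $\beta = (\sqrt{\kappa}-1)/(\sqrt{\kappa}+1)$, $\kappa = L/m$. This standard parameterization is an assumption for illustration only; the paper itself studies a two-parameter family of coefficients and derives the optimal choice.

```python
import numpy as np

def nesterov(grad, x0, L, m, iters=200):
    # Standard-coefficient Nesterov method for an m-strongly convex,
    # L-smooth objective (illustrative sketch, not the paper's general family).
    kappa = L / m
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)  # momentum coefficient
    x = y = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = y - grad(y) / L            # gradient step from the extrapolated point
        y = x_next + beta * (x_next - x)    # momentum extrapolation
        x = x_next
    return x

# Example: quadratic f(x) = 0.5 * x^T A x, eigenvalues of A in [m, L] = [1, 100]
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x
x_final = nesterov(grad, np.ones(3), L=100.0, m=1.0)
print(np.linalg.norm(x_final))  # distance to the minimizer (the origin)
```

For this quadratic the iterates contract at a rate of roughly $1 - 1/\sqrt{\kappa}$ per step, so after 200 iterations the distance to the minimizer is far below machine-visible scales.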
Original language: English
Publication status: Published - 15 May 2023

Keywords

  • math.OC
  • cs.NA
  • math.NA
  • stat.ML
  • 65L06, 65L20, 90C25, 93C15

