Abstract: We will demonstrate that a wide range of first-order optimization methods can be built from a few concepts and tricks in monotone operator theory. These include gradient, proximal-gradient, (proximal) method of multipliers, alternating minimization, PDHG, Chambolle-Pock, Condat-Vu, (standard, proximal, and linearized) ADMM, and PD3O. Finite-sum and block-coordinate-friendly structures are used to develop parallel and asynchronous methods. We will also discuss how to recognize unbounded, infeasible, and otherwise pathological problems using first-order methods. We will leave out topics such as line search, Nesterov/heavy-ball acceleration, conditional gradients, and second-order methods.
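As a concrete instance of the methods listed above, here is a minimal sketch of the proximal-gradient method applied to a lasso problem, minimize (1/2)||Ax - b||^2 + lam*||x||_1. The function names (`prox_l1`, `proximal_gradient`), the random test problem, and the specific step size are illustrative assumptions, not code from the source.

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t*||.||_1: componentwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iter=500):
    # Forward-backward splitting for 0.5*||Ax - b||^2 + lam*||x||_1:
    # a gradient (forward) step on the smooth term, then a prox
    # (backward) step on the nonsmooth term.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of smooth part
        x = prox_l1(x - step * grad, step * lam)  # prox of nonsmooth part
    return x

# Illustrative synthetic problem (assumed, not from the source).
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A^T A||_2
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

The step size 1/L, where L is the Lipschitz constant of the smooth gradient, is the standard convergence-guaranteeing choice for this splitting.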
Co-authors: Ernest K. Ryu and Yanli Li.
BIO: Wotao Yin
Affiliation: University of California, Los Angeles