TITLE: On the Equivalence of Simulated Annealing & Interior Point Path Following for Optimization
ABSTRACT:
A well-studied deterministic algorithmic technique for convex optimization is the class of so-called Interior Point Methods of Nesterov and Nemirovski, which take a sequence of Newton steps along the "central path" towards the optimum. An alternative randomized method, known as simulated annealing, performs a random walk over the feasible set while "cooling" the stationary distribution towards the optimum. We will show that these two methods are, in a certain sense, fully equivalent: both techniques can be viewed as different types of path following. This equivalence yields an improved, state-of-the-art rate for simulated annealing, and provides a new understanding of random-walk methods via barrier functions.
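For readers unfamiliar with the randomized method mentioned above, the following is a minimal, generic sketch of simulated annealing (not the specific algorithm analyzed in the talk): a Metropolis random walk whose stationary distribution is proportional to exp(-f(x)/T), with the temperature T "cooled" over time so that the distribution concentrates near the minimizer. The cooling schedule, step size, and objective here are illustrative choices, not from the source.

```python
import math
import random

def simulated_annealing(f, x0, n_steps=5000, t_start=1.0, t_end=1e-3, step=0.1):
    """Minimize f by a Metropolis random walk with stationary distribution
    proportional to exp(-f(x)/T), while cooling T geometrically."""
    x = list(x0)
    for i in range(n_steps):
        # Geometric cooling schedule from t_start down to t_end (illustrative).
        t = t_start * (t_end / t_start) ** (i / (n_steps - 1))
        # Propose a Gaussian perturbation of the current point.
        y = [xi + random.gauss(0.0, step) for xi in x]
        # Metropolis rule: always accept downhill moves; accept uphill
        # moves with probability exp(-(f(y) - f(x)) / T).
        delta = f(y) - f(x)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y
    return x

# Example: minimize a simple convex quadratic over R^2.
f = lambda x: sum(xi * xi for xi in x)
random.seed(0)
x_min = simulated_annealing(f, [2.0, -2.0])
```

As the temperature falls, uphill moves are accepted less often and the walk settles near the optimum; the talk's point is that this cooling trajectory can be matched to the central path followed by interior point methods.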
BIO: Jacob Abernethy is an Assistant Professor of Computer Science at Georgia Tech. He started his faculty career in the Department of Electrical Engineering and Computer Science at the University of Michigan. In October 2011 he finished his PhD in the Division of Computer Science at the University of California, Berkeley, and then spent nearly two years as a Simons postdoctoral fellow in the CIS department at UPenn, working with Michael Kearns. Abernethy's primary interest is in Machine Learning, with a particular focus on sequential decision making, online learning, online algorithms, and adversarial learning models. He did his Master's degree at TTI-C and his Bachelor's degree at MIT. His PhD advisor was Prof. Peter Bartlett.