TITLE: LP, SOCP, and optimization-free approaches to sum of squares optimization

 

ABSTRACT:

The problem of optimizing over the cone of nonnegative polynomials is fundamental in computational mathematics, with applications to polynomial optimization, control, machine learning, game theory, and combinatorics, among others. A number of breakthrough papers in the early 2000s showed that this problem, long thought to be intractable, could be solved using sum of squares programming. This technique, however, has proved expensive for large-scale problems, as it requires solving large semidefinite programs (SDPs).
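For context, the standard reduction behind this statement (a well-known fact, not specific to this talk) is that a polynomial p of degree 2d is a sum of squares exactly when it admits a Gram matrix representation with a positive semidefinite matrix Q, which is a semidefinite feasibility problem:

    \[
    p(x) \;=\; z(x)^{\top} Q \, z(x) \quad \text{for some } Q \succeq 0,
    \qquad z(x) = \bigl(1,\, x_1,\, \ldots,\, x_n,\, x_1^2,\, x_1 x_2,\, \ldots,\, x_n^{d}\bigr)^{\top},
    \]

where z(x) collects all monomials of degree at most d. The matrix Q has one row and column per such monomial, which is why the resulting SDPs become large quickly as the number of variables or the degree grows.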


In the first part of this talk, we present two methods for approximately solving large-scale sum of squares programs that dispense with semidefinite programming altogether and only involve solving a sequence of linear or second-order cone programs generated in an adaptive fashion. In the second part of the talk, we focus on the problem of finding tight lower bounds on polynomial optimization problems (POPs), a fundamental task in this area that is most commonly handled through SDP-based sum of squares hierarchies (e.g., those due to Lasserre and Parrilo). In contrast to previous approaches, we provide the first theoretical framework for efficiently constructing converging hierarchies of lower bounds on POPs whose computation requires no optimization at all, only the ability to multiply certain fixed polynomials together and check nonnegativity of the coefficients of the product.
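To give a concrete, deliberately simplified flavor of the "multiply and check coefficients" idea, the sketch below implements a Polya-type test: multiply a homogeneous polynomial by a power of (x_1 + ... + x_n) and verify that every coefficient of the expanded product is nonnegative. The choice of multiplier, the power r, and the example polynomial are illustrative assumptions; this is not the specific hierarchy presented in the talk.

    # Minimal sketch of a Polya-type, optimization-free positivity check:
    # multiply p by (x1 + ... + xn)**r and test whether every coefficient
    # of the expanded product is nonnegative. No optimization is solved;
    # only polynomial multiplication and coefficient inspection are used.
    import sympy as sp

    def polya_certificate(p, variables, r):
        """True if (x1 + ... + xn)**r * p has only nonnegative coefficients."""
        multiplier = sum(variables) ** r
        product = sp.expand(multiplier * p)
        return all(c >= 0 for c in sp.Poly(product, *variables).coeffs())

    if __name__ == "__main__":
        x, y = sp.symbols("x y")
        # p = x^2 - x*y + y^2 is positive on the nonnegative orthant (away
        # from the origin) but has a negative coefficient; multiplying by
        # (x + y) already clears it, since (x + y) * p = x^3 + y^3.
        p = x**2 - x*y + y**2
        for r in range(4):
            print(r, polya_certificate(p, [x, y], r))

Running the script prints False for r = 0 and True for r >= 1, illustrating how increasing the power of the fixed multiplier can certify positivity by coefficient inspection alone.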

 

BIO: Georgina Hall is a final-year PhD student and a Gordon Y. S. Wu Fellow in the Department of Operations Research and Financial Engineering at Princeton University. Her advisor is Prof. Amir Ali Ahmadi. She received her Bachelor of Science from Ecole Centrale Paris, France, in 2011 and her Master of Science from the same university in 2013.