Statistical problems and convex relaxations: Dreams of a general theory

Tselil Schramm

(Harvard University)



Date: February 19, 2019

Description:

In statistics, the difficulty of algorithmic problems depends on properties of the data: when the data contains less information, the problem becomes harder to solve. In many statistical settings, there is an apparent gap between what can be computed information-theoretically and what can be computed efficiently. Characterizing this relationship between information and computation is a central question in statistical learning, cryptography, and beyond. But the traditional theory of computing is designed to study worst-case inputs, which may not resemble data, and so it is not well-suited to the statistical setting.

In this talk, I will describe my work towards building a theory of computation for statistical models using convex relaxations (and particularly the powerful sum-of-squares algorithm). I’ll talk about my efforts to understand the tradeoff between computation and statistical information, to give fast (often linear- or almost-linear-time) algorithms based on slower semidefinite programs, and to draw a broad connection between convex programs and spectral algorithms for statistical problems.
