School of Mathematics

Boosting Simple Learners

Shay Moran
Google
May 5, 2020
We study boosting algorithms under the assumption that the given weak learner outputs hypotheses from a class of bounded capacity. This assumption is inspired by the common convention that weak hypotheses are “rules-of-thumb” from an “easy-to-learn class”. Formally, we assume the class of weak hypotheses has bounded VC dimension.

We focus on two main questions:
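
To make the setting concrete, here is a minimal illustrative sketch (not from the talk) of a standard boosting loop (AdaBoost) whose weak learner is restricted to decision stumps, a hypothesis class of small VC dimension; the synthetic data and round count are placeholder choices.

```python
import numpy as np

def stump_weak_learner(X, y, w):
    """Return the best decision stump (feature, threshold, sign) under weights w.
    Stumps form a class of small VC dimension, matching the 'simple weak learner' setting."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (+1, -1):
                pred = s * np.sign(X[:, j] - t + 1e-12)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, t, s)
    return best, best_err

def adaboost(X, y, rounds=20):
    """Vanilla AdaBoost driving the stump weak learner; labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        (j, t, s), err = stump_weak_learner(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.sign(X[:, j] - t + 1e-12)
        w *= np.exp(-alpha * y * pred)   # reweight toward misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.sign(X[:, j] - t + 1e-12) for a, j, t, s in ensemble)
    return np.sign(score)

# Tiny synthetic example: labels given by the sign of the first feature.
X = np.random.default_rng(0).standard_normal((200, 5))
y = np.sign(X[:, 0])
ens = adaboost(X, y, rounds=10)
print((predict(ens, X) == y).mean())   # training accuracy
```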

Local Statistics, Semidefinite Programming, and Community Detection

Prasad Raghavendra
University of California, Berkeley
May 4, 2020
We propose a new hierarchy of semidefinite programming relaxations for inference problems. As test cases, we consider the problem of community detection in block models. The vertices are partitioned into k communities, and a graph is sampled conditional on a prescribed number of inter- and intra-community edges.
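
For concreteness, the sketch below samples from the standard (unconditioned) stochastic block model; the talk's setup instead conditions on prescribed numbers of intra- and inter-community edges, and the parameters n, k, p_in, p_out here are illustrative.

```python
import numpy as np

def sample_sbm(n, k, p_in, p_out, rng=None):
    """Sample a k-community stochastic block model on n vertices.

    Each vertex gets a uniformly random community label; an edge appears
    with probability p_in inside a community and p_out across communities."""
    rng = np.random.default_rng() if rng is None else rng
    labels = rng.integers(0, k, size=n)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                A[i, j] = A[j, i] = 1
    return A, labels

# Example: a sparse two-community instance with average degree about 5.
A, labels = sample_sbm(n=500, k=2, p_in=8 / 500, p_out=2 / 500)
```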

Latent Stochastic Differential Equations for Irregularly-Sampled Time Series

David Duvenaud
University of Toronto
April 30, 2020
Much real-world data is sampled at irregular intervals, but most time series models require regularly-sampled data. Continuous-time models address this problem, but until now only deterministic (ODE) models or linear-Gaussian models were efficiently trainable with millions of parameters. We construct a scalable algorithm for computing gradients of samples from stochastic differential equations (SDEs), and for gradient-based stochastic variational inference in function space, all with the use of adaptive black-box SDE solvers.
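
As a minimal illustration of differentiating through an SDE simulation, the sketch below runs Euler-Maruyama in PyTorch and backpropagates through the unrolled solver; the drift and diffusion networks are placeholders, and this naive approach stores the whole trajectory, unlike the scalable approach with adaptive black-box solvers described above.

```python
import torch

# Placeholder drift and diffusion networks for a 1-D latent SDE:
# dz = f(z) dt + g(z) dW  (time-homogeneous for simplicity).
f = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))
g = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def sample_path(z0, t0=0.0, t1=1.0, steps=100):
    """Euler-Maruyama simulation; gradients flow to f, g, and z0 by backprop."""
    dt = (t1 - t0) / steps
    z = z0
    for _ in range(steps):
        dw = torch.randn_like(z) * dt ** 0.5
        z = z + f(z) * dt + g(z) * dw
    return z

z0 = torch.zeros(64, 1, requires_grad=True)   # batch of initial latents
target = torch.ones(64, 1)                    # illustrative observation
loss = ((sample_path(z0) - target) ** 2).mean()
loss.backward()                               # gradients w.r.t. z0 and network weights
```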

Eulerianity of Fourier coefficients of automorphic forms

Henrik Gustafsson
Member, School of Mathematics
April 30, 2020
The factorization of Fourier coefficients of automorphic forms plays an important role in a wide range of topics, from the study of L-functions to the interpretation of scattering amplitudes in string theory. In this talk I will present a transfer theorem which derives the Eulerianity of certain Fourier coefficients from that of another coefficient. I will also discuss some applications of this theorem to Fourier coefficients of automorphic forms in minimal and next-to-minimal representations.

A Framework for Quadratic Form Maximization over Convex Sets

Vijay Bhattiprolu
Member, School of Mathematics
April 28, 2020
We investigate the approximability of the following optimization problem, whose input is an n-by-n matrix A and an origin-symmetric convex set C given by a membership oracle: "Maximize the quadratic form ⟨x, Ax⟩ as x ranges over C."

This is a rich and expressive family of optimization problems; for different choices of the form A and convex body C it includes a diverse range of interesting combinatorial and continuous optimization problems. To name some examples, max-cut, Grothendieck's inequality, the
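
As one concrete instance of the problem (not necessarily among those covered in the talk): when C is the Euclidean unit ball, the maximum of ⟨x, Ax⟩ over C is the largest eigenvalue of the symmetrized matrix, which the short sketch below checks numerically.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
S = (A + A.T) / 2                        # only the symmetric part matters for x^T A x

# Exact answer for C = unit Euclidean ball: the top eigenvalue of S,
# attained at the corresponding unit eigenvector.
eigvals, eigvecs = np.linalg.eigh(S)
x_star = eigvecs[:, -1]
print(eigvals[-1], x_star @ A @ x_star)  # the two values agree up to rounding

# For other origin-symmetric bodies C (e.g. the hypercube, which yields
# max-cut-type problems) the maximization is NP-hard in general and only
# membership in C is assumed.
```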

Ellipses of small eccentricity are determined by their Dirichlet (or, Neumann) spectra

Steven Morris Zelditch
Northwestern University
April 28, 2020
In 1965, M. Kac proved that disks are uniquely determined by their Dirichlet (or, Neumann) spectra. Until recently, disks were the only smooth plane domains known to be determined by their eigenvalues. Recently, H. Hezari and I proved that ellipses of small eccentricity are also uniquely determined by their Dirichlet (or, Neumann) spectra. The proof uses recent results of Avila, De Simoi, and Kaloshin, which show that nearly circular plane domains with rationally integrable billiards must be ellipses.

Graph and Hypergraph Sparsification

Luca Trevisan
Bocconi University
April 27, 2020
A weighted graph H is a sparsifier of a graph G if H has far fewer edges than G and, in an appropriate technical sense, H "approximates" G. Sparsifiers are useful as compressed representations of graphs and for speeding up certain graph algorithms. In a "cut sparsifier," the notion of approximation is that every cut is crossed by approximately the same total edge weight in H as in G. In a "spectral sparsifier," a stronger, linear-algebraic notion of approximation holds. Similar definitions can be given for hypergraphs.
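
To illustrate the cut-sparsifier definition, the sketch below uses the simplest conceivable construction: keep each edge independently with probability p and reweight by 1/p, then compare a cut's weight in G and H. The graph, p, and test cut are placeholders, and uniform sampling is only adequate for dense, well-connected graphs; stronger constructions (e.g., importance sampling by effective resistance) are needed in general.

```python
import numpy as np

def uniform_sparsify(A, p, rng):
    """Keep each edge of the symmetric 0/1 adjacency matrix A w.p. p, with weight 1/p."""
    n = A.shape[0]
    H = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(i + 1, n):
            if A[i, j] and rng.random() < p:
                H[i, j] = H[j, i] = 1.0 / p
    return H

def cut_weight(W, S):
    """Total weight of edges crossing the cut (S, complement of S)."""
    mask = np.zeros(W.shape[0], dtype=bool)
    mask[list(S)] = True
    return W[np.ix_(mask, ~mask)].sum()

rng = np.random.default_rng(1)
n = 200
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, 1)
A = A + A.T                                      # dense random graph
H = uniform_sparsify(A, p=0.2, rng=rng)

S = rng.choice(n, size=n // 2, replace=False)    # a random test cut
print(cut_weight(A, S), cut_weight(H, S))        # roughly equal for dense G
```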

The Geography of Immersed Lagrangian Fillings of Legendrian Submanifolds

Lisa Traynor
Bryn Mawr College
April 24, 2020
Given a smooth knot K in the 3-sphere, a classic question in knot theory is: What surfaces in the 4-ball have boundary equal to K? One can also consider immersed surfaces and ask a “geography” question: What combinations of genus and double points can be realized by surfaces with boundary equal to K? I will discuss symplectic analogues of these questions: Given a Legendrian knot, what Lagrangian surfaces can it bound? What immersed Lagrangian surfaces can it bound?

Deep Generative models and Inverse Problems

Alexandros Dimakis
University of Texas at Austin
April 23, 2020
Modern deep generative models such as GANs, VAEs, and invertible flows have shown impressive results in modeling high-dimensional distributions, especially of images. We will show how they can be used to solve inverse problems by generalizing compressed sensing beyond sparsity. We will present the general framework, new results, and open problems in this space.
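
A minimal sketch of recovering a signal from few linear measurements using a generative prior, in the spirit of generalizing compressed sensing beyond sparsity, is shown below; the generator here is an untrained placeholder standing in for a pretrained model, and the dimensions, measurement matrix, and optimizer settings are illustrative.

```python
import torch

latent_dim, signal_dim, m = 20, 400, 100          # m << signal_dim measurements

# Placeholder "pretrained" generator G: latent z -> signal of dimension signal_dim.
G = torch.nn.Sequential(
    torch.nn.Linear(latent_dim, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, signal_dim),
)
for p in G.parameters():
    p.requires_grad_(False)                       # the prior is fixed; only z is optimized

A = torch.randn(m, signal_dim) / m ** 0.5         # random Gaussian measurement matrix
x_true = G(torch.randn(latent_dim))               # a signal in the range of G
y = A @ x_true                                    # observed measurements

# Recover x by minimizing ||A G(z) - y||^2 over the latent z.
z = torch.zeros(latent_dim, requires_grad=True)
opt = torch.optim.Adam([z], lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = ((A @ G(z) - y) ** 2).sum()
    loss.backward()
    opt.step()

x_hat = G(z).detach()
print(((x_hat - x_true) ** 2).mean())             # reconstruction error
```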