Recently Added

A Framework for Quadratic Form Maximization over Convex Sets

Vijay Bhattiprolu
Member, School of Mathematics
April 28, 2020
We investigate the approximability of the following optimization problem, whose input is an
n-by-n matrix A and an origin-symmetric convex set C that is given by a membership oracle:
"Maximize the quadratic form ⟨Ax, x⟩ as x ranges over C."

This is a rich and expressive family of optimization problems; for different choices of forms A
and convex bodies C it includes a diverse range of interesting combinatorial and continuous
optimization problems. To name some examples, max-cut, Grothendieck's inequality, the …
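
As a concrete, if tiny, illustration (my own sketch, not anything from the talk): take C = [-1, 1]^n, the hypercube. For a positive semidefinite A, the maximum of ⟨Ax, x⟩ over a convex body is attained at an extreme point, so over the hypercube it sits at a vertex of {-1, +1}^n, which is exactly the max-cut-style regime; for small n we can simply brute-force the vertices.

import itertools
import numpy as np

def max_quadratic_over_hypercube(A):
    # Enumerate all 2^n sign vectors; feasible only for small n.
    n = A.shape[0]
    best_val, best_x = -np.inf, None
    for signs in itertools.product([-1.0, 1.0], repeat=n):
        x = np.array(signs)
        val = x @ A @ x
        if val > best_val:
            best_val, best_x = val, x
    return best_val, best_x

rng = np.random.default_rng(0)
B = rng.standard_normal((5, 5))
A = B @ B.T                      # PSD instance, so the maximum sits at a vertex
print(max_quadratic_over_hypercube(A))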

Twisted M-theory and Holography

Davide Gaiotto
Perimeter Institute
April 27, 2020
A few years ago, K. Costello proposed how to isolate a self-contained, protected subsector of the M2 brane AdS4/CFT3 correspondence and demonstrated the holographic duality explicitly for the OPE of the corresponding boundary local operators. The construction can be naturally extended to protected correlation functions, but several new challenges arise. I will discuss how to take the large N limit on the CFT3 side of the story.

Graph and Hypergraph Sparsification

Luca Trevisan
Bocconi University
April 27, 2020
A weighted graph H is a sparsifier of a graph G if H has far fewer edges than G and, in an appropriate technical sense, H "approximates" G. Sparsifiers are useful as compressed representations of graphs and to speed up certain graph algorithms. In a "cut sparsifier," the notion of approximation is that every cut is crossed by approximately the same number of edges in G as in H. In a "spectral sparsifier" a stronger, linear-algebraic, notion of approximation holds. Similar definitions can be given for hypergraphs.
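
As a concrete sketch of the spectral case, here is effective-resistance sampling in the style of Spielman and Srivastava (my simplification, not code from the talk): keep each edge independently with probability proportional to its weight times its effective resistance, and reweight kept edges so the sparsifier's Laplacian is unbiased.

import numpy as np

def sparsify(n, edges, weights, eps=0.5, seed=0):
    rng = np.random.default_rng(seed)
    L = np.zeros((n, n))                       # weighted graph Laplacian
    for (u, v), w in zip(edges, weights):
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    Lp = np.linalg.pinv(L)                     # pseudo-inverse of the Laplacian
    c = 4.0 * np.log(n) / eps ** 2             # oversampling constant (heuristic)
    kept = []
    for (u, v), w in zip(edges, weights):
        r = Lp[u, u] + Lp[v, v] - 2 * Lp[u, v] # effective resistance of (u, v)
        p = min(1.0, c * w * r)                # keep probability ~ weight * R_eff
        if rng.random() < p:
            kept.append(((u, v), w / p))       # reweight so the sparsifier is unbiased
    return kept

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
print(sparsify(4, edges, [1.0] * 4))

On a toy graph this small, the sampling probabilities all clip at 1 and everything is kept; the reweighting only matters at scale, where c * w * r falls below 1 for most edges.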

The Geography of Immersed Lagrangian Fillings of Legendrian Submanifolds

Lisa Traynor
April 24, 2020
Given a smooth knot K in the 3-sphere, a classic question in knot theory is: What surfaces in the 4-ball have boundary equal to K? One can also consider immersed surfaces and ask a “geography” question: What combinations of genus and double points can be realized by surfaces with boundary equal to K? I will discuss symplectic analogues of these questions: Given a Legendrian knot, what Lagrangian surfaces can it bound? What immersed Lagrangian surfaces can it bound?

Deep Generative Models and Inverse Problems

Alexandros Dimakis
University of Texas at Austin
April 23, 2020
Modern deep generative models like GANs, VAEs, and invertible flows are showing amazing results in modeling high-dimensional distributions, especially for images. We will show how they can be used to solve inverse problems by generalizing compressed sensing beyond sparsity. We will present the general framework, new results, and open problems in this space.
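
A minimal sketch of the underlying idea, the compressed-sensing-with-generative-models (CSGM) formulation of Bora et al. that this line of work builds on; the toy linear "generator" below is my simplification so the gradient is closed-form, not code from the talk. Given measurements y = A x* with m << n, one recovers x* by minimizing ||A G(z) - y||^2 over the latent z.

import numpy as np

rng = np.random.default_rng(1)
n, m, k = 100, 30, 5                    # signal dim, measurements, latent dim
W = rng.standard_normal((n, k))         # toy linear generator: G(z) = W z
x_true = W @ rng.standard_normal(k)     # signal lying in the generator's range
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true                          # m << n noiseless measurements

M = A @ W                               # composed forward map
step = 1.0 / (2 * np.linalg.norm(M, 2) ** 2)  # safe step from the Lipschitz constant
z = np.zeros(k)
for _ in range(1000):                   # gradient descent on ||M z - y||^2
    z -= step * 2 * M.T @ (M @ z - y)
print(np.linalg.norm(W @ z - x_true))   # near-zero recovery error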

Geodesically Convex Optimization (or, can we prove P!=NP using gradient descent?)

Avi Wigderson
Herbert H. Maass Professor, School of Mathematics
April 21, 2020
This talk aims to summarize a project I was involved in during the past 5 years, with the hope of explaining our most complete understanding so far, as well as challenges and open problems. The main messages of this project are summarized below; I plan to describe, through examples, many of the concepts they refer to, and the evolution of ideas leading to them. No special background is assumed.
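
To make "geodesically convex" concrete, a one-variable example of my own (not from the talk): f(x) = (log x)^2 on the positive reals is not convex in the Euclidean sense (its second derivative is negative for x > e), but along the geodesics x(t) = x0 * exp(t*v) of the log metric it becomes the convex function u^2 in u = log x, and Riemannian gradient descent then behaves exactly as ordinary gradient descent does on a convex problem.

import math

x = 50.0                            # start far from the minimizer x* = 1
for _ in range(100):
    u = math.log(x)                 # geodesic coordinate u = log x
    x = math.exp(u - 0.1 * (2 * u)) # gradient step on f(u) = u^2, mapped back
print(x)                            # converges to 1.0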

Interpretability for Everyone

Been Kim
April 16, 2020
In this talk, I would like to share some of my reflections on the progress made in the field of interpretable machine learning. We will reflect on where we are going as a field and what we need to be aware of to make progress. With that perspective, I will then discuss some of my work on 1) sanity-checking popular methods and 2) developing more layperson-friendly interpretability methods. I will also share some open theoretical questions that may help us move forward.
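
For the first thread, a toy sketch of the model-randomization sanity check from "Sanity Checks for Saliency Maps" (Adebayo et al., co-authored by Kim); the linear model below is my stand-in, not the paper's setup. A faithful saliency method should produce very different explanations once the model's weights are randomized, which we can quantify with a rank correlation near zero.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

def saliency(w):
    # Input-gradient saliency for a linear logit f(x) = w . x: the gradient
    # with respect to the input is w itself, so the saliency map is |w|.
    return np.abs(w)

w_trained = rng.standard_normal(200)   # stand-in for trained weights
w_random = rng.standard_normal(200)    # same architecture, weights randomized

rho, _ = spearmanr(saliency(w_trained), saliency(w_random))
print(f"rank correlation after randomization: {rho:.3f}")  # near 0, as it should be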

Steps Towards More Human-Like Learning in Machines

Josh Tenenbaum
April 16, 2020
There are several broad insights we can draw from computational models of human cognition in order to build more human-like forms of machine learning. (1) The brain has a great deal of built-in structure, yet still tremendous need and potential for learning. Instead of seeing built-in structure and learning as in tension, we should be thinking about how to learn effectively with more and richer forms of structure. (2) The most powerful forms of human knowledge are symbolic and often causal and probabilistic.