The Non-Stochastic Control Problem

Elad Hazan
Princeton University
May 18, 2020
Linear dynamical systems are a continuous subclass of reinforcement learning models that are widely used in robotics, finance, engineering, and meteorology. Classical control, since the work of Kalman, has focused on dynamics with Gaussian i.i.d. noise, quadratic loss functions and, in terms of provably efficient algorithms, known systems and observed state. We'll discuss how to apply new machine learning methods which relax all of the above: efficient control with adversarial noise, general loss functions, unknown systems, and partial observation.
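The setting can be illustrated with a tiny simulation. This is a minimal sketch under assumed constants, not Hazan's algorithm: a scalar linear dynamical system driven by an adversarial (worst-case sign) disturbance rather than Gaussian i.i.d. noise, controlled by a fixed linear feedback law and measured by the classical quadratic cost.

```python
import numpy as np

# A minimal sketch (illustrative constants, not the talk's method):
# scalar system x_{t+1} = a*x_t + b*u_t + w_t, where the disturbance w_t
# is chosen adversarially (here: it always pushes the state away from 0)
# instead of being Gaussian i.i.d.

a, b = 0.9, 1.0           # assumed known system parameters
k = 0.5                   # a fixed linear feedback gain, u_t = -k * x_t
T = 50

x, total_cost = 0.0, 0.0
for t in range(T):
    u = -k * x                        # linear state feedback
    w = 0.1 * (1 if x >= 0 else -1)  # adversarial noise, not stochastic
    total_cost += x**2 + u**2        # quadratic cost, as in classical LQR
    x = a * x + b * u + w
```

Under this disturbance the closed-loop state settles near a nonzero offset instead of averaging out, which is exactly why guarantees proved for Gaussian noise do not transfer directly to the adversarial setting.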

Reflections on Cylindrical Contact Homology

Jo Nelson
Rice University
May 15, 2020
This talk begins with a light introduction, including some historical anecdotes to motivate the development of this Floer-theoretic machinery for contact manifolds some 25 years ago. I will discuss joint work with Hutchings which constructs a nonequivariant and a family Floer equivariant version of contact homology. Both theories are generated by two copies of each Reeb orbit over Z and capture interesting torsion information.

Entanglement Entropy in Flat Holography

Wei Song
Member, School of Natural Sciences, Institute for Advanced Study; Tsinghua University
May 15, 2020
The appearance of BMS symmetry as the asymptotic symmetry of Minkowski spacetime suggests a holographic relation between Einstein gravity and quantum field theory with BMS invariance, dubbed BMSFT. With a three dimensional bulk, the dual BMSFT is a non-Lorentz invariant, two dimensional field theory with infinite-dimensional symmetries. In this talk, I will argue that entanglement entropy in BMSFT can be described by a swing surface in the bulk.

Will I Have to Mortgage My House? Reflections on Gene Therapy, Innovation, and Inequality

Eben Kirksey
Friends of the Institute Member, School of Social Science
May 15, 2020
The first FDA-approved gene therapy, Kymriah, was released to the public in August 2017 with a $475,000 price tag. With the emergence of personalized genetic medicine, we are entering a new era of profound inequality. This talk explores the stories of children and parents who signed up for the Kymriah clinical trial before it was approved, risking their lives and household finances in pursuit of a cancer cure. Issues of race and class played out at Penn Medicine, as researchers explored new horizons of hope with living cellular therapies.

MathZero, The Classification Problem, and Set-Theoretic Type Theory

David McAllester
Toyota Technological Institute at Chicago
May 14, 2020
AlphaZero learns to play Go, chess, and shogi at a superhuman level through self-play, given only the rules of the game. This raises the question of whether a similar thing could be done for mathematics: a MathZero. MathZero would require a formal foundation and an objective. We propose the foundation of set-theoretic dependent type theory and an objective defined in terms of the classification problem, i.e., the problem of classifying concept instances up to isomorphism. Isomorphism is central to the structure of mathematics.

Convex Set Disjointness, Distributed Learning of Halfspaces, and Linear Programming

Shay Moran
Member, School of Mathematics
May 12, 2020
Distributed learning protocols are designed to train on distributed data without gathering it all on a single centralized machine, thus contributing to the efficiency of the system and enhancing its privacy. We study a central problem in distributed learning, called Distributed Learning of Halfspaces: let U \subset R^d be a known domain of size n and let h: R^d -> R be an unknown target affine function. A set of examples {(u,b)} is distributed between several parties, where u \in U is a point and b = sign(h(u)) \in {-1, +1} is its label.
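The problem setup can be sketched in a few lines. This is an illustration of the input model only (names, sizes, and the index split are assumptions, not the communication protocol studied in the talk): a known finite domain U, an unknown affine target h, and examples (u, sign(h(u))) divided between parties.

```python
import numpy as np

# A minimal sketch of the problem setup (illustrative names and sizes,
# not the distributed protocol itself).

rng = np.random.default_rng(0)
d, n, parties = 3, 12, 2

U = rng.normal(size=(n, d))            # known domain of n points in R^d
w_true, b0 = rng.normal(size=d), 0.5   # unknown affine target h(u) = <w, u> + b0
labels = np.sign(U @ w_true + b0)      # b = sign(h(u)) in {-1, +1}

# each party holds a subset of the labeled examples (here: a simple index split)
shards = np.array_split(np.arange(n), parties)
party_data = [(U[idx], labels[idx]) for idx in shards]
```

The distributed-learning question is then how much communication the parties need to jointly recover a halfspace consistent with all the shards, without pooling the raw examples.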

Quantitative decompositions of Lipschitz mappings

Guy C. David
Ball State University
May 12, 2020
Given a Lipschitz map, it is often useful to chop the domain into pieces on which the map has simple behavior. For example, depending on the dimensions of source and target, one may ask for pieces on which the map behaves like a bi-Lipschitz embedding or like a linear projection. For many issues, it is even more useful if this decomposition is quantitative, i.e., with bounds independent of the particular map or spaces involved.

Generative Modeling by Estimating Gradients of the Data Distribution

Stefano Ermon
Stanford University
May 12, 2020
Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework allows flexible energy-based model architectures and requires neither sampling during training nor adversarial training methods.
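The core idea, that a score field suffices for sampling, can be shown in a toy setting. This is a minimal sketch, not the paper's architecture: for a 1-D Gaussian "data distribution" N(mu, sigma^2) the score grad_x log p(x) = -(x - mu) / sigma^2 is known in closed form, so we can run Langevin dynamics on it directly; in the actual method a neural network estimates this score from data.

```python
import numpy as np

# A minimal sketch: Langevin dynamics turns a score function into a sampler.
# Here the score of the target N(mu, sigma^2) is known analytically; in
# score-based generative modeling it would be learned from samples.

mu, sigma = 2.0, 1.0
score = lambda x: -(x - mu) / sigma**2   # grad_x log p(x) for N(mu, sigma^2)

rng = np.random.default_rng(0)
eps, steps = 0.1, 1000
x = rng.normal(size=5000)                # initialize far from the target

for _ in range(steps):
    # Langevin update: drift along the score plus injected Gaussian noise
    x = x + 0.5 * eps * score(x) + np.sqrt(eps) * rng.normal(size=x.shape)
```

After enough steps the samples concentrate around mu, approximating the target distribution, which is why an accurate score estimate is all the sampler needs.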