Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

Guang Cheng
Purdue University; Member, School of Mathematics
November 13, 2019

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime,” while the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.

Unitary, Symplectic, and Orthogonal Moments of Moments

Emma Bailey
University of Bristol
November 15, 2019

The study of random matrix moments of moments has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading order of these functions for integer moment parameters by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.
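For orientation, a sketch of the standard definition in the unitary case (my recollection of the usual setup, not text from the talk): writing $\Lambda_A(\theta) = \det\!\left(I - A e^{-i\theta}\right)$ for the characteristic polynomial of a Haar-distributed $A \in U(N)$, the moments of moments are

```latex
\mathrm{MoM}_N(k,\beta)
  \;=\;
  \mathbb{E}_{A \in U(N)}
  \left[
    \left(
      \frac{1}{2\pi} \int_0^{2\pi}
        \big| \Lambda_A(\theta) \big|^{2\beta} \, d\theta
    \right)^{\!k}
  \right],
```

i.e. the $k$-th moment (over the matrix ensemble) of the $2\beta$-th moment (over the circle) of the characteristic polynomial. For integer $k$ and $\beta$ this quantity is, as I recall, polynomial in $N$ with leading order $N^{k^2\beta^2 - k + 1}$; the symplectic and orthogonal cases of the title replace $U(N)$ by the corresponding compact groups.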

Deforming Holomorphic Chern-Simons at Large N

Si Li
Member, School of Natural Sciences, IAS; Tsinghua University
November 15, 2019

We describe the coupling of holomorphic Chern-Simons theory at large N with Kodaira-Spencer gravity. This gives a complete description of open-closed string field theory in the topological B-model. We explain an anomaly cancellation mechanism at all loops in perturbation theory in this model. At one loop this anomaly cancellation is analogous to the Green-Schwarz mechanism. This is joint work with Kevin Costello.