This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the "over-fitting regime," while the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.
The study of moments of moments of random matrices has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading order of these functions for integer moment parameters by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.
We describe the coupling of holomorphic Chern-Simons theory at large N with Kodaira-Spencer gravity, giving a complete description of open-closed string field theory in the topological B-model. We explain an anomaly cancellation mechanism that holds at all loops in perturbation theory in this model; at one loop, this anomaly cancellation is analogous to the Green-Schwarz mechanism. This is joint work with Kevin Costello.