## CHIME: The Canadian Hydrogen Intensity Mapping Experiment

## Fast IRLS Algorithms for p-norm regression
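No abstract accompanies this title, but for orientation, the classical iteratively reweighted least squares (IRLS) iteration for p-norm regression replaces the p-norm objective with a sequence of weighted least-squares solves using weights $w_i = |r_i|^{p-2}$. A minimal sketch (the function name `irls_pnorm` and the parameter choices are illustrative, not from the talk) could look like:

```python
import numpy as np

def irls_pnorm(A, b, p=1.5, iters=50, eps=1e-8):
    """Classical IRLS for min_x ||Ax - b||_p, sketched here for 1 < p < 2.

    Each step solves a weighted least-squares problem with
    weights w_i = |r_i|^{p-2} computed from the current residual r.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # warm start: the p = 2 solution
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)  # eps guards against r_i = 0
        WA = A * w[:, None]                        # rows of A scaled by weights
        # Solve the weighted normal equations A^T W A x = A^T W b
        x = np.linalg.solve(A.T @ WA, WA.T @ b)
    return x
```

This naive iteration converges slowly in general; the "fast" algorithms of the title presumably improve on it (e.g. with better step control or preconditioning), which the talk would address.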

## Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime,” while the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.

## Effective bounds for the least solutions of homogeneous quadratic Diophantine inequalities

## Unitary, Symplectic, and Orthogonal Moments of Moments

The study of random matrix moments of moments has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading order of these functions for integer moment parameters by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.

## Extreme eigenvalue distributions of sparse random graphs

## Deforming Holomorphic Chern-Simons at Large N

We describe the coupling of holomorphic Chern-Simons theory at large N to Kodaira-Spencer gravity. This gives a complete description of open-closed string field theory in the topological B-model. We explain an anomaly cancellation mechanism that works to all loops in perturbation theory in this model; at one loop, this anomaly cancellation is analogous to the Green-Schwarz mechanism. This is joint work with Kevin Costello.

## Edward T Cone Concert Series Post Concert Discussion