## The singular set in the fully nonlinear obstacle problem

For the obstacle problem involving a convex fully nonlinear elliptic operator, we show that the singular set of the free boundary stratifies. The top stratum is locally covered by a $C^{1,\alpha}$-manifold, and the lower strata are covered by $C^{1,\log^{\varepsilon}}$-manifolds. This essentially recovers the regularity result obtained by Figalli and Serra when the operator is the Laplacian.
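For context, the fully nonlinear obstacle problem is typically posed as follows (a standard formulation given here as background; the talk's exact conventions may differ). With $F$ convex and uniformly elliptic and $\varphi$ a smooth obstacle,
$$
\min\{\,F(D^2 u),\; u - \varphi\,\} = 0 \quad \text{in } B_1,
$$
the free boundary is $\partial\{u > \varphi\}$. Singular points are the free boundary points at which the contact set $\{u = \varphi\}$ has Lebesgue density zero, and the singular set is stratified by the dimension of the blow-up's zero set.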

## An isoperimetric inequality for the Hamming cube and some consequences

I will introduce an isoperimetric inequality for the Hamming cube and some of its applications. The applications include a “stability” version of Harper’s edge-isoperimetric inequality, which was first proved by Friedgut, Kalai and Naor for half cubes, and later by Ellis for subsets of any size. Our inequality also plays a key role in a recent result on the asymptotic number of maximal independent sets in the cube.
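For reference, Harper's edge-isoperimetric inequality in one standard formulation (stated here as background rather than the talk's exact setup) says that for any $A \subseteq \{0,1\}^n$,
$$
|\partial_e A| \;\ge\; |A| \log_2 \frac{2^n}{|A|},
$$
where $\partial_e A$ is the set of edges of the Hamming cube with exactly one endpoint in $A$. For half cubes, $|A| = 2^{n-1}$, this gives $|\partial_e A| \ge 2^{n-1}$, with equality for codimension-one subcubes; stability versions of the kind mentioned above show that sets with nearly minimal edge boundary must be structurally close to such extremizers.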

This is joint work with Jeff Kahn.

## TMF and SQFT

## Effective bounds for the least solutions of homogeneous quadratic Diophantine inequalities

## Unitary, Symplectic, and Orthogonal Moments of Moments

The study of random matrix moments of moments has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading-order asymptotics of these moments of moments for integer moment parameters, by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.
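As background, in one standard convention (a hedged sketch, not necessarily the talk's exact normalization), the unitary moments of moments are built from the characteristic polynomial $P_N(A,\theta) = \det(I - A e^{-i\theta})$ of a Haar-distributed $A \in U(N)$:
$$
\mathrm{MoM}_N(k,\beta) \;=\; \mathbb{E}\!\left[\left(\frac{1}{2\pi}\int_0^{2\pi} |P_N(A,\theta)|^{2\beta}\,d\theta\right)^{\!k}\right],
$$
with analogous quantities defined over the symplectic and orthogonal groups by averaging with respect to their Haar measures.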

## Extreme eigenvalue distributions of sparse random graphs

## Deforming Holomorphic Chern-Simons at Large N

We describe the coupling of holomorphic Chern-Simons theory at large N with Kodaira-Spencer gravity. This gives a complete description of open-closed string field theory in the topological B-model. We explain an anomaly cancellation mechanism at all loops in perturbation theory in this model. At one loop this anomaly cancellation is analogous to the Green-Schwarz mechanism. This is joint work with Kevin Costello.

## Edward T. Cone Concert Series Post-Concert Discussion

## Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime”; the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.
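To illustrate the general idea of sparsity-induced training (a minimal generic sketch using $\ell_1$-penalized proximal gradient descent on a linear model, not the speaker's proposed method; all names here are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, step=None, iters=500):
    """Proximal gradient descent (ISTA) for the Lasso objective
    (1/2n) * ||y - X w||^2 + lam * ||w||_1.
    The soft-thresholding step zeroes out small coordinates,
    which is what induces sparsity in the learned weights."""
    n, p = X.shape
    if step is None:
        # 1/L, where L = ||X||_2^2 / n is the gradient's Lipschitz constant.
        step = n / np.linalg.norm(X, 2) ** 2
    w = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ w - y) / n      # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * lam)
    return w
```

Trained on data generated from a sparse ground truth, the returned weight vector has most coordinates exactly zero, with the support concentrated on the truly active features; the same prox-step pattern carries over to sparsifying individual layers of a neural network.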