# School of Mathematics

## The singular set in the fully nonlinear obstacle problem

For the obstacle problem involving a convex fully nonlinear elliptic operator, we show that the singular set of the free boundary stratifies: the top stratum is locally covered by a $C^{1,\alpha}$ manifold, and the lower strata are covered by $C^{1,\log^{\epsilon}}$ manifolds. This essentially recovers the regularity result obtained by Figalli and Serra in the case where the operator is the Laplacian.
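For reference, one standard formulation of this setting (the talk's precise normalization is an assumption here):

```latex
% Standard formulation (assumed; the talk's normalization may differ):
% for a convex, uniformly elliptic fully nonlinear operator F and an
% obstacle \varphi, find u \ge \varphi solving
\min\bigl( F(D^2 u),\; u - \varphi \bigr) = 0 \quad \text{in } \Omega .
% The free boundary is \partial\{u > \varphi\}; singular points are the
% free boundary points at which the contact set \{u = \varphi\} has
% Lebesgue density zero.
```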

## An isoperimetric inequality for the Hamming cube and some consequences

I will introduce an isoperimetric inequality for the Hamming cube and some of its applications. These include a “stability” version of Harper’s edge-isoperimetric inequality, first proved by Friedgut, Kalai and Naor for half cubes and later extended by Ellis to subsets of any size. Our inequality also plays a key role in a recent result on the asymptotic number of maximal independent sets in the cube.

This is joint work with Jeff Kahn.
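As background (stated here in one common normalization, not necessarily the talk's exact statement): for $A \subseteq \{0,1\}^n$, writing $\partial_e A$ for the set of hypercube edges with exactly one endpoint in $A$, Harper's edge-isoperimetric inequality reads

```latex
|\partial_e A| \;\ge\; |A| \, \log_2 \frac{2^n}{|A|},
```

with equality when $A$ is a subcube. For half cubes ($|A| = 2^{n-1}$) the extremal sets are codimension-one subcubes, and stability versions describe sets whose edge boundary is close to this minimum.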

## Unitary, Symplectic, and Orthogonal Moments of Moments

The study of random matrix moments of moments has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading order of these moments for integer moment parameters by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.
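In the unitary case, the quantity in question can be written as follows (this normalization is an assumption; the symplectic and orthogonal cases are analogous with the corresponding Haar measure): for Haar-distributed $A \in U(N)$ with characteristic polynomial $P_N(A,\theta) = \det(I - A e^{-i\theta})$,

```latex
\mathrm{MoM}_{U(N)}(k,\beta)
  \;=\; \mathbb{E}_{A \in U(N)}
  \left[ \left( \frac{1}{2\pi} \int_0^{2\pi}
  \bigl| P_N(A,\theta) \bigr|^{2\beta} \, d\theta \right)^{\!k} \right],
```

and for integer $k, \beta \ge 1$ the leading order in the unitary case is of size $N^{k^2\beta^2 - k + 1}$.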

## Extreme eigenvalue distributions of sparse random graphs

## Effective bounds for the least solutions of homogeneous quadratic Diophantine inequalities

## Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime”; the second concerns the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.
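The abstract does not specify the training procedure. As a generic, minimal sketch of how an $\ell_1$ penalty drives parameters exactly to zero (the soft-thresholding mechanism underlying many sparsity-inducing training schemes), here is proximal gradient descent (ISTA) on a linear model; the per-layer thresholding idea carries over to neural network weights. All names and parameter values below are illustrative assumptions, not the talk's method.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1: shrink toward zero, zero out |z| <= t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(X, y, lam, step, iters=500):
    """Minimize 0.5*||Xw - y||^2 + lam*||w||_1 by proximal gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ w - y)          # gradient of the smooth part
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
n, d, s = 200, 50, 5                      # samples, features, true nonzeros
w_true = np.zeros(d)
w_true[:s] = rng.normal(size=s)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.01 * rng.normal(size=n)

step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1/L, L = spectral norm squared
w_hat = ista(X, y, lam=1.0, step=step)
print("nonzeros:", np.count_nonzero(w_hat))  # far fewer than d
```

The penalty level `lam` trades off fit against sparsity; at the minimizer, every coordinate whose gradient correlation stays below `lam` is exactly zero, which is what distinguishes $\ell_1$ training from simple weight decay.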

## Fast IRLS Algorithms for p-norm regression

## Lie algebras and homotopy theory

In this talk, I'll discuss the role that Lie algebras play in algebraic topology and motivate the development of a "homotopy coherent" version of the theory. I'll also explain an "equation-free" formulation of the classical theory of Lie algebras, which emerges as a concrete byproduct.