The singular set in the fully nonlinear obstacle problem

Ovidiu Savin
Columbia University
November 18, 2019

For the obstacle problem involving a convex fully nonlinear elliptic operator, we show that the singular set of the free boundary stratifies. The top stratum is locally covered by a $C^{1,\alpha}$-manifold, and the lower strata are covered by $C^{1,\log^\epsilon}$-manifolds. This essentially recovers the regularity result obtained by Figalli and Serra in the case when the operator is the Laplacian.
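
For orientation, a standard model formulation (my gloss; the abstract does not fix a specific setup) takes $F$ convex and uniformly elliptic with $F(0) = 0$ and studies

$$u \ge 0, \qquad F(D^2 u) = \chi_{\{u > 0\}} \quad \text{in } B_1,$$

where the singular points of the free boundary $\partial\{u > 0\}$ are those at which the contact set $\{u = 0\}$ has Lebesgue density zero.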

An isoperimetric inequality for the Hamming cube and some consequences

Jinyoung Park
Rutgers University
November 18, 2019

I will introduce an isoperimetric inequality for the Hamming cube and some of its applications. The applications include a “stability” version of Harper’s edge-isoperimetric inequality, which was first proved by Friedgut, Kalai and Naor for half cubes, and later by Ellis for subsets of any size. Our inequality also plays a key role in a recent result on the asymptotic number of maximal independent sets in the Hamming cube.

This is joint work with Jeff Kahn.
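
As a reminder (a standard statement, recalled here rather than quoted from the talk), Harper's edge-isoperimetric inequality says that every $A \subseteq \{0,1\}^n$ satisfies

$$|\partial_e A| \;\ge\; |A|\,\bigl(n - \log_2 |A|\bigr),$$

where $\partial_e A$ is the set of edges of the Hamming cube with exactly one endpoint in $A$; equality holds when $A$ is a subcube.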

Deforming Holomorphic Chern-Simons at Large N

Si Li
Member, School of Natural Sciences, IAS; Tsinghua University
November 15, 2019

We describe the coupling of holomorphic Chern-Simons theory at large N with Kodaira-Spencer gravity. This gives a complete description of open-closed string field theory in the topological B-model. We explain an anomaly cancellation mechanism at all loops in perturbation theory in this model. At one loop this anomaly cancellation is analogous to the Green-Schwarz mechanism. This is joint work with Kevin Costello. 
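
For background (the standard B-model expression, recalled here rather than taken from the abstract), the holomorphic Chern-Simons action on a Calabi-Yau threefold $X$ with holomorphic volume form $\Omega$, for a $(0,1)$-form gauge field $A$ valued in $\mathfrak{gl}_N$, reads

$$S_{\mathrm{hCS}}(A) \;=\; \int_X \Omega \wedge \operatorname{Tr}\Bigl(\tfrac{1}{2}\, A \wedge \bar{\partial} A + \tfrac{1}{3}\, A \wedge A \wedge A\Bigr),$$

while Kodaira-Spencer gravity governs the closed-string deformations of the complex structure of $X$.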

Unitary, Symplectic, and Orthogonal Moments of Moments

Emma Bailey
University of Bristol
November 15, 2019

The study of random matrix moments of moments has connections to number theory, combinatorics, and log-correlated fields. Our results give the leading order of these functions for integer moment parameters by exploiting connections with Gelfand-Tsetlin patterns and counts of lattice points in convex sets. This is joint work with Jon Keating and Theo Assiotis.
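
Concretely, in the unitary case the moments of moments are (the standard definition, recalled for orientation)

$$\mathrm{MoM}_N(k,\beta) \;=\; \mathbb{E}_{A \in U(N)}\!\left[\left(\frac{1}{2\pi}\int_0^{2\pi} \bigl|\det\bigl(I - A e^{-i\theta}\bigr)\bigr|^{2\beta}\, d\theta\right)^{k}\right],$$

with the average taken over Haar measure, and analogously for the symplectic and orthogonal groups.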

Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

Guang Cheng
Purdue University; Member, School of Mathematics
November 13, 2019

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime,” while the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.
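
Since the first part of the talk turns on the double descent curve, here is a minimal numerical sketch (my own illustration using generic random-features regression, not the speaker's setting or code): the test error of the minimum-norm least-squares fit typically peaks near the interpolation threshold $p = n$ and descends again as the width $p$ grows.

# Double-descent sketch: min-norm least squares on random ReLU features.
# Illustrative only; the model and hyperparameters are assumptions, not from the talk.
import numpy as np

rng = np.random.default_rng(0)
n, d, n_test = 100, 10, 2000

# Ground truth: a noisy linear model.
beta = rng.standard_normal(d)
X, X_test = rng.standard_normal((n, d)), rng.standard_normal((n_test, d))
y = X @ beta + 0.5 * rng.standard_normal(n)
y_test = X_test @ beta

def relu_features(Z, W):
    # Fixed random first layer followed by a ReLU nonlinearity.
    return np.maximum(Z @ W, 0.0)

for p in [5, 20, 50, 90, 100, 110, 200, 1000]:
    W = rng.standard_normal((d, p)) / np.sqrt(d)
    F, F_test = relu_features(X, W), relu_features(X_test, W)
    w = np.linalg.pinv(F) @ y        # minimum-norm interpolant once p >= n
    mse = np.mean((F_test @ w - y_test) ** 2)
    print(f"width p = {p:5d}   test MSE = {mse:.3f}")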