Members Seminar

The Palais-Smale Theorem and the Solution of Hilbert’s 23rd Problem

Karen Uhlenbeck
The University of Texas at Austin; Distinguished Visiting Professor, School of Mathematics
April 6, 2020
Hilbert’s 23rd Problem is the last on his famous list and is of a different character than the others. Its description runs several pages and essentially says that the calculus of variations is a subject in need of development. We will look in retrospect at one of the critical events in the calculus of variations: the point at which the critical role of dimension was understood, and the role that the Palais-Smale condition (1963) played in this understanding. I apologize that, in its present state, the talk consists mostly of my reminiscences and lacks references.

Towards a mathematical model of the brain

Lai-Sang Young
New York University; Distinguished Visiting Professor, School of Mathematics & Natural Sciences
March 9, 2020
Striving to make contact with mathematics and to be consistent with neuroanatomy at the same time, I propose an idealized picture of the cerebral cortex consisting of a hierarchical network of brain regions each further subdivided into interconnecting layers not unlike those in artificial neural networks. Each layer is idealized as a 2D sheet of neurons, spatially homogeneous with primarily local interactions, a setup reminiscent of that in statistical mechanics. Zooming into local circuits, one gets into the domain of dynamical systems.

Lower Bounds in Complexity Theory, Communication Complexity, and Sunflowers

Toniann Pitassi
University of Toronto; Visiting Professor, School of Mathematics
March 2, 2020
In this talk I will discuss the Sunflower Lemma and similar lemmas that prove (in various contexts) that a set/distribution can be partitioned into a structured part and a "random-looking" part. I will introduce communication complexity as a key model for understanding computation and more generally for reasoning about information bottlenecks.

Direct and dual Information Bottleneck frameworks for Deep Learning

Tali Tishby
The Hebrew University of Jerusalem
February 24, 2020

The Information Bottleneck (IB) is an information-theoretic framework for optimal representation learning. It stems from the problem of finding minimal sufficient statistics in supervised learning, but has insightful implications for Deep Learning. In particular, it is the only theory that gives concrete predictions about the representations in each layer and their potential computational benefit. I will review the theory and its new version, the dual Information Bottleneck, related to the variational Information Bottleneck, which is gaining practical popularity.
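For context (this is not part of the abstract), the classical IB objective introduced by Tishby, Pereira, and Bialek trades off compressing the input X into a representation T against preserving information about the label Y, with a trade-off parameter β:

```latex
% Classical Information Bottleneck objective:
% minimize over stochastic encoders p(t|x) the compression term I(X;T),
% penalized (with trade-off parameter \beta) for losing information about Y.
\min_{p(t \mid x)} \; \mathcal{L}_{\mathrm{IB}} \;=\; I(X;T) \;-\; \beta \, I(T;Y)
```

Varying β traces out the optimal trade-off curve between compression and prediction, which is the object the talk's layer-by-layer predictions refer to.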

The h-principle in symplectic geometry

Emmy Murphy
Northwestern University; von Neumann Fellow, School of Mathematics
December 9, 2019

Symplectic geometry, and its close relative contact geometry, are geometries closely tied to complex geometry, smooth topology, and mathematical physics. The h-principle is a general method used for constructing smooth geometric objects satisfying various underdetermined properties. In the symplectic context, h-principles typically give constructions of surprising exotica, and methods for detecting the basic flexible objects. We survey a number of results from the previous decade.