# Members Seminar

## Lower Bounds in Complexity Theory, Communication Complexity, and Sunflowers

## Direct and dual Information Bottleneck frameworks for Deep Learning

The Information Bottleneck (IB) is an information-theoretic framework for optimal representation learning. It stems from the problem of finding minimal sufficient statistics in supervised learning, but it has insightful implications for Deep Learning. In particular, it is the only theory that makes concrete predictions about the representations in each layer and their potential computational benefit. I will review the theory and its new version, the dual Information Bottleneck, and relate it to the variational Information Bottleneck, which is gaining practical popularity.
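For readers unfamiliar with the framework, one common formulation of the IB trade-off between compression and prediction is the following Lagrangian (a minimal sketch; notation such as $T$ for the representation and $\beta$ for the trade-off parameter follows standard usage, not the talk itself):

```latex
% X: input, Y: target, T: (stochastic) representation of X.
% Minimizing over encoders p(t|x), beta trades off compression
% of X (small I(X;T)) against prediction of Y (large I(T;Y)).
\min_{p(t \mid x)} \; \mathcal{L}_{\mathrm{IB}}
  = I(X;T) - \beta \, I(T;Y)
```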

## Spectra of metric graphs and crystalline measures

## Coarse dynamics and partially hyperbolic diffeomorphisms in 3-manifolds

## Knotted 3-balls in the 4-sphere

## The h-principle in symplectic geometry

Symplectic geometry and its close relative, contact geometry, are closely tied to complex geometry, smooth topology, and mathematical physics. The h-principle is a general method for constructing smooth geometric objects satisfying various underdetermined conditions. In the symplectic context, h-principles typically yield constructions of surprising exotica, along with methods for detecting the basic flexible objects. We survey a number of results from the past decade.

## Mathematical models of human memory

## Convergence of nearest neighbor classification

## Lie algebras and homotopy theory

In this talk, I'll discuss the role that Lie algebras play in algebraic topology and motivate the development of a "homotopy coherent" version of the theory. I'll also explain an "equation-free" formulation of the classical theory of Lie algebras, which emerges as a concrete byproduct.