School of Mathematics

Direct and dual Information Bottleneck frameworks for Deep Learning

Tali Tishby
The Hebrew University of Jerusalem
February 24, 2020

The Information Bottleneck (IB) is an information-theoretic framework for optimal representation learning. It stems from the problem of finding minimal sufficient statistics in supervised learning, but has insightful implications for Deep Learning. In particular, it is the only theory that gives concrete predictions on the different representations in each layer and their potential computational benefit. I will review the theory and its new version, the dual Information Bottleneck, which is related to the variational Information Bottleneck now gaining practical popularity.
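For readers unfamiliar with the framework, the standard IB objective (not spelled out in the abstract, but standard in the literature) trades off compression of the input X against preservation of information about the target Y through a representation T:

```latex
% Information Bottleneck Lagrangian (standard formulation):
% minimize, over stochastic encoders p(t|x), the compression term I(X;T)
% minus beta times the relevance term I(T;Y).
\min_{p(t \mid x)} \; \mathcal{L} \;=\; I(X;T) \;-\; \beta \, I(T;Y)
```

Here beta > 0 controls the trade-off: small beta favors maximal compression of X, while large beta favors representations T that retain nearly all the information about Y.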

Classification of n-component links with Khovanov homology of rank 2^n

Boyu Zhang
February 24, 2020

Suppose L is a link with n components and that the rank of Kh(L; Z/2) is 2^n. We show that L can then be obtained by disjoint unions and connected sums of Hopf links and unknots. This result gives a positive answer to a question asked by Batson-Seed, and generalizes the unlink detection theorem of Khovanov homology due to Hedden-Ni and Batson-Seed. The proof relies on a new excision formula for the singular instanton Floer homology introduced by Kronheimer and Mrowka.

This is joint work with Yi Xie.
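As a sanity check on the rank condition (a standard computation, not part of the abstract): over Z/2 the unknot has Khovanov homology of rank 2 and the Hopf link of rank 4 = 2^2, and both operations in the theorem preserve the equality rank = 2^n:

```latex
% Over Z/2: \operatorname{rk} Kh(U) = 2 (unknot), \operatorname{rk} Kh(\text{Hopf}) = 4 = 2^2.
% Disjoint union: Kh(L_1 \sqcup L_2) \cong Kh(L_1) \otimes Kh(L_2),
%   so ranks multiply while component counts add: 2^{n_1} \cdot 2^{n_2} = 2^{n_1 + n_2}.
% Connected sum (over Z/2, using Kh \cong \widetilde{Kh} \otimes V with \dim V = 2):
%   \operatorname{rk} Kh(L_1 \# L_2) = \tfrac{1}{2}\operatorname{rk} Kh(L_1)\cdot\operatorname{rk} Kh(L_2),
%   while component counts satisfy n = n_1 + n_2 - 1.
% Hence 2^{n_1} \cdot 2^{n_2} / 2 = 2^{n_1 + n_2 - 1} = 2^{n}.
```

So every link built from Hopf links and unknots by these operations attains the rank bound 2^n; the theorem states the converse.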