Direct and dual Information Bottleneck frameworks for Deep Learning

Tali Tishby
The Hebrew University of Jerusalem
February 24, 2020

The Information Bottleneck (IB) is an information-theoretic framework for optimal representation learning. It stems from the problem of finding minimal sufficient statistics in supervised learning, but has insightful implications for Deep Learning. In particular, it is the only theory that gives concrete predictions about the representations formed in each layer and their potential computational benefit. I will review the theory and its new version, the dual Information Bottleneck, which is related to the variational Information Bottleneck, a method that is gaining practical popularity.
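For reference (not part of the abstract; stated in the standard notation of the IB literature), the IB objective seeks a stochastic representation T of the input X that is maximally compressed while remaining informative about the label Y:

\min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y), \qquad \beta > 0,

where I(\cdot;\cdot) denotes mutual information and \beta trades off compression of X against preservation of information about Y.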

Classification of n-component links with Khovanov homology of rank 2^n

Boyu Zhang
February 24, 2020

We show that if L is a link with n components and the rank of Kh(L; Z/2) is 2^n, then L can be obtained from Hopf links and unknots by disjoint unions and connected sums. This result gives a positive answer to a question asked by Batson-Seed and generalizes the unlink detection theorem for Khovanov homology due to Hedden-Ni and Batson-Seed. The proof relies on a new excision formula for the singular instanton Floer homology introduced by Kronheimer and Mrowka.

This is joint work with Yi Xie.

What Is Global History? A Roundtable

February 20, 2020
Since its publication in 2016, Sebastian Conrad's What Is Global History? (Princeton University Press) has been read and debated not only by historians of modern Europe but also by historians of other parts of the world and by scholars in other disciplines.

Who writes global history? How and for whom? And why now?

Bring your questions and thoughts, and join the conversation.

Sebastian Conrad
Professor of History, Free University of Berlin

in conversation with