Recently Added

Graph Nets: The Next Generation

Max Welling
University of Amsterdam
July 21, 2020
In this talk I will introduce our next generation of graph neural networks. GNNs have the property that they are invariant to permutations of the nodes in the graph and to rotations of the graph as a whole. We claim this is unnecessarily restrictive, and we will explore extensions of these GNNs to more flexible equivariant constructions. In particular, Natural Graph Networks for general graphs are globally equivariant under permutations of the nodes but can still be executed through local message passing protocols.
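As a minimal illustration of the symmetry in question (standard background, not code from the talk): a generic message passing layer of the form H ↦ σ(AHW) is permutation equivariant, meaning that relabeling the graph's nodes relabels the output node features in exactly the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: adjacency matrix A, node features H, shared layer weights W.
n, d = 5, 4
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.maximum(A, A.T)                    # make the graph undirected
H = rng.normal(size=(n, d))
W = rng.normal(size=(d, d))

def message_passing(A, H, W):
    """One generic message passing step: aggregate neighbors, then transform."""
    return np.tanh(A @ H @ W)

# A random relabeling of the nodes, written as a permutation matrix P.
P = np.eye(n)[rng.permutation(n)]

# Equivariance: permuting the input graph (A -> P A P^T, H -> P H)
# permutes the output node features in exactly the same way.
assert np.allclose(
    message_passing(P @ A @ P.T, P @ H, W),
    P @ message_passing(A, H, W),
)
```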

Non-perturbative Studies of JT Gravity and Supergravity using Minimal Strings

Clifford V. Johnson
University of Southern California
July 20, 2020
Various Jackiw–Teitelboim (JT) gravity and supergravity theories have been shown (by Saad, Shenker and Stanford, and by Stanford and Witten) to have double scaled random matrix model descriptions, capturing the (spacetime) topological perturbative expansion of the partition function using beautiful recursive methods developed by Mirzakhani and others. I will describe an alternative method for building the matrix model description, using techniques from minimal string theory. This method is particularly useful for supplying non-perturbative definitions of the physics.
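For orientation, the perturbative expansion mentioned above is the standard genus expansion of the partition function (textbook background, not a formula specific to this talk); the matrix model reproduces it order by order and supplies non-perturbative corrections that the series misses.

```latex
% Genus (topological) expansion of the JT gravity partition function;
% non-perturbative effects are of order e^{-1/\hbar} and invisible to
% the series.
Z(\beta) \;\simeq\; \sum_{g=0}^{\infty} \hbar^{\,2g-1}\, Z_g(\beta),
\qquad \hbar = e^{-S_0}.
```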

Knot Floer homology and bordered algebras

Peter Ozsváth
Princeton University
July 10, 2020
Knot Floer homology is an invariant for knots in three-space, defined as a Lagrangian Floer homology in a symmetric product. It has the form of a bigraded vector space, encoding topological information about the knot. I will discuss an algebraic approach to computing knot Floer homology, and a corresponding version for links, based on decomposing knot diagrams.

This is joint work with Zoltán Szabó, building on earlier joint work (bordered Heegaard Floer homology) with Robert Lipshitz and Dylan Thurston.
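For context, one standard way in which this bigraded vector space encodes classical topological information (a well-known property, not a new result from the talk): its graded Euler characteristic recovers the Alexander polynomial of the knot.

```latex
% Graded Euler characteristic of knot Floer homology (hat version),
% with Maslov grading m and Alexander grading s:
\Delta_K(t) \;=\; \sum_{m,\,s} (-1)^{m}\, t^{s}\,
    \dim \widehat{\mathit{HFK}}_m(K, s).
```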

Role of Interaction in Competitive Optimization

Anima Anandkumar
California Institute of Technology
July 9, 2020
Competitive optimization is needed for many ML problems, such as training GANs, robust reinforcement learning, and adversarial learning. Standard approaches to competitive optimization have each agent independently optimize its objective function using SGD or other gradient-based methods. However, these approaches suffer from oscillations and instability, since the optimization does not account for interaction among the players. We introduce competitive gradient descent (CGD), which explicitly incorporates interaction by solving for the Nash equilibrium of a local game.
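To make the contrast concrete, here is a minimal sketch (illustrative only; scalar players and the zero-sum form of the CGD update) comparing simultaneous gradient descent-ascent with CGD on the bilinear game f(x, y) = xy, whose Nash equilibrium is (0, 0):

```python
import numpy as np

# Zero-sum bilinear game: player 1 minimizes f(x, y) = x * y, player 2
# maximizes it. The unique Nash equilibrium is (x, y) = (0, 0).
def grad_x(x, y): return y                # df/dx
def grad_y(x, y): return x                # df/dy
D_xy = 1.0                                # mixed second derivative d2f/dxdy

eta = 0.2
x_gda, y_gda = 1.0, 1.0                   # simultaneous gradient descent-ascent
x_cgd, y_cgd = 1.0, 1.0                   # competitive gradient descent

for _ in range(200):
    # GDA: each player follows its own gradient and ignores the opponent's
    # simultaneous move; on this game the iterates spiral outward.
    x_gda, y_gda = (x_gda - eta * grad_x(x_gda, y_gda),
                    y_gda + eta * grad_y(x_gda, y_gda))

    # CGD (zero-sum form of the update): each player best-responds to a local
    # bilinear model of the game; the D_xy terms couple the two updates.
    denom = 1.0 + eta**2 * D_xy * D_xy
    dx = -eta * (grad_x(x_cgd, y_cgd) + eta * D_xy * grad_y(x_cgd, y_cgd)) / denom
    dy =  eta * (grad_y(x_cgd, y_cgd) - eta * D_xy * grad_x(x_cgd, y_cgd)) / denom
    x_cgd, y_cgd = x_cgd + dx, y_cgd + dy

print(abs(x_gda) + abs(y_gda))            # grows without bound: oscillation
print(abs(x_cgd) + abs(y_cgd))            # shrinks toward the equilibrium
```

The D_xy coupling is exactly the interaction term the abstract refers to: dropping it recovers plain GDA and, with it, the divergent oscillation.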

Machine learning-based design (of proteins, small molecules and beyond)

Jennifer Listgarten
University of California, Berkeley
July 7, 2020
Data-driven design is making headway in a number of application areas, including protein, small-molecule, and materials engineering. The design goal is to construct an object with desired properties, such as a protein that binds to a target more tightly than previously observed. To that end, costly experimental measurements are being replaced with calls to a high-capacity regression model trained on labeled data, which can be leveraged in an in silico search for promising design candidates.
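A minimal sketch of that loop, with entirely illustrative data, features, and model choice (nothing here is from the talk):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical setup: each candidate (e.g. a protein sequence) is encoded as
# a feature vector, and costly lab measurements exist for a small labeled set.
X_labeled = rng.normal(size=(200, 16))
y_labeled = X_labeled @ rng.normal(size=16) + 0.1 * rng.normal(size=200)

# Replace further lab measurements with a trained regression model...
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_labeled, y_labeled)

# ...and use it for an in silico search: score a large pool of unmeasured
# candidates and keep the most promising ones for real follow-up experiments.
X_pool = rng.normal(size=(10_000, 16))
scores = model.predict(X_pool)
top_candidates = X_pool[np.argsort(scores)[-10:]]
print(top_candidates.shape)               # ten designs proposed for validation
```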

Infinite staircases and reflexive polygons

Ana Rita Pires
University of Edinburgh
July 3, 2020
A classic result, due to McDuff and Schlenk, asserts that the function that encodes when a four-dimensional symplectic ellipsoid can be embedded into a four-dimensional ball has a remarkable structure: the function has infinitely many corners, determined by the odd-index Fibonacci numbers, that fit together to form an infinite staircase. The work of McDuff and Schlenk has recently led to considerable interest in understanding when the ellipsoid embedding function for other symplectic 4-manifolds is partly described by an infinite staircase.
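In standard notation (background, not new material from the talk), the function in question is the ellipsoid embedding function of the ball:

```latex
% Ellipsoid embedding function studied by McDuff and Schlenk: the smallest
% ball into which the ellipsoid E(1, a) symplectically embeds.
c(a) \;=\; \inf\left\{\, \mu > 0 \;:\; E(1,a)
    \text{ embeds symplectically into } B^4(\mu) \,\right\},
\qquad a \ge 1.
```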

Distinguishing monotone Lagrangians via holomorphic annuli

Ailsa Keating
University of Cambridge
June 26, 2020
We present techniques for constructing families of compact, monotone (including exact) Lagrangians in certain affine varieties, starting with Brieskorn–Pham hypersurfaces. We will focus on dimensions 2 and 3. In particular, we will explain how to set up well-defined counts of holomorphic annuli for a range of these families. Time permitting, we will give a number of applications.

Instance-Hiding Schemes for Private Distributed Learning

Sanjeev Arora
Princeton University; Distinguished Visiting Professor, School of Mathematics
June 25, 2020
An important problem today is how to allow multiple distributed entities to train a shared neural network on their private data while protecting data privacy. Federated learning is a standard framework for distributed deep learning, and one would like to assure full privacy in that framework. Proposed methods, such as homomorphic encryption and differential privacy, come with drawbacks such as large computational overhead or a large drop in accuracy.
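For reference, a minimal sketch of the federated learning setup the abstract refers to, in its simplest federated averaging form (standard background; this is not the instance-hiding scheme of the talk, and all names and numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Several clients fit a shared linear model on their own private data;
# only parameter updates leave a client, never the raw examples.
true_w = rng.normal(size=8)
clients = []
for _ in range(5):
    X = rng.normal(size=(100, 8))         # private local data
    y = X @ true_w + 0.1 * rng.normal(size=100)
    clients.append((X, y))

w = np.zeros(8)                           # shared global model
for _ in range(50):
    local_ws = []
    for X, y in clients:
        w_local = w.copy()
        for _ in range(5):                # a few local gradient steps
            grad = X.T @ (X @ w_local - y) / len(y)
            w_local -= 0.1 * grad
        local_ws.append(w_local)
    w = np.mean(local_ws, axis=0)         # server averages the updates

print(np.linalg.norm(w - true_w))         # should be small
```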

Generalizable Adversarial Robustness to Unforeseen Attacks

Soheil Feizi
University of Maryland
June 23, 2020
In the last couple of years, much progress has been made in enhancing the robustness of models against adversarial attacks. However, two major shortcomings remain: (i) practical defenses are often vulnerable to strong “adaptive” attack algorithms, and (ii) current defenses generalize poorly to “unforeseen” attack threat models (ones not used in training).
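For concreteness, the canonical threat model that defenses are typically trained against is a norm-bounded perturbation of the input; below is a minimal projected-gradient-style attack under an L-infinity budget (standard background on a toy model, not an algorithm from the talk). An “unforeseen” attack is then one drawn from a different perturbation set, e.g. an L1 or Wasserstein ball.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 784))            # toy linear classifier

def loss_grad(x, label):
    """Gradient of the cross-entropy loss wrt the input, for a linear model."""
    logits = W @ x
    p = np.exp(logits - logits.max())
    p /= p.sum()
    p[label] -= 1.0                       # d(loss)/d(logits)
    return W.T @ p

x = rng.uniform(size=784)                 # a clean "image" in [0, 1]
label = 3
eps, step = 0.03, 0.007                   # perturbation budget and step size

x_adv = x.copy()
for _ in range(40):
    x_adv += step * np.sign(loss_grad(x_adv, label))  # ascend the loss
    x_adv = np.clip(x_adv, x - eps, x + eps)          # project to the eps-ball
    x_adv = np.clip(x_adv, 0.0, 1.0)                  # stay a valid input
```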