Learning from Censored and Dependent Data

Constantinos Daskalakis
Massachusetts Institute of Technology
March 9, 2020
Machine Learning is invaluable for extracting insights from large volumes of data. A key assumption enabling many methods, however, is access to training data comprising independent observations from the entire distribution of relevant data. In practice, data is commonly missing due to measurement limitations, legal restrictions, or data collection and sharing practices. Moreover, observations are commonly collected on a network, or over a spatial or temporal domain, and may be intricately dependent.
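As a minimal illustration of the censoring problem (a toy sketch, not the methods of the talk), the snippet below fits a Gaussian to samples that are only ever observed above a known threshold c, by maximizing the truncated log-likelihood; the threshold and parameter values are invented for the example.

```python
# Toy sketch: maximum-likelihood estimation of a Gaussian from samples
# truncated below a known threshold c (illustration only, not the talk's
# algorithms; all constants are invented).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
c = 0.5                                  # known truncation threshold (assumed)
x = rng.normal(loc=1.0, scale=2.0, size=20000)
x = x[x > c]                             # we only ever observe samples above c

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)            # parametrize sigma > 0
    # density of the truncated Gaussian: pdf(x) / P(X > c)
    log_pdf = norm.logpdf(x, mu, sigma)
    log_mass = norm.logsf(c, mu, sigma)  # log P(X > c)
    return -(log_pdf - log_mass).sum()

res = minimize(neg_log_lik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)  # close to the true (1.0, 2.0) despite the censoring
```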

Towards a mathematical model of the brain

Lai-Sang Young
New York University; Distinguished Visiting Professor, School of Mathematics & Natural Sciences
March 9, 2020
Striving to make contact with mathematics and to be consistent with neuroanatomy at the same time, I propose an idealized picture of the cerebral cortex consisting of a hierarchical network of brain regions each further subdivided into interconnecting layers not unlike those in artificial neural networks. Each layer is idealized as a 2D sheet of neurons, spatially homogeneous with primarily local interactions, a setup reminiscent of that in statistical mechanics. Zooming into local circuits, one gets into the domain of dynamical systems.
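As a toy illustration of one ingredient of this picture (my own sketch, not the model proposed in the talk), the snippet below evolves a 2D sheet of rate neurons in which each unit is driven only by its four nearest neighbors; all parameters are invented.

```python
# Toy sketch of a 2D sheet of rate neurons with nearest-neighbor coupling,
# illustrating the "spatially homogeneous, locally interacting" setup.
# Not the model in the talk; parameters are arbitrary.
import numpy as np

N, steps, dt = 64, 200, 0.1
w, tau = 0.2, 1.0                      # local coupling strength, time constant
rng = np.random.default_rng(1)
r = rng.random((N, N))                 # firing rates on an N x N sheet

for _ in range(steps):
    # sum of the four nearest neighbors (periodic boundary for simplicity)
    local = (np.roll(r, 1, 0) + np.roll(r, -1, 0) +
             np.roll(r, 1, 1) + np.roll(r, -1, 1))
    drive = w * local + rng.normal(0.0, 0.05, (N, N))   # local input + noise
    r += dt / tau * (-r + np.tanh(drive))               # leaky rate dynamics

print(r.mean(), r.std())
```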

Packing and squeezing Lagrangian tori

Richard Hind
University of Notre Dame
March 9, 2020
We will ask how many Lagrangian tori, say with an integral area class, can be "packed" into a given symplectic manifold. Similarly, given an arrangement of such tori, like the integral product tori in Euclidean space, one can ask about the symplectic size of the complement. The talk will describe some constructions of balls and Lagrangian tori which show the size is larger than expected.

This is based on joint work with Ely Kerman.

Higher order rectifiability and Reifenberg parametrizations

Silvia Ghinassi
Member, School of Mathematics
March 9, 2020
We provide geometric sufficient conditions for Reifenberg flat sets of any integer dimension in Euclidean space to be parametrized by a Lipschitz map with Hölder derivatives. The conditions use a Jones type square function and all statements are quantitative in that the Hölder and Lipschitz constants of the parametrizations depend on such a function. We use these results to prove sufficient conditions for higher order rectifiability of sets and measures.
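For context, one standard formulation of the Jones-type quantities involved (the paper's weighted variant may differ in its details) is:

```latex
% One standard formulation (the talk's exact variant may differ):
% the Jones beta-number of a set E at location x and scale r,
\beta_E(x,r) \;=\; \inf_{L}\ \sup_{y \in E \cap B(x,r)} \frac{\operatorname{dist}(y, L)}{r},
% with the infimum over affine d-planes L, and the associated square function
J_E^{\alpha}(x) \;=\; \sum_{k \ge 0} \left( \frac{\beta_E(x, 2^{-k})}{2^{-k\alpha}} \right)^{2},
% whose finiteness at the right exponent \alpha is the kind of condition
% that yields C^{1,\alpha} (Lipschitz with Holder derivative) parametrizations.
```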

Introduction to high dimensional expanders

Irit Dinur
Weizmann Institute of Science; Visiting Professor, School of Mathematics
March 10, 2020
High dimensional expansion generalizes edge and spectral expansion in graphs to hypergraphs (viewed as higher dimensional simplicial complexes). It is a tool that allows analysis of PCP agreement tests, mixing of Markov chains, and construction of new error correcting codes. My talk will be devoted to proving some nice relations between local and global expansion of these objects.
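For background, one common way to make the local-versus-global statement precise (a standard definition, whose conventions may differ from the talk's) is via local spectral expansion:

```latex
% Background definition (one common convention; the talk's may differ).
% For a face \sigma of a pure d-dimensional complex X with \dim\sigma \le d-2,
% the link graph X_\sigma has vertices \{v : \sigma \cup \{v\} \in X\} and
% edges \{u,v\} with \sigma \cup \{u,v\} \in X. Then X is a
% \lambda-local-spectral expander if
\lambda_2\!\left(X_\sigma\right) \;\le\; \lambda \quad \text{for every such } \sigma,
% where \lambda_2 is the second-largest eigenvalue of the normalized
% (random-walk) adjacency operator of the link graph X_\sigma.
```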

Your Brain on Energy-Based Models: Applying and Scaling EBMs to Problems of Interest to the Machine Learning Community Today

Will Grathwohl
University of Toronto
March 10, 2020
In this talk, I will discuss my two recent works on Energy-Based Models. In the first work, I discuss how we can reinterpret standard classification architectures as class-conditional energy-based models and train them using recently proposed methods for large-scale EBM training. We find that adding EBM training in this way provides many benefits while negligibly affecting discriminative performance, unlike other hybrid generative/discriminative modeling approaches.
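The reinterpretation in the first work can be stated in a few lines: the logits f(x) of a K-way classifier define a joint energy E(x, y) = -f(x)[y], and marginalizing over labels yields an unnormalized density over inputs. A minimal sketch (the architecture and shapes are illustrative, not the talk's models):

```python
# Minimal sketch of the classifier-as-EBM reinterpretation:
# logits f(x) in R^K define p(x, y) proportional to exp(f(x)[y]), so
# E(x, y) = -f(x)[y] and, marginalizing over y,
# E(x) = -logsumexp_y f(x)[y] is an unnormalized log-density over x.
import torch
import torch.nn as nn

classifier = nn.Sequential(            # stand-in for any K-way architecture
    nn.Flatten(), nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

def joint_energy(x, y):
    return -classifier(x)[torch.arange(x.shape[0]), y]

def marginal_energy(x):
    return -torch.logsumexp(classifier(x), dim=1)

x = torch.randn(8, 1, 28, 28)          # dummy batch of MNIST-shaped inputs
y = torch.randint(0, 10, (8,))
print(joint_energy(x, y).shape, marginal_energy(x).shape)
# Softmax classification is unchanged: p(y|x) = softmax(f(x)) exactly,
# because the logsumexp normalizer cancels in the conditional.
```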

Feature purification: How adversarial training can perform robust deep learning

Yuanzhi Li
Carnegie Mellon University
March 16, 2020
Why can deep learning models, trained on many machine learning tasks, obtain nearly perfect predictions on unseen data sampled from the same distribution, yet be extremely vulnerable to small perturbations of the input? How can adversarial training improve the robustness of neural networks to such perturbations? In this work, we developed a new principle called "feature purification".
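For background on the adversarial training referenced here (the standard projected-gradient recipe, not this work's analysis), a minimal sketch of one training step, assuming an ordinary classifier `model` and cross-entropy `loss_fn`:

```python
# Minimal sketch of one adversarial-training step (standard PGD recipe;
# background for the talk, not its analysis). `model`, `loss_fn`, and
# `optimizer` are assumed to be an ordinary classifier, cross-entropy
# loss, and any torch optimizer.
import torch

def pgd_attack(model, loss_fn, x, y, eps=8/255, alpha=2/255, steps=10):
    x_adv = x + torch.empty_like(x).uniform_(-eps, eps)   # random start
    for _ in range(steps):
        x_adv = x_adv.detach().requires_grad_(True)
        loss = loss_fn(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv + alpha * grad.sign()               # ascend the loss
        x_adv = x + (x_adv - x).clamp(-eps, eps)          # project to eps-ball
        x_adv = x_adv.clamp(0.0, 1.0)                     # stay a valid image
    return x_adv.detach()

def adversarial_training_step(model, loss_fn, optimizer, x, y):
    x_adv = pgd_attack(model, loss_fn, x, y)              # worst-case inputs
    optimizer.zero_grad()
    loss_fn(model(x_adv), y).backward()                   # train on them
    optimizer.step()
```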

Covariant Phase Space with Boundaries

Daniel Harlow
Massachusetts Institute of Technology
March 16, 2020
The Hamiltonian formulation of mechanics has many advantages, but its standard presentation destroys manifest covariance. This can be avoided by using the "covariant phase space formalism" of Iyer and Wald, but until recently this formalism has suffered from several ambiguities related to boundary terms and total derivatives. In this talk I will present a new version of the formalism which incorporates boundary effects from the beginning.
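Schematically, the formalism's basic objects are the following (standard conventions; the new version presented in the talk refines the boundary terms):

```latex
% Schematic Iyer-Wald setup (standard conventions; background only).
% Varying the Lagrangian d-form L gives
\delta L \;=\; E_a\, \delta\phi^a \;+\; d\,\theta(\phi, \delta\phi),
% where E_a = 0 are the equations of motion and \theta is the
% (pre)symplectic potential, defined only up to \theta \to \theta + dY.
% The symplectic form on the covariant phase space is then
\Omega(\delta_1\phi, \delta_2\phi) \;=\; \int_\Sigma
  \left[ \delta_1\theta(\phi, \delta_2\phi) - \delta_2\theta(\phi, \delta_1\phi) \right]
% for a Cauchy slice \Sigma. The ambiguities in \theta shift \Omega by terms
% on \partial\Sigma, which is the issue the boundary-aware version addresses.
```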