School of Mathematics

Your Brain on Energy-Based Models: Applying and Scaling EBMs to Problems of Interest to the Machine Learning Community Today

Will Grathwohl
University of Toronto
March 10, 2020
In this talk, I will discuss my two recent works on Energy-Based Models. In the first work, I discuss how we can reinterpret standard classification architectures as class conditional energy-based models and train them using recently proposed methods for large-scale EBM training. We find that adding EBM training in this way provides many benefits while negligibly affecting discriminative performance, contrary to other hybrid generative/discriminative modeling approaches.
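As a minimal sketch of the reinterpretation described above (illustrative code, not the authors' implementation): a K-class classifier's logits f(x) can be read as unnormalized joint log-densities, p(x, y) ∝ exp(f(x)[y]). The classifier p(y|x) is the usual softmax, while log p(x) is recovered up to a constant via logsumexp over classes. Shifting every logit by a constant leaves the classifier untouched but changes the density, which is the extra degree of freedom that EBM training can exploit without hurting discriminative performance.

```python
import numpy as np

def softmax(logits):
    # p(y | x): the ordinary classifier head, invariant to constant shifts
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def log_p_x_unnormalized(logits):
    # log p(x) up to log Z: logsumexp over classes, i.e. the negative energy -E(x)
    m = logits.max()
    return m + np.log(np.exp(logits - m).sum())

logits = np.array([2.0, -1.0, 0.5])   # f(x) for K = 3 classes
shifted = logits + 3.0                # add a constant to every logit

print(softmax(logits))                # same class probabilities...
print(softmax(shifted))
print(log_p_x_unnormalized(logits))   # ...but a different density value
print(log_p_x_unnormalized(shifted))
```

The shift invariance of the softmax is exactly why the generative reinterpretation can be trained on top of a standard architecture while leaving p(y|x) essentially intact.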

Higher order rectifiability and Reifenberg parametrizations

Silvia Ghinassi
Member, School of Mathematics
March 9, 2020
We provide geometric sufficient conditions for Reifenberg flat sets of any integer dimension in Euclidean space to be parametrized by a Lipschitz map with Hölder derivatives. The conditions use a Jones type square function and all statements are quantitative in that the Hölder and Lipschitz constants of the parametrizations depend on such a function. We use these results to prove sufficient conditions for higher order rectifiability of sets and measures.
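As a hedged sketch of the kind of condition involved (notation illustrative; the paper's precise definitions may differ), the Jones-type square function weights the classical beta-numbers of a d-dimensional set E by powers of the scale:

```latex
\beta_E(x,r) = \inf_{L} \, \sup_{y \in E \cap B(x,r)} \frac{\operatorname{dist}(y,L)}{r},
\qquad
J_\alpha(x) = \sum_{k \ge 0} \beta_E\big(x, 2^{-k}\big)^2 \, 2^{2\alpha k},
```

where the infimum is over affine d-planes L. Roughly, uniform control on J_α, together with Reifenberg flatness, yields a parametrization by a Lipschitz map whose derivatives are α-Hölder, with constants depending on the bound.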

Learning from Censored and Dependent Data

Constantinos Daskalakis
March 9, 2020
Machine Learning is invaluable for extracting insights from large volumes of data. A key assumption enabling many methods, however, is having access to training data comprising independent observations from the entire distribution of relevant data. In practice, data is commonly missing due to measurement limitations, legal restrictions, or data collection and sharing practices. Moreover, observations are commonly collected on a network, a spatial or a temporal domain and may be intricately dependent.
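A minimal toy illustration (my own example, not taken from the talk) of why censoring breaks the assumption of i.i.d. observations from the full distribution: if only Gaussian samples above a threshold are recorded, the naive sample mean is badly biased.

```python
import numpy as np

rng = np.random.default_rng(0)
full = rng.standard_normal(200_000)   # true distribution: N(0, 1), mean 0

observed = full[full > 0.0]           # censoring: only positive values are recorded

naive_mean = observed.mean()
# For a standard normal, E[X | X > 0] = sqrt(2/pi) ≈ 0.798,
# so the naive estimate lands far from the true mean of 0.
print(naive_mean)
```

Recovering the parameters of the underlying distribution from such censored observations is precisely the kind of problem the talk addresses.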

Towards a mathematical model of the brain

Lai-Sang Young
New York University; Distinguished Visiting Professor, School of Mathematics & Natural Sciences
March 9, 2020
Striving to make contact with mathematics and to be consistent with neuroanatomy at the same time, I propose an idealized picture of the cerebral cortex consisting of a hierarchical network of brain regions each further subdivided into interconnecting layers not unlike those in artificial neural networks. Each layer is idealized as a 2D sheet of neurons, spatially homogeneous with primarily local interactions, a setup reminiscent of that in statistical mechanics. Zooming into local circuits, one gets into the domain of dynamical systems.

Packing and squeezing Lagrangian tori

Richard Hind
University of Notre Dame
March 9, 2020
We will ask how many Lagrangian tori, say with an integral area class, can be 'packed' into a given symplectic manifold. Similarly, given an arrangement of such tori, like the integral product tori in Euclidean space, one can ask about the symplectic size of the complement. The talk will describe some constructions of balls and Lagrangian tori which show the size is larger than expected.
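For concreteness (standard notation, not taken verbatim from the talk), the product tori in C^n are

```latex
T(a_1, \dots, a_n) \;=\; \left\{\, z \in \mathbb{C}^n \;:\; \pi |z_i|^2 = a_i \ \text{for all } i \,\right\},
```

and the integral product tori are those whose area parameters a_i are positive integers.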

This is based on joint work with Ely Kerman.

An introduction to Boolean Function Analysis

Dor Minzer
Member, School of Mathematics
March 3, 2020
We will discuss some of the basic principles and results in the study of Boolean-valued functions over the discrete hypercube using discrete Fourier analysis. In particular, we will talk about basic concepts, the hypercontractive inequality and the KKL theorem. Time permitting, we will discuss the Fourier-Entropy Conjecture and mention some recent progress towards it.
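As a small self-contained illustration (my own example): every f : {-1,1}^n → R decomposes as f = Σ_S f̂(S) χ_S with χ_S(x) = Π_{i∈S} x_i, and the coefficients can be computed by averaging f·χ_S over the hypercube. For majority on 3 bits, the only nonzero coefficients sit on the singletons and the full set, and Parseval's identity gives Σ_S f̂(S)² = E[f²] = 1.

```python
from itertools import combinations, product

def fourier_coeffs(f, n):
    # f_hat(S) = E_x[ f(x) * chi_S(x) ], x uniform on {-1,1}^n
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            total = 0
            for x in points:
                chi = 1
                for i in S:
                    chi *= x[i]          # chi_S(x) = product of coordinates in S
                total += f(x) * chi
            coeffs[S] = total / len(points)
    return coeffs

maj3 = lambda x: 1 if sum(x) > 0 else -1
c = fourier_coeffs(maj3, 3)
print(c)  # singletons: 1/2 each; {0,1,2}: -1/2; all other sets: 0
print(sum(v * v for v in c.values()))  # Parseval: equals 1
```

Equivalently, MAJ3 = (1/2)(x₁ + x₂ + x₃) − (1/2)x₁x₂x₃, which matches the computed coefficients.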

The talk is self-contained and no special background will be assumed.

An Improved Cutting Plane Method for Convex Optimization, Convex-Concave Games and its Applications

Zhao Song
Member, School of Mathematics
March 2, 2020
Strong Average-Case Circuit Lower Bounds from Non-trivial Derandomization
We prove that, unconditionally, for all constants a, NQP = NTIME[n^(polylog n)] cannot be (1/2 + 2^(-log^a n))-approximated by 2^(log^a n)-size ACC^0 circuits. Previously, it was even open whether E^NP can be (1/2 + 1/sqrt(n))-approximated by AC^0[2] circuits. As a straightforward application, we obtain an infinitely often non-deterministic pseudorandom generator for poly-size ACC^0 circuits with seed length 2^(log^eps n), for all eps > 0.

Lower Bounds in Complexity Theory, Communication Complexity, and Sunflowers

Toniann Pitassi
University of Toronto; Visiting Professor, School of Mathematics
March 2, 2020
In this talk I will discuss the Sunflower Lemma and similar lemmas that prove (in various contexts) that a set/distribution can be partitioned into a structured part and a "random-looking" part. I will introduce communication complexity as a key model for understanding computation and more generally for reasoning about information bottlenecks.
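For reference (the classical statement; the talk's quantitative versions may differ), a k-sunflower is a family of k sets whose pairwise intersections all equal one common core, and the Erdős–Rado sunflower lemma reads:

```latex
\text{Every family of more than } w!\,(k-1)^{w} \text{ sets, each of size } w,
\text{ contains a sunflower with } k \text{ petals.}
```

Recent work of Alweiss, Lovász, Wu, and Zhang improved this bound to roughly (O(k log w))^w.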

Preference Modeling with Context-Dependent Salient Features

Laura Balzano
University of Michigan; Member, School of Mathematics
February 27, 2020
This talk considers the preference modeling problem and addresses the fact that pairwise comparison data often reflects irrational choice, e.g. intransitivity. Our key observation is that two items compared in isolation from other items may be compared based on only a salient subset of features. Formalizing this idea, I will introduce our proposal for a “salient feature preference model” and discuss sample complexity results for learning the parameters of our model and the underlying ranking with maximum likelihood estimation.
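A toy illustration (my own construction, not the paper's model) of how comparing on a salient subset of features produces intransitivity: suppose each pairwise comparison is decided only by the single feature on which the two items differ most. Items with identical total utility can then form a preference cycle.

```python
import numpy as np

items = {
    "a": np.array([9.0, 4.0, 0.0]),
    "b": np.array([0.0, 9.0, 4.0]),
    "c": np.array([4.0, 0.0, 9.0]),
}  # every item has the same total utility: 13

def prefers(u, v):
    # Compare only on the most salient feature: the coordinate
    # where the two items differ the most.
    salient = np.argmax(np.abs(u - v))
    return u[salient] > v[salient]

print(prefers(items["a"], items["b"]))  # True: a > b, decided by feature 0
print(prefers(items["b"], items["c"]))  # True: b > c, decided by feature 1
print(prefers(items["c"], items["a"]))  # True: c > a, decided by feature 2 — a cycle
```

No single utility vector can rationalize these three comparisons, which is the kind of "irrational" data the salient feature preference model is designed to explain.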