Legal Theorems of Privacy

Kobbi Nissim
Georgetown University
April 13, 2020
There are significant gaps between legal and technical thinking around data privacy. Technical standards such as k-anonymity and differential privacy are described using mathematical language whereas legal standards are not rigorous from a mathematical point of view and often resort to concepts such as de-identification and anonymization which they only partially define. As a result, arguments about the adequacy of technical privacy measures for satisfying legal privacy often lack rigor, and their conclusions are uncertain.
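The mathematical rigor of a standard like differential privacy can be made concrete with a small sketch. Below is a minimal, illustrative implementation of the classic Laplace mechanism (not drawn from the talk): a numeric query is released with noise calibrated to its sensitivity and a privacy parameter epsilon.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release true_value with Laplace noise of scale sensitivity/epsilon,
    which satisfies epsilon-differential privacy for this query."""
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# A counting query has sensitivity 1: adding or removing one person
# changes the count by at most 1.
count = 42
noisy_count = laplace_mechanism(count, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the legal question the talk raises is whether such a precisely quantified guarantee satisfies standards phrased in terms like "de-identification."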

A New Topological Symmetry of Asymptotically Flat Spacetimes

Uri Kol
New York University
April 13, 2020
Abstract: I will show that the isometry group of asymptotically flat spacetimes contains, in addition to the BMS group, a new dual supertranslation symmetry. The corresponding new conserved charges are akin to the large magnetic U(1) charges in QED. They factorize the Hilbert space of asymptotic states into distinct super-selection sectors and reveal a rich topological structure exhibited by the asymptotic metric.

Towards Robust Artificial Intelligence

Pushmeet Kohli
April 15, 2020
Deep learning has driven rapid progress in machine learning and artificial intelligence, dramatically improving solutions to many challenging problems such as image understanding, speech recognition, and control. Despite these remarkable successes, researchers have observed some intriguing and troubling aspects of the behaviour of these models. A case in point is the presence of adversarial examples, which make learning-based systems fail in unexpected ways.
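To illustrate the phenomenon (this sketch is not from the talk), consider the fast gradient sign method on a linear classifier f(x) = w·x: because the gradient of the logit with respect to the input is just w, a tiny, worst-case L-infinity perturbation of size eps moves the logit as far as possible toward the wrong class.

```python
import numpy as np

def fgsm_perturb(x, w, eps):
    """Illustrative FGSM step for a linear model: push each coordinate
    by eps in the direction that increases the logit w @ x."""
    return x + eps * np.sign(w)

w = np.array([1.0, -2.0, 0.5])
x = np.array([0.1, 0.2, 0.0])     # logit w @ x = -0.3, classified as class 0
x_adv = fgsm_perturb(x, w, eps=0.2)  # small perturbation flips the logit's sign
```

For deep networks the gradient is computed by backpropagation rather than read off directly, but the same mechanism produces inputs that look unchanged to humans yet are misclassified.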

A snapshot of few-shot classification

Richard Zemel
April 15, 2020
Few-shot classification, the task of adapting a classifier to unseen classes given a small labeled dataset, is an important step on the path toward human-like machine learning. I will present some of the key advances in this area, and will then focus on the fundamental issue of overfitting in the few-shot scenario. Bayesian methods are well-suited to tackling this issue because they allow practitioners to specify prior beliefs and update those beliefs in light of observed data.
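The Bayesian recipe of specifying a prior and updating it on observed data can be shown in miniature (an illustrative sketch, not the talk's method) with a Beta-Bernoulli model: with only a handful of labeled examples, the posterior mean shrinks the raw estimate toward the prior, which is exactly the kind of regularization that combats overfitting in the few-shot regime.

```python
from fractions import Fraction

def beta_posterior_mean(a, b, k, n):
    """Posterior mean of a Bernoulli parameter under a Beta(a, b) prior
    after observing k successes in n trials: Beta(a+k, b+n-k)."""
    return Fraction(a + k, a + b + n)

# With 3 positives out of 5 shots, the raw estimate 3/5 is pulled
# toward the prior mean 1/2, giving 5/9.
est = beta_posterior_mean(2, 2, 3, 5)
```

As more data arrives, the likelihood dominates and the posterior mean approaches the empirical frequency; with few shots, the prior carries most of the weight.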

Iterative Random Forests (iRF) with applications to genomics and precision medicine

Bin Yu
April 15, 2020
Genomics has revolutionized biology, enabling the interrogation of whole transcriptomes, genome-wide binding sites for proteins, and many other molecular processes. However, individual genomic assays measure elements that interact in vivo as components of larger molecular machines. Understanding how these high-order interactions drive gene expression presents a substantial statistical challenge.

Generative Modeling by Estimating Gradients of the Data Distribution

Stefano Ermon
April 15, 2020
Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework allows flexible energy-based model architectures and requires neither sampling during training nor adversarial training methods.
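Once a score function is available, samples can be drawn with Langevin dynamics. A minimal sketch (using a known analytic score rather than the paper's learned score network): for a Gaussian N(mu, sigma^2), the score is (mu - x) / sigma^2, and iterating small gradient steps plus noise drives particles toward the target distribution.

```python
import numpy as np

def langevin_sample(score, x0, step=0.01, n_steps=2000, rng=None):
    """Unadjusted Langevin dynamics: x <- x + step*score(x) + sqrt(2*step)*noise."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.normal(size=np.shape(x))
    return x

# Analytic score of N(3, 1), standing in for a learned score network.
mu, sigma = 3.0, 1.0
score = lambda x: (mu - x) / sigma ** 2

samples = langevin_sample(score, x0=np.zeros(5000), rng=np.random.default_rng(0))
```

The appeal of the score-based view is that the same sampler works for any distribution whose score can be estimated, without ever normalizing a density.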

Evaluating Lossy Compression Rates of Deep Generative Models

Roger Grosse
April 15, 2020
Implicit generative models such as GANs have achieved remarkable progress at generating convincing fake images, but how well do they really match the data distribution? Log-likelihood has been used extensively to evaluate generative models whenever it’s convenient to do so, but measuring log-likelihoods for implicit generative models presents computational challenges. Furthermore, in order to obtain a density, one needs to smooth the distribution using a noise model (typically Gaussian), and this choice is hard to motivate.
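The Gaussian smoothing step can be illustrated with the classic Parzen-window estimator (an illustrative sketch, not the talk's proposed method): model samples are convolved with an isotropic Gaussian of width sigma to produce a density whose log can be evaluated at test points. The arbitrariness of sigma is precisely the weakness noted above.

```python
import numpy as np

def parzen_logpdf(x, samples, sigma):
    """Log of a Gaussian kernel density estimate at x, built from model samples.
    Each sample contributes a N(sample, sigma^2 I) kernel."""
    d = samples.shape[1]
    sq = np.sum((x - samples) ** 2, axis=1) / (2 * sigma ** 2)
    log_kernels = -0.5 * d * np.log(2 * np.pi * sigma ** 2) - sq
    m = log_kernels.max()  # log-sum-exp trick for numerical stability
    return m + np.log(np.mean(np.exp(log_kernels - m)))

# Stand-in for samples from an implicit model: draws from a 2D standard normal.
rng = np.random.default_rng(0)
model_samples = rng.normal(size=(20000, 2))
ll = parzen_logpdf(np.zeros(2), model_samples, sigma=0.5)
```

Different choices of sigma can reorder which models appear to have higher likelihood, which is one motivation for seeking alternative evaluation criteria such as lossy compression rates.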

Local-global compatibility in the crystalline case

Ana Caraiani
Imperial College
April 16, 2020
Let F be a CM field. Scholze constructed Galois representations associated to classes in the cohomology of locally symmetric spaces for GL_n/F with p-torsion coefficients. These Galois representations are expected to satisfy local-global compatibility at primes above p. Even the precise formulation of this property is subtle in general, and uses Kisin’s potentially semistable deformation rings. However, this property is crucial for proving modularity lifting theorems. I will discuss joint work with J.

Do Simpler Models Exist and How Can We Find Them?

Cynthia Rudin
April 16, 2020
While the trend in machine learning has been towards more complex hypothesis spaces, it is not clear that this extra complexity is always necessary or helpful for many domains. In particular, models and their predictions are often made easier to understand by adding interpretability constraints. These constraints shrink the hypothesis space; that is, they make the model simpler. Statistical learning theory suggests that generalization may improve as a result as well. However, adding extra constraints can make optimization (exponentially) harder.