Computer Science and Discrete Mathematics (CSDM)


The Non-Stochastic Control Problem

Elad Hazan
Princeton University
May 18, 2020
Linear dynamical systems are a continuous subclass of reinforcement learning models that are widely used in robotics, finance, engineering, and meteorology. Classical control, since the work of Kalman, has focused on dynamics with Gaussian i.i.d. noise, quadratic loss functions and, in terms of provably efficient algorithms, known systems and observed state. We'll discuss how to apply new machine learning methods which relax all of the above: efficient control with adversarial noise, general loss functions, unknown systems, and partial observation.
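As a concrete picture of the model class being discussed, the following sketch simulates a linear dynamical system x_{t+1} = A x_t + B u_t + w_t under a fixed linear controller, with a bounded adversarial (non-Gaussian) disturbance. The matrices A, B, the controller K, and the horizon are illustrative choices of mine, not from the talk.

```python
import numpy as np

# A minimal sketch of a linear dynamical system with adversarial noise:
#   x_{t+1} = A x_t + B u_t + w_t
# Here w_t is a bounded, sign-alternating disturbance rather than
# Gaussian i.i.d. noise. All numbers below are illustrative assumptions.

A = np.array([[1.0, 0.1],
              [0.0, 1.0]])    # system dynamics
B = np.array([[0.0],
              [0.1]])         # control input map
K = np.array([[-1.0, -2.0]])  # a fixed linear state-feedback controller

def rollout(T=50):
    """Roll the system forward under u_t = K x_t, accumulating a quadratic loss."""
    x = np.array([1.0, 0.0])
    total_cost = 0.0
    for t in range(T):
        u = K @ x                                               # control action
        w = 0.01 * np.array([(-1.0) ** t, (-1.0) ** (t + 1)])   # adversarial disturbance
        total_cost += float(x @ x + u @ u)                      # per-step loss
        x = A @ x + B @ u + w
    return total_cost, x

cost, x_T = rollout()  # the chosen K stabilizes this system, so the state decays
```

The per-step loss here is quadratic only for simplicity; the point of the non-stochastic setting is that both the loss and the disturbance sequence may be far more general.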

Convex Set Disjointness, Distributed Learning of Halfspaces, and Linear Programming

Shay Moran
Member, School of Mathematics
May 12, 2020
Distributed learning protocols are designed to train on distributed data without gathering it all on a single centralized machine, thus contributing to the efficiency of the system and enhancing its privacy. We study a central problem in distributed learning, called Distributed Learning of Halfspaces: let U \subset R^d be a known domain of size n and let h:R^d -> R be an unknown target affine function. A set of examples {(u,b)} is distributed between several parties, where u \in U is a point and b = sign(h(u)) \in {-1, +1} is its label.
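The setup above can be made concrete with a small sketch: a known domain U of n points, a hidden affine function h, and the labeled examples (u, sign(h(u))) split between parties. All the specific numbers and the two-party split are illustrative assumptions of mine.

```python
import numpy as np

# Illustrative sketch of the Distributed Learning of Halfspaces setup:
# a known domain U of n points in R^d, an unknown affine h, and examples
# (u, sign(h(u))) distributed between parties. Parameters are assumptions.

rng = np.random.default_rng(0)
d, n = 2, 8
U = rng.standard_normal((n, d))              # known domain of size n

w_true, b_true = np.array([1.0, -2.0]), 0.5  # hidden affine h(u) = <w, u> + b

def label(u):
    """b = sign(h(u)) in {-1, +1}; sign(0) is taken as +1 for concreteness."""
    return 1 if w_true @ u + b_true >= 0 else -1

# Distribute the labeled examples between two parties.
examples = [(u, label(u)) for u in U]
party_A, party_B = examples[: n // 2], examples[n // 2:]
```

The protocol-design question is then how much communication the parties need to jointly learn a halfspace consistent with all the examples.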

Using discrepancy theory to improve the design of randomized controlled trials

Daniel Spielman
Yale University
May 11, 2020
In randomized experiments, such as medical trials, we randomly assign the treatment, such as a drug or a placebo, that each experimental subject receives. Randomization can help us accurately estimate the difference in treatment effects with high probability. We also know that we want the two groups to be similar: ideally the two groups would be similar in every statistic we can measure beforehand. Recent advances in algorithmic discrepancy theory allow us to divide subjects into groups with similar statistics.
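To illustrate the balancing problem, the toy sketch below assigns 100 subjects to two groups and measures the signed discrepancy of a single covariate, comparing pure randomization against a simple greedy balancing rule. The data and the greedy rule are my illustrations; the algorithmic discrepancy methods in the talk give far stronger guarantees than this greedy heuristic.

```python
import numpy as np

# Toy version of the design question: split subjects so that a pre-treatment
# covariate is balanced between treatment (+1) and control (-1).
# All numbers are illustrative assumptions.

rng = np.random.default_rng(1)
ages = rng.uniform(20, 80, size=100)  # one covariate per subject

def imbalance(signs):
    """Signed covariate discrepancy | sum_i s_i * age_i | with s_i in {-1, +1}."""
    return abs(float(signs @ ages))

# Pure randomization.
random_signs = rng.choice([-1, 1], size=ages.size)

# Greedy balancing: process subjects in decreasing covariate order, always
# assigning each one to whichever group currently has the smaller total.
order = np.argsort(-ages)
greedy_signs = np.zeros(ages.size)
running = 0.0
for i in order:
    s = -1 if running > 0 else 1
    greedy_signs[i] = s
    running += s * ages[i]
```

For a single covariate the greedy rule already keeps the imbalance below the largest covariate value; with many covariates at once, this is exactly where discrepancy theory (and its recent algorithmic versions) enters.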

Cutting Planes Proofs of Tseitin and Random Formulas

Noah Fleming
University of Toronto
May 5, 2020
Proof Complexity studies the length of proofs of propositional tautologies in various restricted proof systems. One of the most well-studied is the Cutting Planes proof system, which captures reasoning that can be expressed using linear inequalities. A series of papers proved lower bounds on the length of Cutting Planes proofs using the method of feasible interpolation, whereby proving lower bounds on the size of Cutting Planes proofs of a certain restricted class of formulas is reduced to proving monotone circuit lower bounds.
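To make the proof system concrete, the snippet below checks one instance of the division rule of Cutting Planes, which is the step that goes beyond purely linear reasoning: if every coefficient of an integer inequality a·x <= c is divisible by k > 0, then (a/k)·x <= floor(c/k) is sound over the integers. The example inequality is my own illustration.

```python
from math import floor

# One inference step of the Cutting Planes proof system (the division rule):
# from a*x <= c with all coefficients divisible by k > 0, derive
# (a/k)*x <= floor(c/k), which is sound for integer-valued x.

def divide_and_round(coeffs, c, k):
    """Apply the Cutting Planes division rule to coeffs . x <= c."""
    assert k > 0 and all(a % k == 0 for a in coeffs)
    return [a // k for a in coeffs], floor(c / k)

# From 2x + 2y <= 3 derive x + y <= 1: the rounding of 3/2 down to 1 is
# exactly the integer reasoning that a purely linear relaxation misses.
coeffs, c = divide_and_round([2, 2], 3, 2)
```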

Local Statistics, Semidefinite Programming, and Community Detection

Prasad Raghavendra
University of California, Berkeley
May 4, 2020
We propose a new hierarchy of semidefinite programming relaxations for inference problems. As test cases, we consider the problem of community detection in block models. The vertices are partitioned into k communities, and a graph is sampled conditional on a prescribed number of inter- and intra-community edges.
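A small sampler in the spirit of the block model above: n vertices partitioned into k communities, with edges appearing more often inside communities than across them. The talk's model conditions on exact inter- and intra-community edge counts; the independent-edge version below is the standard simplification, and all parameters are illustrative.

```python
import numpy as np
from itertools import combinations

# Toy stochastic block model sampler: each of the C(n, 2) potential edges
# is included with probability p_in inside a community and p_out across
# communities. (The talk conditions on exact edge counts instead.)

rng = np.random.default_rng(2)
n, k = 12, 2
p_in, p_out = 0.9, 0.1

community = rng.integers(0, k, size=n)  # partition of vertices into k communities
edges = []
for u, v in combinations(range(n), 2):
    p = p_in if community[u] == community[v] else p_out
    if rng.random() < p:
        edges.append((u, v))
```

The inference task is then the reverse direction: given only `edges`, recover `community` as well as information-theoretically possible, which is where the semidefinite programming hierarchy comes in.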

A Framework for Quadratic Form Maximization over Convex Sets

Vijay Bhattiprolu
Member, School of Mathematics
April 28, 2020
We investigate the approximability of the following optimization problem, whose input is an n-by-n matrix A and an origin-symmetric convex set C that is given by a membership oracle: "Maximize the quadratic form x^T A x as x ranges over C."

This is a rich and expressive family of optimization problems; for different choices of forms A and convex bodies C it includes a diverse range of interesting combinatorial and continuous optimization problems, among them max-cut and Grothendieck's inequality.
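For one concrete instance of the family, take C to be the hypercube [-1, 1]^n: maximizing x^T A x then amounts to optimizing over the 2^n sign vectors, which is the max-cut regime for suitable A. The brute-force sketch and the matrix below are my illustrations.

```python
import numpy as np
from itertools import product

# One instance of the family above: C = [-1, 1]^n. Since x^T A x is maximized
# at an extreme point of the cube, brute force over all sign vectors suffices
# for tiny n. The matrix A is an illustrative choice.

A = np.array([[ 0., -1., -1.],
              [-1.,  0.,  1.],
              [-1.,  1.,  0.]])

def max_quadratic_form(A):
    """Maximize x^T A x over x in {-1, +1}^n, the extreme points of the cube."""
    n = A.shape[0]
    best, best_x = -np.inf, None
    for signs in product([-1.0, 1.0], repeat=n):
        x = np.array(signs)
        val = float(x @ A @ x)
        if val > best:
            best, best_x = val, x
    return best, best_x

best, best_x = max_quadratic_form(A)
```

The question in the talk is what approximation guarantees efficient algorithms can achieve when C is an arbitrary origin-symmetric convex body accessed only through a membership oracle, where brute force is of course unavailable.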

Graph and Hypergraph Sparsification

Luca Trevisan
Bocconi University
April 27, 2020
A weighted graph H is a sparsifier of a graph G if H has far fewer edges than G and, in an appropriate technical sense, H "approximates" G. Sparsifiers are useful as compressed representations of graphs and to speed up certain graph algorithms. In a "cut sparsifier," the notion of approximation is that every cut is crossed by approximately the same number of edges in G as in H. In a "spectral sparsifier" a stronger, linear-algebraic, notion of approximation holds. Similar definitions can be given for hypergraphs.
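The cut-sparsifier definition can be checked numerically on a toy pair of graphs (my example, chosen so it is small enough to enumerate): G is the complete graph K4 with 6 edges, and H is a 4-cycle with each edge given weight 1.5, so H has fewer edges yet every cut weight in H is within a factor of 1.5 of the corresponding cut size in G.

```python
from itertools import combinations

# Toy check of the cut-sparsifier definition: G = K4 (6 unweighted edges),
# H = a 4-cycle with edge weight 1.5 (4 edges). We enumerate every cut
# (S, V \ S) and compare the crossing weight in H to the crossing count in G.

n = 4
G_weights = {(u, v): 1.0 for u, v in combinations(range(n), 2)}   # K4
H_weights = {(0, 1): 1.5, (1, 2): 1.5, (2, 3): 1.5, (3, 0): 1.5}  # weighted 4-cycle

def crossing(edge_weights, S):
    """Total weight of edges with exactly one endpoint in S."""
    return sum(w for (u, v), w in edge_weights.items() if (u in S) != (v in S))

ratios = []
for r in range(1, n):
    for S in combinations(range(n), r):
        S = set(S)
        ratios.append(crossing(H_weights, S) / crossing(G_weights, S))
```

Here every ratio lies in [0.75, 1.5], so this H is a (multiplicative) cut sparsifier of K4 with quality 1.5; the results in the talk concern how few edges such an H can have for general graphs and hypergraphs.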

Geodesically Convex Optimization (or, can we prove P!=NP using gradient descent)

Avi Wigderson
Herbert H. Maass Professor, School of Mathematics
April 21, 2020
This talk aims to summarize a project I was involved in during the past 5 years, with the hope of explaining our most complete understanding so far, as well as challenges and open problems. The main messages of this project are summarized below; I plan to describe, through examples, many of the concepts they refer to, and the evolution of ideas leading to them. No special background is assumed.

Legal Theorems of Privacy

Kobbi Nissim
Georgetown University
April 13, 2020
There are significant gaps between legal and technical thinking around data privacy. Technical standards such as k-anonymity and differential privacy are described using mathematical language, whereas legal standards are not rigorous from a mathematical point of view and often resort to concepts such as de-identification and anonymization, which they only partially define. As a result, arguments about the adequacy of technical privacy measures for satisfying legal privacy often lack rigor, and their conclusions are uncertain.
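As an example of the mathematical precision on the technical side, the sketch below implements the Laplace mechanism, the textbook way to answer a counting query under epsilon-differential privacy: a count has sensitivity 1, so adding Laplace(1/epsilon) noise satisfies the definition. The dataset and epsilon are illustrative assumptions.

```python
import numpy as np

# The Laplace mechanism for epsilon-differential privacy: for a counting
# query (sensitivity 1), releasing true_count + Laplace(0, 1/epsilon)
# satisfies the epsilon-DP definition. Data and epsilon are illustrative.

rng = np.random.default_rng(3)

def private_count(data, predicate, epsilon):
    """Release a count with Laplace noise scaled to sensitivity / epsilon."""
    true_count = sum(1 for x in data if predicate(x))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [34, 29, 41, 55, 62, 38]
noisy = private_count(ages, lambda a: a >= 40, epsilon=1.0)  # true count is 3
```

The definition makes a crisp, quantitative promise about any two datasets differing in one record; the talk's question is how (and whether) such a guarantee can be matched to the partially defined legal notions above.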