Computer Science and Discrete Mathematics (CSDM)


A Constant-factor Approximation Algorithm for the Asymmetric Traveling Salesman Problem

Ola Svensson
École polytechnique fédérale de Lausanne
January 23, 2018

We give a constant-factor approximation algorithm for the asymmetric traveling salesman problem. Our approximation guarantee is analyzed with respect to the standard LP relaxation, and thus our result confirms the conjectured constant integrality gap of that relaxation.
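For context, the standard LP relaxation referred to here is the Held–Karp relaxation of ATSP. As a sketch, for a digraph $G = (V, A)$ with nonnegative arc costs $c$ it reads:

```latex
\begin{aligned}
\min \quad & \sum_{a \in A} c_a x_a \\
\text{s.t.} \quad & x(\delta^+(v)) = x(\delta^-(v)) = 1 && \forall\, v \in V, \\
& x(\delta^+(S)) \ge 1 && \forall\, \emptyset \ne S \subsetneq V, \\
& x \ge 0,
\end{aligned}
```

where $\delta^+(S)$ denotes the set of arcs leaving $S$. The integrality gap of this relaxation is the quantity the abstract asserts is constant.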

The Matching Problem in General Graphs is in Quasi-NC

Ola Svensson
École polytechnique fédérale de Lausanne
January 22, 2018

We show that the perfect matching problem in general graphs is in Quasi-NC. That is, we give a deterministic parallel algorithm which runs in polylogarithmic time on quasi-polynomially many processors. The result is obtained by a derandomization of the Isolation Lemma for perfect matchings, which was introduced in the classic paper by Mulmuley, Vazirani and Vazirani to obtain a randomized NC algorithm.
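As a toy illustration of the Isolation Lemma itself (not of the talk's derandomization): if each of the m edges gets an independent uniform weight in {1, ..., 2m}, the minimum-weight perfect matching is unique with probability at least 1 - m/(2m) = 1/2. A Python sketch on K_6, with all function names invented for illustration:

```python
import random

def perfect_matchings(n):
    """All perfect matchings of the complete graph K_n (n even),
    as frozensets of edges (i, j) with i < j."""
    def rec(verts):
        if not verts:
            yield frozenset()
            return
        v = verts[0]
        for u in verts[1:]:
            rest = [w for w in verts[1:] if w != u]
            for m in rec(rest):
                yield m | {(v, u)}
    return list(rec(list(range(n))))

def isolation_trial(n, R, rng):
    """One trial: draw uniform edge weights in {1, ..., R}; return True
    iff the minimum-weight perfect matching is unique."""
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
    w = {e: rng.randrange(1, R + 1) for e in edges}
    totals = sorted(sum(w[e] for e in m) for m in perfect_matchings(n))
    return len(totals) == 1 or totals[0] < totals[1]

rng = random.Random(0)
n = 6                      # K_6 has 15 perfect matchings
m = n * (n - 1) // 2       # number of edges
trials = 2000
unique = sum(isolation_trial(n, 2 * m, rng) for _ in range(trials))
print(unique / trials)     # empirically well above the 1/2 guarantee
```

The derandomization in the talk replaces these random weights by deterministically constructed weight functions; the sketch above only demonstrates what isolation means.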

A PSPACE construction of a hitting set for the closure of small algebraic circuits

Amir Shpilka
Tel Aviv University
December 12, 2017

We study the complexity of constructing a hitting set for the class of polynomials that can be infinitesimally approximated by polynomials that are computed by polynomial-sized algebraic circuits, over the real or complex numbers. Specifically, we show that there is a PSPACE algorithm that, given $n, s, r$ in unary, outputs a set of inputs of size poly(n, s, r), with poly(n, s, r) bit complexity, that hits all $n$-variate polynomials of degree $r$ that are limits of size-$s$ algebraic circuits.
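For contrast with the polynomial-size construction of the abstract, recall the classical but exponential-size hitting set: the grid {0, ..., r}^n hits every nonzero polynomial whose degree in each variable is at most r (and total degree at most r implies this). A small Python check, with illustrative names and randomly generated polynomials:

```python
import itertools
import random

def eval_poly(coeffs, point):
    """Evaluate a polynomial given as {exponent-tuple: coefficient}."""
    total = 0
    for exps, c in coeffs.items():
        term = c
        for x, e in zip(point, exps):
            term *= x ** e
        total += term
    return total

def grid_hits(coeffs, n, r):
    """True iff some point of the grid {0, ..., r}^n gives a nonzero value."""
    return any(eval_poly(coeffs, p) != 0
               for p in itertools.product(range(r + 1), repeat=n))

rng = random.Random(1)
n, r = 3, 4
for _ in range(100):
    # random nonzero polynomial: 5 distinct monomials, nonzero integer
    # coefficients, degree at most r in each variable
    monomials = rng.sample(list(itertools.product(range(r + 1), repeat=n)), 5)
    coeffs = {m: rng.choice([c for c in range(-9, 10) if c != 0])
              for m in monomials}
    assert grid_hits(coeffs, n, r)   # the grid always hits it
```

The grid has (r + 1)^n points, so the whole difficulty addressed in the talk is achieving poly(n, s, r) size for the restricted class of (limits of) small circuits.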

Recent advances in high dimensional robust statistics

Daniel Kane
University of California, San Diego
December 11, 2017

It is classically understood how to learn the parameters of a Gaussian even in high dimensions from independent samples. However, estimators like the sample mean are very fragile to noise. In particular, a single corrupted sample can arbitrarily distort the sample mean. More generally, we would like to be able to estimate the parameters of a distribution even if a small fraction of the samples are corrupted, potentially adversarially.
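The fragility of the sample mean, and the contrast with a robust one-dimensional estimator, fits in a few lines of Python (a toy sketch; in high dimensions no equally simple fix exists, which is what motivates the robust estimators the talk surveys):

```python
import random
import statistics

rng = random.Random(0)
# 999 clean samples from N(0, 1) plus one adversarially corrupted sample
clean = [rng.gauss(0.0, 1.0) for _ in range(999)]
corrupted = clean + [1e6]

# A single outlier drags the mean far from the true value 0 ...
print(statistics.mean(corrupted))    # roughly 1000
# ... while the median barely moves
print(statistics.median(corrupted))  # still close to 0
```

In one dimension the median has breakdown point 1/2; the high-dimensional analogues discussed in the talk aim for comparable robustness without the exponential blow-up of naive generalizations.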

A practical guide to deep learning

Richard Zemel
University of Toronto; Visitor, School of Mathematics
November 21, 2017

Neural networks have been around for many decades. An important question is what has led to their recent surge in performance and popularity. I will start with an introduction to deep neural networks, covering the terminology and standard approaches to constructing networks. I will focus on the two primary, very successful forms of networks: deep convolutional nets, as originally developed for vision problems; and recurrent networks, for speech and language tasks.
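As a minimal illustration of the core operation in a convolutional net, a Python sketch (function name and example data invented): a small 2-D kernel slid over an image, here acting as a vertical-edge detector.

```python
def conv2d(image, kernel):
    """Valid cross-correlation of a 2-D image with a small kernel,
    the core operation of a convolutional layer (no padding, stride 1)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A tiny image: dark left half, bright right half
image = [[0, 0, 1, 1]] * 4
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))   # responds only at the vertical edge
```

A real convolutional layer applies many such learned kernels in parallel and follows them with a nonlinearity; this sketch shows only the sliding-window arithmetic.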

Learning models: connections between boosting, hard-core distributions, dense models, GAN, and regularity II

Russell Impagliazzo
University of California, San Diego
November 14, 2017

A theme that cuts across many domains of computer science and mathematics is to find simple representations of complex mathematical objects such as graphs, functions, or distributions on data. These representations need to capture how the object interacts with a class of tests, and to approximately determine the outcome of these tests.
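One concrete instance of this theme is boosting: a small weighted combination of weak "tests" whose predictions approximately determine the labels of a complex target function. A toy AdaBoost-style sketch in Python, with all names and the example data invented for illustration:

```python
import math

def stump(threshold, sign):
    """A simple 'test': a threshold function on one real feature."""
    return lambda x: sign if x >= threshold else -sign

def adaboost(points, labels, candidates, rounds):
    """Boosting: combine a few weak tests into one accurate predictor.
    Multiplicative reweighting forces each round to focus on the
    examples the current combination still gets wrong."""
    n = len(points)
    w = [1.0 / n] * n
    ensemble = []                      # list of (alpha, hypothesis)
    for _ in range(rounds):
        # pick the candidate test with the smallest weighted error
        h, err = min(((h, sum(wi for wi, x, y in zip(w, points, labels)
                              if h(x) != y)) for h in candidates),
                     key=lambda t: t[1])
        err = max(err, 1e-12)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # multiplicative reweighting, then renormalize
        w = [wi * math.exp(-alpha * y * h(x))
             for wi, x, y in zip(w, points, labels)]
        s = sum(w)
        w = [wi / s for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

# Toy target on the line: +1 on [2, 5), -1 elsewhere.  No single stump
# represents it, but a short boosted combination of stumps does.
points = [0, 1, 2, 3, 4, 5, 6, 7]
labels = [-1, -1, 1, 1, 1, -1, -1, -1]
candidates = [stump(t, s) for t in points for s in (1, -1)]
predict = adaboost(points, labels, candidates, rounds=3)
print([predict(x) for x in points] == labels)   # → True
```

The simple representation here is the three-term weighted vote; the connections the talk draws run this same reweighting argument against other classes of tests to obtain hard-core distributions, dense models, and regularity lemmas.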