Efficiently Learning Mixtures of Gaussians

Given data drawn from a mixture of multivariate Gaussians, a basic problem is to accurately estimate the mixture parameters. We provide a polynomial-time algorithm for this problem for any fixed number ($k$) of Gaussians in $n$ dimensions (even if they overlap), with provably minimal assumptions on the Gaussians and polynomial data requirements. In statistical terms, our estimator converges at an inverse polynomial rate, and no such estimator (even one allowed exponential time) was previously known for this problem (even in one dimension, restricted to two Gaussians). Our algorithm reduces the $n$-dimensional problem to the one-dimensional problem, where the method of moments is applied. Additionally, in order to prove correctness for our univariate learning algorithm, we develop a novel explanation for why the method of moments (due to Pearson in 1894) works, based on connections to algebraic geometry.
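To illustrate the univariate method of moments in its simplest form (this is a toy example for intuition, not the talk's algorithm): for a symmetric mixture $\tfrac{1}{2}\mathcal{N}(\mu, 1) + \tfrac{1}{2}\mathcal{N}(-\mu, 1)$, the second moment satisfies $\mathbb{E}[X^2] = \mu^2 + 1$, so $\mu$ can be recovered by matching the empirical second moment. The mixture structure assumed here (equal weights, unit variances, means $\pm\mu$) is chosen purely for illustration.

```python
import math
import random

random.seed(0)

# Toy symmetric mixture: X ~ 0.5 * N(+mu, 1) + 0.5 * N(-mu, 1)
mu_true = 2.0
n = 200_000
samples = [
    random.gauss(mu_true if random.random() < 0.5 else -mu_true, 1.0)
    for _ in range(n)
]

# Method of moments: E[X^2] = mu^2 + 1, so estimate mu = sqrt(m2 - 1)
# from the empirical second moment m2.
m2 = sum(x * x for x in samples) / n
mu_hat = math.sqrt(max(m2 - 1.0, 0.0))
print(f"estimated mu = {mu_hat:.3f}")  # close to mu_true = 2.0
```

The general problem is far harder: with unknown weights, means, and variances, higher moments are needed and the resulting polynomial system must be shown to have a well-conditioned solution, which is where the algebraic-geometric analysis mentioned above comes in.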

This is joint work with Adam Tauman Kalai and Gregory Valiant. This work appears as "Efficiently Learning Mixtures of Two Gaussians" (STOC 2010) and "Settling the Polynomial Learnability of Mixtures of Gaussians" (FOCS 2010).

Date

Affiliation

Massachusetts Institute of Technology