Joint work with Daniel Kane (UCSD) and Shachar Lovett (UCSD)

We construct near-optimal linear decision trees for a variety of decision problems in combinatorics and discrete geometry.

For example, for any constant $k$, we construct linear decision trees that solve the $k$-SUM problem on $n$ elements using $O(n \log^2 n)$ linear queries. This settles a problem studied by [Meyer auf der Heide '84, Meiser '93, Erickson '95, Ailon and Chazelle '05, Grønlund and Pettie '14, Gold and Sharir '15, Cardinal et al. '15, Ezra and Sharir '16] and others.
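For readers unfamiliar with the problem, here is a minimal sketch of what $k$-SUM asks (not the decision-tree algorithm from the abstract): given $n$ numbers, decide whether some $k$ of them sum to zero. The brute-force check below takes $O(n^k)$ time, which is the baseline the $O(n \log^2 n)$-query decision trees improve on in the query model.

```python
from itertools import combinations

def k_sum(xs, k):
    """Brute-force k-SUM: does some k-subset of xs sum to zero?

    Illustrative only: O(n^k) time, in contrast with the
    O(n log^2 n) linear queries used by the decision trees.
    """
    return any(sum(c) == 0 for c in combinations(xs, k))
```

For instance, `k_sum([1, 2, -3, 5], 3)` is true because $1 + 2 - 3 = 0$.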

The queries we use are comparison queries, which compare the sums of two $k$-subsets. When viewed as linear queries, comparison queries are $2k$-sparse and have only $\{-1,0,1\}$ coefficients. We give similar constructions for sorting sumsets $A+B$ and for deciding the SUBSET-SUM problem, both with an optimal number of queries, up to poly-logarithmic factors.
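The equivalence between comparison queries and sparse linear queries can be made concrete. In the sketch below (an illustration, not code from the paper), a comparison of two $k$-subsets $A$ and $B$ is the sign of a linear form whose coefficient vector has $+1$ on $A$ and $-1$ on $B$: at most $2k$ entries are nonzero, and all coefficients lie in $\{-1,0,1\}$.

```python
def comparison_query(xs, A, B):
    """Compare the sums of two index subsets A and B of xs.

    Returns the sign (-1, 0, or +1) of sum_{i in A} x_i - sum_{j in B} x_j.
    """
    sA = sum(xs[i] for i in A)
    sB = sum(xs[j] for j in B)
    return (sA > sB) - (sA < sB)

def as_linear_query(n, A, B):
    """The same query written as a linear-query coefficient vector.

    With |A| = |B| = k, at most 2k entries are nonzero and every
    coefficient is in {-1, 0, 1} (shared indices cancel to 0).
    """
    c = [0] * n
    for i in A:
        c[i] += 1
    for j in B:
        c[j] -= 1
    return c
```

For example, comparing the subsets $\{x_0, x_1\}$ and $\{x_2, x_3\}$ of a 5-element input corresponds to the coefficient vector $(1, 1, -1, -1, 0)$.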

Our constructions are based on the notion of ``inference dimension'', recently introduced by the authors in the context of active classification with comparison queries. This can be viewed as another contribution to the fruitful link between machine learning and discrete geometry, which goes back to the discovery of the VC dimension.