Compositional inductive biases in human function learning

Samuel J. Gershman
Harvard University
January 14, 2020

This talk presents evidence that humans learn complex functions by harnessing compositionality: complex structure is decomposed into simpler building blocks. I formalize this idea in the framework of Bayesian nonparametric regression, using a grammar over Gaussian process kernels, and compare it with alternative structure learning approaches. People consistently chose compositional (over non-compositional) extrapolations and interpolations of functions.
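
As a rough illustration of kernel composition (a minimal sketch assuming scikit-learn; the particular kernels and composite below are illustrative, not the grammar from the talk):

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, DotProduct

    # Base kernels as building blocks: linear trend (DotProduct),
    # periodic structure (ExpSineSquared), smooth variation (RBF).
    # A kernel grammar combines such blocks with + and *.
    kernel = DotProduct() + ExpSineSquared(periodicity=3.0) * RBF(length_scale=2.0)

    # Fit to toy data with linear-plus-periodic structure, then extrapolate
    # beyond the training range.
    X = np.linspace(0.0, 10.0, 50).reshape(-1, 1)
    y = 0.5 * X.ravel() + np.sin(2.0 * np.pi * X.ravel() / 3.0)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    X_new = np.linspace(10.0, 15.0, 20).reshape(-1, 1)
    mean, std = gp.predict(X_new, return_std=True)  # compositional extrapolation

Closure under addition and multiplication is what lets a small set of base kernels express structured hypotheses such as "linear trend plus locally periodic variation".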

How will we do mathematics in 2030?

Michael R. Douglas
Simons Center for Geometry and Physics, Stony Brook
December 17, 2019

We make the case that over the coming decade, computer-assisted reasoning will become far more widely used in the mathematical sciences. This includes interactive and automatic theorem verification, symbolic algebra, and emerging technologies such as formal knowledge repositories, semantic search, and intelligent textbooks.
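
To give a flavor of interactive theorem verification (an illustrative toy, not an example from the talk), here is a short machine-checked proof in Lean 4:

    -- Commutativity of addition on the natural numbers,
    -- verified by the Lean kernel rather than by a human referee.
    theorem add_comm_example (a b : Nat) : a + b = b + a :=
      Nat.add_comm a b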

Thresholds Versus Fractional Expectation-Thresholds

Keith Frankston
Rutgers University
December 16, 2019

Given an increasing family F in {0,1}^n, its measure under mu_p increases with p and often exhibits threshold behavior, growing quickly from near 0 to near 1 as p increases through a specific value p_c. Thresholds of families have been of great historical interest and a central focus of the study of random discrete structures (e.g. random graphs and hypergraphs), with the estimation of thresholds for specific properties the subject of some of the most challenging work in the area.
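
For concreteness, the standard definitions run as follows (a sketch of the usual conventions, assumed here rather than quoted from the talk):

    % Product measure: identify x in {0,1}^n with its set of 1-coordinates,
    % so |x| is the number of ones in x.
    \[
      \mu_p(x) = p^{|x|}(1-p)^{n-|x|},
      \qquad
      \mu_p(\mathcal{F}) = \sum_{x \in \mathcal{F}} \mu_p(x),
    \]
    % and the threshold p_c = p_c(\mathcal{F}) is conventionally the value at which
    \[
      \mu_{p_c}(\mathcal{F}) = \tfrac{1}{2}.
    \]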