## A time-space lower bound for a large class of learning problems

We prove a general time-space lower bound that applies to a large class of learning problems and shows that, for every problem in the class, any learning algorithm requires either a memory of quadratic size or an exponential number of samples. As a special case, this gives a new proof of the time-space lower bound for parity learning [R16].
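As a concrete illustration of the kind of statement involved, the parity-learning special case can be written as follows; this is a sketch of the known bound for parity learning (the quantitative constants here are illustrative, not taken from the abstract):

```latex
% Parity learning: an unknown x \in \{0,1\}^n is to be recovered from a
% stream of random samples (a_t, b_t), where each a_t is uniform in
% \{0,1\}^n and b_t = \langle a_t, x \rangle \bmod 2.
%
% Time-space lower bound (shape of the statement): any algorithm that
% learns x with non-negligible probability must use either
%
%   \Omega(n^2) bits of memory, \quad \text{or} \quad 2^{\Omega(n)} samples.
%
\[
  \text{memory} = \Omega(n^2)
  \qquad \text{or} \qquad
  \text{\#samples} = 2^{\Omega(n)}.
\]
```

The general result of the abstract replaces parity learning by any problem in the class considered, with the same memory-versus-samples dichotomy.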

## Rigid holomorphic curves are generically super-rigid

## Speculations about homological mirror symmetry for affine hypersurfaces

## IAS Facility Video

## Galois Representations for the general symplectic group

In a recent preprint with Sug Woo Shin (https://arxiv.org/abs/1609.04223), I construct Galois representations corresponding to cohomological cuspidal automorphic representations of general symplectic groups over totally real number fields, under the local hypothesis that there is a Steinberg component. In this talk I will explain the parts of this construction that involve the eigenvariety.

## The stabilized symplectic embedding problem

## The many forms of rigidity for symplectic embeddings

## On structure results for intertwining operators

## Applications of twisted technology

Recently, together with Durcik, Kovac, and Skreb, we proved variational estimates providing sharp quantitative norm-convergence results for bilinear ergodic averages with respect to two commuting transformations. The proof uses the so-called twisted technology developed in recent years for estimating bi-parameter paraproducts. Another application of the technique is to cancellation results for simplex Hilbert transforms.
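For orientation, the objects in question can be written down explicitly; the following is a sketch of the standard setup (the notation $S$, $T$ for the commuting transformations is an assumption, not taken from the abstract):

```latex
% Let (X, \mu) be a probability space and S, T : X \to X two commuting
% measure-preserving transformations (ST = TS). The bilinear ergodic
% averages of f, g \in L^\infty(X) are
\[
  A_N(f, g)(x) \;=\; \frac{1}{N} \sum_{n=1}^{N} f(S^n x)\, g(T^n x).
\]
% Norm convergence means A_N(f,g) converges in L^2(X) as N \to \infty.
% A variational estimate quantifies this convergence: for an exponent
% r > 2, one bounds
\[
  \Big\| \sup_{N_1 < N_2 < \cdots}
  \Big( \sum_{j} \| A_{N_{j+1}}(f,g) - A_{N_j}(f,g) \|_{L^2}^{r}
  \Big)^{1/r} \Big\|
  \;\lesssim\; \|f\|_{L^4} \, \|g\|_{L^4},
\]
% where the supremum runs over all increasing sequences of times. Such a
% bound controls the number of large fluctuations of the averages and
% hence upgrades convergence to a quantitative statement.
```

A variational bound of this shape implies norm convergence of $A_N(f,g)$ directly, since a sequence with finite $r$-variation is Cauchy.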