## Supervised learning with missing values

Julie Josse

September 8, 2020

Georgios Dimitroglou Rizell

Uppsala University

September 4, 2020

In joint work with Laurent Côté, we show the following result: any Lagrangian plane in the cotangent bundle of an open Riemann surface which coincides with a cotangent fibre outside of some compact subset is compactly supported Hamiltonian isotopic to that fibre. This result implies Hamiltonian unlinkedness for Lagrangian links in the cotangent bundle of a (possibly closed) Riemann surface whose components are Hamiltonian isotopic to fibres.

Lorenz Eberhardt

Member, School of Natural Sciences, Institute for Advanced Study

August 28, 2020

I discuss string theory on AdS3xS3xT4 in the tensionless limit, with one unit of NS-NS flux. This theory is conjectured to be dual to the symmetric product orbifold CFT. I show how to compute the full string partition function on various locally AdS3 backgrounds, such as thermal AdS3, the BTZ black hole, and conical defects, and find that it does not depend on the actual bulk background, only on the boundary geometry.

Inderjit Dhillon

University of Texas, Austin

August 27, 2020

Many challenging problems in modern applications amount to finding relevant results from an enormous output space of potential candidates, for example, finding the best matching product from a large catalog or suggesting related search phrases on a search engine. The size of the output space for these problems can be in the millions to billions. Moreover, observational or training data is often limited for many of the so-called “long-tail” items in the output space.

Piotr Indyk

Massachusetts Institute of Technology

August 25, 2020

Classical algorithms typically provide "one size fits all" performance, and do not leverage properties or patterns in their inputs. A recent line of work aims to address this issue by developing algorithms that use machine learning predictions to improve their performance. In this talk I will present two examples of this type, in the context of streaming and sketching algorithms.
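One concrete instance of this idea from the learning-augmented sketching literature is a frequency sketch in which a predictor exempts likely heavy hitters from the shared hash tables, giving them exact counters instead. Below is a minimal sketch of that pattern; it is an illustration, not the speaker's actual construction, and the `predicted_heavy` oracle (here just a user-supplied set) stands in for a learned model.

```python
import hashlib


class LearnedCountMin:
    """Count-Min sketch augmented with a (hypothetical) predictor:
    keys the predictor flags as heavy get exact counters, while all
    remaining keys share the usual hashed buckets."""

    def __init__(self, predicted_heavy, width=256, depth=4):
        self.heavy = {k: 0 for k in predicted_heavy}  # exact counters
        self.width, self.depth = width, depth
        self.tables = [[0] * width for _ in range(depth)]

    def _bucket(self, key, row):
        # One independent-looking hash per row via a row-salted digest.
        digest = hashlib.blake2b(f"{row}:{key}".encode(), digest_size=8).digest()
        return int.from_bytes(digest, "big") % self.width

    def add(self, key, count=1):
        if key in self.heavy:
            self.heavy[key] += count
        else:
            for row in range(self.depth):
                self.tables[row][self._bucket(key, row)] += count

    def estimate(self, key):
        if key in self.heavy:
            return self.heavy[key]  # exact for predicted-heavy keys
        # Standard Count-Min estimate: minimum over rows (never underestimates).
        return min(self.tables[row][self._bucket(key, row)]
                   for row in range(self.depth))
```

Routing the predicted heavy hitters away from the shared tables removes their large counts from the buckets, which is exactly what reduces the overestimation error for the remaining light keys.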

Andy Neitzke

Yale University

August 24, 2020

Over the last decade it has become clear that there is a close connection between the BPS sector of N=2 supersymmetric field theories in four dimensions and the exact WKB method for analysis of ordinary differential equations (Schrödinger equations and their higher-order analogues). I will review the basic players in this story and some of the main results, and describe some outstanding puzzles.

Jason Eisner

Johns Hopkins University

August 20, 2020

Suppose you are monitoring discrete events in real time. Can you predict what events will happen in the future, and when? Can you fill in past events that you may have missed? A probability model that supports such reasoning is the neural Hawkes process (NHP), in which the Poisson intensities of K event types at time t depend on the history of past events. This autoregressive architecture can capture complex dependencies. It resembles an LSTM language model over K word types, but allows the LSTM state to evolve in continuous time.
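The abstract's key architectural idea — event intensities driven by a hidden state that evolves deterministically between events — can be illustrated with a toy scalar version. This is a minimal sketch, not the actual NHP implementation: the real model uses vector LSTM-style gates, whereas here a single hidden value decays exponentially toward a target, and a softplus transfer keeps the intensity positive, as in the paper's parameterization. All names and the scalar decay form are illustrative assumptions.

```python
import math


def softplus(x):
    # Softplus transfer keeps the Poisson intensity strictly positive.
    return math.log1p(math.exp(x))


def hidden_state(h_event, h_target, decay_rate, dt):
    """Toy continuous-time state: between events, the hidden value decays
    from its post-event value h_event toward h_target at rate decay_rate.
    (The real NHP does this per-dimension inside a continuous-time LSTM.)"""
    return h_target + (h_event - h_target) * math.exp(-decay_rate * dt)


def intensity(w_k, h):
    # Intensity of event type k given the current hidden state.
    return softplus(w_k * h)
```

Because the state between events is a closed-form function of elapsed time, the intensity can be queried at any continuous time t, which is what lets the model both predict future events and score the likelihood of gaps where events may have been missed.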

Li Deng

Citadel

August 18, 2020

A brief review will be provided first on how deep learning has disrupted speech recognition and language processing industries since 2009. Then connections will be drawn between the techniques (deep learning or otherwise) for modeling speech and language and those for financial markets. Similarities and differences of these two fields will be explored. In particular, three unique technical challenges to financial investment are addressed: extremely low signal-to-noise ratio, extremely strong nonstationarity (with adversarial nature), and heterogeneous big data.

Xi Dong

University of California, Santa Barbara

August 17, 2020

I will present enhanced corrections to the entanglement entropy of a subsystem in holographic states. These corrections appear near phase transitions in the entanglement entropy due to competing extremal surfaces, and they are holographic avatars of similar corrections previously found in chaotic energy eigenstates. I will first show explicitly how to find these corrections in chaotic eigenstates by summing over contributions of all bulk saddle point solutions, including those that break the replica symmetry, and by making use of fixed-area states.

John Langford

Microsoft Research

August 13, 2020

There are three core orthogonal problems in reinforcement learning: (1) crediting actions, (2) generalizing across rich observations, and (3) exploring to discover the information necessary for learning. Good solutions to pairs of these problems are fairly well known at this point, but solutions for all three are just now being discovered. I’ll discuss several such results and dive into details on a few of them.