Learning transition states: approximation, sampling, and optimization with rare data

Applied Mathematics Seminar, Courant Institute of Mathematical Sciences


Date: Feb 28, 2020, 2:30 PM
Location: 1302 Warren Weaver Hall
[Image: Transition path]

The surprising flexibility and undeniable empirical success of machine learning algorithms have inspired many theoretical explanations for the efficacy of neural networks. Here, I will briefly introduce one perspective that provides not only asymptotic guarantees of trainability and accuracy in high-dimensional learning problems, but also some prescriptions and design principles for learning. Bolstered by the favorable scaling of these algorithms in high-dimensional problems, I will turn to a central problem in computational condensed matter physics: computing reaction pathways. From the perspective of an applied mathematician, these problems typically appear hopeless; they are not only high-dimensional, but also dominated by rare events. However, with neural networks in the toolkit, the dimensionality, at least, is somewhat less intimidating. I will describe an algorithm that combines stochastic gradient descent with importance sampling to optimize a neural-network representation of a reaction pathway for an arbitrary system. Finally, I will provide numerical evidence of the power and limitations of this approach.
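To give a flavor of the kind of loop the abstract describes, here is a minimal sketch of stochastic gradient descent combined with importance sampling on a toy two-dimensional double-well potential. Everything in it is an illustrative assumption rather than the algorithm presented in the talk: the potential V, the elastic-band-style objective, and all parameter choices are stand-ins. A small network parameterizes a path pinned at the two minima, arc-length positions s are sampled from a histogram that favors high-loss (barrier) regions, and gradients are reweighted so the estimator of the uniform-s objective stays unbiased.

```python
import torch

# Toy double-well energy surface (illustrative stand-in for a real system):
# minima at (-1, 0) and (1, 0), saddle at the origin with barrier height 1.
def V(x):
    return (x[..., 0] ** 2 - 1.0) ** 2 + 2.0 * x[..., 1] ** 2

a = torch.tensor([-1.0, 0.0])  # reactant minimum
b = torch.tensor([1.0, 0.0])   # product minimum

# The path phi(s) = (1 - s) a + s b + s (1 - s) NN(s) is pinned at the
# endpoints; the network only learns the correction in between.
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 2)
)

def path(s):  # s has shape (n, 1), values in [0, 1]
    return (1 - s) * a + s * b + s * (1 - s) * net(s)

opt = torch.optim.SGD(net.parameters(), lr=1e-2)
bins, batch, lam, eps = 64, 16, 1.0, 1e-3
hist = torch.ones(bins)  # running estimate of the loss in each s-bin

for step in range(3000):
    # Importance sampling: draw s preferentially from high-loss bins,
    # so the rare barrier region dominates the batch.
    q = hist / hist.sum()
    idx = torch.multinomial(q, batch, replacement=True)
    s = ((idx.float() + torch.rand(batch)) / bins).unsqueeze(1)

    # Elastic-band-style loss: path stiffness plus potential energy.
    x = path(s)
    tangent = (path(s + eps) - x) / eps  # finite-difference d(phi)/ds
    per_sample = (tangent ** 2).sum(-1) + lam * V(x)

    # Reweight by 1 / (bins * q) to undo the sampling bias.
    w = 1.0 / (bins * q[idx])
    loss = (per_sample * w).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

    with torch.no_grad():  # refresh the proposal with the new losses
        hist[idx] = (0.9 * hist[idx] + 0.1 * per_sample).clamp_min(1e-3)

with torch.no_grad():  # sanity check: recovered barrier should be ~1.0
    sg = torch.linspace(0, 1, 101).unsqueeze(1)
    print("estimated barrier:", V(path(sg)).max().item())
```

In the setting discussed in the talk, the objective and the sampling distribution would be tied to the physical dynamics of the system rather than to this toy loss; the sketch only illustrates the structural idea of biasing samples toward the rare, high-loss region while keeping the gradient estimate unbiased.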

Assistant Professor of Chemistry

Grant M. Rotskoff is an assistant professor of chemistry at Stanford University. He studies statistical mechanics with a focus on nonequilibrium phenomena.