Jerry Liu
ML + Numerics @ Stanford ICME & Hazy Lab · DOE CSGF Fellow · jerrywliu@stanford.edu

About
I’m a third-year PhD student in the Institute for Computational & Mathematical Engineering (ICME) at Stanford, advised by Chris Ré, and supported by the DOE Computational Science Graduate Fellowship. Previously, I completed my undergraduate degrees in Math and Computer Science at Duke, where I was advised by Cynthia Rudin. My path has been shaped by many kind and brilliant researchers, including Jin Yao, Kenny Weiss, Michael Mahoney, and Atri Rudra.
Research Interests
I’m broadly interested in building general-purpose machine learning models for science, with a focus on differential equations. Foundation models for language and vision have unlocked powerful new capabilities, but basic questions remain about their effectiveness on regression-style tasks and continuous-valued data. My recent work investigates the fundamental limitations of existing ML techniques and develops more principled approaches for numerical tasks.
Some topics I’m interested in:
- Numerical Precision: Why do current ML methods struggle with precise numerical operations, and how can we develop better algorithms/architectures?
- Generalization: What’s the right notion of generalization for continuous-valued regression tasks (e.g., PDEs)?
- Algorithmic Learning: How can ML methods learn generalizable, algorithmic knowledge directly from data?
Selected Publications
- Does In-Context Operator Learning Generalize to Domain-Shifted Settings? In The Symbiosis of Deep Learning and Differential Equations III @ NeurIPS, 2023.