Lechao Xiao

Lechao is a research scientist on the Brain team at Google, where he works on machine learning and deep learning. Prior to joining Google Brain, he was a Hans Rademacher Instructor of Mathematics at the University of Pennsylvania, where he worked on harmonic analysis. He earned his PhD in mathematics from the University of Illinois at Urbana-Champaign and his BA in pure and applied mathematics from Zhejiang University, Hangzhou, China. Lechao's research interests include the theory of machine learning and deep learning, optimization, Gaussian processes, and generalization. He is particularly interested in research problems that combine theory and practice. With his collaborators, he developed a mean field theory for convolutional neural networks, as well as several novel initialization methods (the orthogonal convolutional kernel and the delta-orthogonal kernel) that allow practitioners to train neural networks with more than 10,000 layers without relying on other common techniques.
Authored Publications
Fast Neural Kernel Embeddings for General Activations
Insu Han
Amir Zandieh
Jaehoon Lee
Roman Novak
Amin Karbasi
NeurIPS 2022 (to appear)
Dataset Distillation with Infinitely Wide Convolutional Networks
Jaehoon Lee
Roman Novak
Timothy Chieu Nguyen
NeurIPS 2021 (to appear)
Exploring the Uncertainty Properties of Neural Networks’ Implicit Priors in the Infinite-Width Limit
Ben Adlam
Jaehoon Lee
Jeffrey Pennington
ICLR 2021
The Surprising Simplicity of the Early-Time Learning Dynamics of Neural Networks
Ben Adlam
Jeffrey Pennington
Wei Hu
NeurIPS 2020
Neural Tangents: Fast and Easy Infinite Neural Networks in Python
Roman Novak
Jiri Hron
Jaehoon Lee
Alex Alemi
Jascha Sohl-Dickstein
Sam Schoenholz
ICLR 2020
Finite versus Infinite Neural Networks: An Empirical Study
Jaehoon Lee
Sam S. Schoenholz
Jeffrey Pennington
Ben Adlam
Roman Novak
Jascha Sohl-Dickstein
NeurIPS 2020
Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent
Jaehoon Lee
Sam Schoenholz
Roman Novak
Jascha Sohl-Dickstein
Jeffrey Pennington
NeurIPS 2019
Bayesian Deep Convolutional Networks with Many Channels are Gaussian Processes
Roman Novak
Jaehoon Lee
Greg Yang
Jiri Hron
Dan Abolafia
Jeffrey Pennington
Jascha Sohl-Dickstein
ICLR 2019