Ilya Tolstikhin

Between 2014 and 2018 I was a postdoc (and later a team lead) at the Empirical Inference Department of the Max Planck Institute for Intelligent Systems, Tübingen, Germany. I received a diploma (MSc equivalent) in 2010 from Lomonosov Moscow State University and a PhD in 2014 from the Dorodnicyn Computing Center of the Russian Academy of Sciences.
Currently I am mainly interested in understanding neural network training and generalization. Previously I worked on statistical learning theory and, more generally, on the theory of machine learning.

Authored Publications
Fine-Grained Distribution-Dependent Learning Curves
Jonathan Shafer, Shay Moran, Steve Hanneke
Proceedings of the Thirty-Sixth Conference on Learning Theory (COLT), PMLR 195:5890-5924, 2023
MLP-Mixer: An All-MLP Architecture for Vision
Neil Houlsby, Alexander Kolesnikov, Lucas Beyer, Xiaohua Zhai, Thomas Unterthiner, Jessica Yung, Jakob Uszkoreit, Alexey Dosovitskiy
NeurIPS 2021 (poster)
When can unlabeled data improve the learning rate?
Christina Göpfert, Shai Ben-David, Sylvain Gelly, Ruth Urner
COLT 2019
Practical and Consistent Estimation of f-Divergences
Paul Rubenstein, Josip Djolonga, Carlos Riquelme
NeurIPS 2019 (to appear)