Oliver T. Unke
After receiving his Ph.D. in Chemistry from the University of Basel in 2019, Oliver T. Unke became an SNSF postdoctoral research fellow in the Machine Learning Group at Technische Universität Berlin. In 2021, he became a Visiting Researcher at Google and joined Google full time in 2022. His research has focused mainly on developing methods that connect machine learning with quantum chemistry, e.g. for constructing accurate potential energy surfaces and their application in molecular dynamics simulations, or predicting the wave functions of molecules.
Authored Publications
Accurate global machine learning force fields for molecules with hundreds of atoms
Stefan Chmiela
Valentin Vassilev-Galindo
Adil Kabylda
Huziel E. Sauceda
Alexandre Tkatchenko
Science Advances, 9(2) (2023), eadf0873
Global machine learning force fields, with the capacity to capture collective interactions in molecular systems, now scale up to a few dozen atoms due to considerable growth of model complexity with system size. For larger molecules, locality assumptions are introduced, with the consequence that nonlocal interactions are not described. Here, we develop an exact iterative approach to train global symmetric gradient domain machine learning (sGDML) force fields (FFs) for several hundred atoms, without resorting to any potentially uncontrolled approximations. All atomic degrees of freedom remain correlated in the global sGDML FF, allowing the accurate description of complex molecules and materials that present phenomena with far-reaching characteristic correlation lengths. We assess the accuracy and efficiency of sGDML on a newly developed MD22 benchmark dataset containing molecules from 42 to 370 atoms. The robustness of our approach is demonstrated in nanosecond path-integral molecular dynamics simulations for supramolecular complexes in the MD22 dataset.
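The global-kernel idea behind sGDML can be illustrated with a deliberately simplified kernel force regression. This is a sketch only, not the actual sGDML model (which predicts forces as exact gradients of a symmetrized energy kernel and is trained iteratively for large systems); all function names and the plain RBF kernel are illustrative assumptions:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Pairwise squared distances between flattened molecular geometries
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit(X, F, sigma=1.0, lam=1e-10):
    # Solve the regularized linear system (K + lam*I) alpha = F;
    # every training geometry contributes to every prediction, so all
    # degrees of freedom remain globally correlated.
    K = rbf_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), F)

def predict(X_new, X_train, alpha, sigma=1.0):
    # Predicted forces for new geometries
    return rbf_kernel(X_new, X_train, sigma) @ alpha
```

For hundreds of atoms the kernel matrix grows quickly, which is why the paper replaces the direct solve with an exact iterative scheme.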
So3krates - Self-attention for higher-order geometric interactions on arbitrary length-scales
Thorben Frank
Advances in Neural Information Processing Systems (2022) (to appear)
The application of machine learning (ML) methods in quantum chemistry has enabled the study of numerous chemical phenomena, which are computationally intractable with traditional ab initio methods. However, some quantum mechanical properties of molecules and materials depend on non-local electronic effects, which are often neglected due to the difficulty of modelling them efficiently. This work proposes a modified attention mechanism adapted to the underlying physics, which recovers the relevant non-local effects. Namely, we introduce spherical harmonic coordinates (SPHCs) to reflect higher order geometric information for each atom in a molecule, enabling a non-local formulation of attention in the SPHC space. Our proposed model So3krates -- a self-attention based message passing neural network (MPNN) -- uncouples geometric information from atomic features, making them independently amenable to attention mechanisms. We show that, in contrast to other published methods, So3krates is able to describe quantum mechanical effects due to orbital overlap over arbitrary length scales. Further, So3krates matches or exceeds state-of-the-art performance on the popular MD-17 and QM-7X benchmarks, notably requiring significantly fewer parameters while at the same time being substantially faster than other models.
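A toy version of the SPHC idea: per-atom geometric coordinates built from spherical harmonics of neighbour directions rotate with the molecule, so distances between them are rotation-invariant and can drive non-local attention. This minimal sketch (an assumption for illustration, restricted to the degree-1 harmonics, which are just unit direction vectors; the real So3krates architecture is far richer) shows the principle:

```python
import numpy as np

def sphc_l1(R):
    # Degree-1 "spherical harmonic coordinates": for each atom, the sum of
    # unit direction vectors to all other atoms (proportional to the l=1
    # real spherical harmonics of each neighbour direction).
    n = len(R)
    chi = np.zeros_like(R)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = R[j] - R[i]
            chi[i] += d / np.linalg.norm(d)
    return chi

def sphc_attention(chi):
    # Attention weights from Euclidean distances in SPHC space; these
    # distances are invariant under global rotations of the molecule.
    d2 = ((chi[:, None, :] - chi[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2)
    return w / w.sum(-1, keepdims=True)
```

Because the weights depend only on SPHC-space distances, atoms that are far apart in real space can still attend to each other strongly, giving the non-local coupling the abstract describes.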
SpookyNet: Learning Force Fields with Electronic Degrees of Freedom and Nonlocal Effects
Stefan Chmiela
Michael Gastegger
Kristof T. Schütt
Huziel E. Sauceda
Nature Communications, 12 (2021), Article 7273
Machine-learned force fields combine the accuracy of ab initio methods with the efficiency of conventional force fields. However, current machine-learned force fields typically ignore electronic degrees of freedom, such as the total charge or spin state, and assume chemical locality, which is problematic when molecules have inconsistent electronic states, or when nonlocal effects play a significant role. This work introduces SpookyNet, a deep neural network for constructing machine-learned force fields with explicit treatment of electronic degrees of freedom and nonlocality, modeled via self-attention in a transformer architecture. Chemically meaningful inductive biases and analytical corrections built into the network architecture allow it to properly model physical limits. SpookyNet improves upon the current state-of-the-art (or achieves similar performance) on popular quantum chemistry data sets. Notably, it is able to generalize across chemical and conformational space and can leverage the learned chemical insights, e.g. by predicting unknown spin states, thus helping to close a further important remaining gap for today’s machine learning models in quantum chemistry.
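The point about electronic degrees of freedom can be made concrete with a toy sketch: if atomic features depend only on element types and positions, two charge states of the same geometry are indistinguishable. One way to break that degeneracy (an illustrative assumption, not SpookyNet's actual architecture, which uses learned electronic embeddings distributed via attention) is to spread the total molecular charge over atoms with attention-style weights:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 16
element_emb = rng.normal(size=(100, D))  # hypothetical table: one row per atomic number
charge_key = rng.normal(size=D)          # projects atom features to attention logits
charge_val = rng.normal(size=D)          # feature direction carrying charge information

def atom_features(Z, total_charge):
    # Base features come from the element; the "electronic" correction
    # distributes the total molecular charge over atoms with softmax
    # weights, so different charge states of the same geometry yield
    # different features.
    x = element_emb[np.asarray(Z)]
    logits = x @ charge_key
    w = np.exp(logits - logits.max())
    w = w / w.sum()
    return x + total_charge * w[:, None] * charge_val
```

A neutral molecule reduces to the plain element embeddings, while a cation or anion shifts the features of the atoms most receptive to the extra charge.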
SE(3)-equivariant prediction of molecular wavefunctions and electronic densities
Mihail Bogojeski
Michael Gastegger
Mario Geiger
Tess Smidt
Advances in Neural Information Processing Systems (2021)
Machine learning has enabled the prediction of quantum chemical properties with high accuracy and efficiency, making it possible to bypass computationally costly ab initio calculations. Instead of training on a fixed set of properties, more recent approaches attempt to learn the electronic wavefunction (or density) as a central quantity of atomistic systems, from which all other observables can be derived. This is complicated by the fact that wavefunctions transform non-trivially under molecular rotations, which makes them a challenging prediction target. To solve this issue, we introduce general SE(3)-equivariant operations and building blocks for constructing deep learning architectures for geometric point cloud data and apply them to reconstruct wavefunctions of atomistic systems with unprecedented accuracy. Our model reduces prediction errors by up to two orders of magnitude compared to the previous state-of-the-art and makes it possible to derive properties such as energies and forces directly from the wavefunction in an end-to-end manner. We demonstrate the potential of our approach in a transfer learning application, where a model trained on low accuracy reference wavefunctions implicitly learns to correct for electronic many-body interactions from observables computed at a higher level of theory. Such machine-learned wavefunction surrogates pave the way towards novel semi-empirical methods, offering resolution at an electronic level while drastically decreasing computational cost. While we focus on physics applications in this contribution, the proposed equivariant framework for deep learning on point clouds is also promising in other domains, such as computer vision or graphics.
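The core property an SE(3)-equivariant operation must satisfy is that rotating the input point cloud rotates the output features in lockstep. This minimal sketch (an illustrative assumption, far simpler than the paper's building blocks) builds a vector feature per point from relative positions scaled by rotation-invariant weights, which guarantees equivariance by construction:

```python
import numpy as np

def equivariant_update(R, s, w1=0.5, w2=0.25):
    # A minimal SE(3)-equivariant layer on a point cloud: each point i
    # aggregates unit direction vectors to all other points j, scaled by
    # invariant quantities (scalar features s[j] and the distance r).
    # Translating the cloud leaves the output unchanged; rotating it
    # rotates the output vectors identically.
    n = len(R)
    v = np.zeros_like(R)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = R[j] - R[i]
            r = np.linalg.norm(d)
            v[i] += (w1 * s[j] + w2) * np.exp(-r) * d / r
    return v
```

Stacking such operations with higher-order (spherical harmonic) features is what allows the full architecture to output objects like wavefunction coefficients that transform correctly under rotation.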