Exploring the design space of machine-learning models for quantum chemistry with a fully differentiable framework
Creators
- 1. Laboratory of Computational Science and Modeling, Institut des Matériaux, École Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne, Switzerland
- 2. Division of Chemistry and Chemical Engineering, California Institute of Technology, Pasadena, CA 91125, USA
Description
Traditional atomistic machine learning (ML) models serve as surrogates for quantum mechanical (QM) properties, predicting quantities such as dipole moments and polarizabilities directly from the compositions and geometries of atomic configurations. With the emergence of ML approaches that predict the "ingredients" of a QM calculation, such as the ground-state charge density or the effective single-particle Hamiltonian, it has become possible to obtain multiple properties through analytical, physics-based operations on these intermediate ML predictions. We present a framework that seamlessly integrates the prediction of an effective electronic Hamiltonian, for both molecular and condensed-phase systems, with PySCFAD, a differentiable QM workflow. This integration makes it possible to train models indirectly against functions of the Hamiltonian, such as electronic energy levels, dipole moments, and polarizabilities. We then use this framework to explore various choices within the design space of hybrid ML/QM models, examining the influence of incorporating multiple targets on model performance, as well as the possibility of learning a reduced-basis ML Hamiltonian that reproduces targets computed in a much larger basis. Our benchmarks evaluate the accuracy and transferability of these hybrid models, compare them against direct surrogate models of the same atomic properties, and provide guidance on the design of the interface between the ML and QM components of the model. The benchmarks use subsets of the QM7 and QM9 datasets, two extrapolative datasets of long-chain polyalkenes/acenes and a polyenoic acid series, and a small graphene dataset.
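To illustrate the idea of indirect training against functions of the Hamiltonian, the following is a minimal, self-contained JAX sketch and not the implementation used in this work: a stand-in "ML model" produces a symmetric Hamiltonian matrix, orbital energies are obtained by diagonalization in an orthogonalized basis, and gradients of an eigenvalue loss flow back to the model parameters. The matrix size, occupations, overlap matrix, and target energies are all hypothetical placeholders; in the actual workflow the derived quantities (dipoles, polarizabilities, etc.) would be computed by a differentiable QM code such as PySCFAD.

```python
# Minimal sketch (not the authors' implementation): differentiating a loss on
# orbital energies through the eigendecomposition of a predicted Hamiltonian.
import jax
import jax.numpy as jnp

n_ao, n_occ = 6, 3  # hypothetical basis size and number of occupied orbitals

def predicted_hamiltonian(params, features):
    """Stand-in ML model: builds a symmetric n_ao x n_ao matrix from parameters."""
    h = features @ params        # plays the role of the ML prediction
    return 0.5 * (h + h.T)       # symmetrize

def orbital_energies(h, s):
    """Solve H C = S C eps via Loewdin orthogonalization, differentiably."""
    s_vals, s_vecs = jnp.linalg.eigh(s)
    s_inv_sqrt = (s_vecs * (1.0 / jnp.sqrt(s_vals))) @ s_vecs.T
    eps, _ = jnp.linalg.eigh(s_inv_sqrt @ h @ s_inv_sqrt)
    return eps

def loss(params, features, s, target_eps):
    """Indirect target: match occupied eigenvalues rather than matrix elements."""
    eps = orbital_energies(predicted_hamiltonian(params, features), s)
    return jnp.mean((eps[:n_occ] - target_eps[:n_occ]) ** 2)

key = jax.random.PRNGKey(0)
params = 0.1 * jax.random.normal(key, (n_ao, n_ao))
features = jnp.eye(n_ao)                             # trivial "descriptor" for the sketch
s = jnp.eye(n_ao) + 0.01 * jnp.ones((n_ao, n_ao))    # toy overlap matrix
target_eps = jnp.linspace(-1.0, 0.5, n_ao)           # hypothetical reference energies

value, grads = jax.value_and_grad(loss)(params, features, s, target_eps)
# `grads` can be passed to any optimizer; swapping the eigenvalue loss for a
# dipole or polarizability evaluated by a differentiable QM engine follows the
# same pattern.
```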
Files
files_description.md
References
Preprint (where the data is discussed): D. Suman, J. Nigam, S. Saade, P. Pegolo, H. Tuerk, X. Zhang, G. K. Chan, and M. Ceriotti, arXiv preprint (2025).
Software (used to generate the machine-learning outputs): J. Nigam, P. Pegolo, and M. Ceriotti, "Integrating ML for Hamiltonian and electronic structure," https://github.com/curiosity54/mlelec (2024).