Published October 30, 2024 | Version v1
Dataset | Open

Probing the effects of broken symmetries in machine learning

  • 1. Laboratory of Computational Science and Modeling and National Centre for Computational Design and Discovery of Novel Materials MARVEL, Institute of Materials, École Polytechnique Fédérale de Lausanne, 1015 Lausanne, Switzerland


Description

Symmetry is one of the most central concepts in physics, and it is no surprise that it has also been widely adopted as an inductive bias for machine-learning models applied to the physical sciences. This is especially true for models targeting the properties of matter at the atomic scale. Both established and state-of-the-art approaches, with almost no exceptions, are built to be exactly equivariant to translations, permutations, and rotations of the atoms. Incorporating symmetries, rotations in particular, constrains the model design space and implies more complicated architectures that are often also computationally demanding. There are indications that unconstrained models can easily learn symmetries from data, and that doing so can even be beneficial for the accuracy of the model. We demonstrate that an unconstrained architecture can be trained to achieve a high degree of rotational invariance, testing the impact of the small residual symmetry breaking in realistic scenarios involving simulations of gas-phase, liquid, and solid water. We focus specifically on physical observables that are likely to be affected, directly or indirectly, by non-invariant behavior under rotations, finding negligible consequences when the model is used in an interpolative, bulk regime. For extrapolative gas-phase predictions, the model remains very stable, although symmetry artifacts are noticeable. We also discuss strategies that can be used to systematically reduce the magnitude of symmetry breaking when it occurs, and assess their impact on the convergence of observables.

This archive collects the input files, scripts, and data for the paper referenced below. In particular, it contains the trained MLIP for this work, the input files for the simulations, the post-processing scripts and their outputs, as well as the plotting scripts and resulting figures. A detailed README can be found below, and a more detailed one in each subfolder. The data in this archive is mirrored at https://github.com/sirmarcel/eqt-archive, where issues can be raised and discussed.
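As an illustration of the kind of probe described above, the sketch below estimates the rotational non-invariance of a model by evaluating it on many randomly rotated copies of the same structure and reporting the spread of the predictions. It is an assumption for illustration only, not one of the scripts in this archive: `toy_energy`, `rotational_spread`, and the toy configuration are hypothetical placeholders for a trained MLIP and a real water structure.

```python
# Minimal sketch (hypothetical): probe rotational symmetry breaking by
# evaluating a model on randomly rotated copies of one structure and
# measuring the spread of its predictions.
import numpy as np
from scipy.spatial.transform import Rotation


def toy_energy(positions: np.ndarray) -> float:
    """Stand-in 'model': depends on raw Cartesian coordinates, so it is
    deliberately not rotationally invariant (a real MLIP would be used here)."""
    return float(np.sum(positions[:, 0] ** 2) + 0.1 * np.sum(positions[:, 1]))


def rotational_spread(positions: np.ndarray, n_rotations: int = 128) -> float:
    """Standard deviation of predictions over random rigid rotations about the
    geometric center; zero (up to floating-point noise) for an invariant model."""
    center = positions.mean(axis=0)
    matrices = Rotation.random(n_rotations).as_matrix()  # (n_rotations, 3, 3)
    energies = [
        toy_energy((positions - center) @ mat.T + center) for mat in matrices
    ]
    return float(np.std(energies))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    toy_structure = rng.normal(size=(3, 3))  # toy 3-atom configuration
    print(f"spread over rotations: {rotational_spread(toy_structure):.3e}")
```

For an exactly equivariant architecture this spread vanishes by construction; for an unconstrained model, the same quantity measures the magnitude of the residual symmetry breaking that the paper investigates.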

Files

File preview: files_description.md

Files (127.3 MiB total)

Checksum (MD5)                        Size
md5:171a8254379b065d7ad3d0bc3c20bf4b  231 Bytes
md5:23d0abd85d65cfba698f54edfa427b1e  127.3 MiB
md5:5f7d89558068adc23efc3e28f415442e  2.1 KiB
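Downloaded files can be checked against the MD5 checksums listed above. The following is a minimal sketch, not part of the archive; the file path and expected checksum are passed on the command line, since the original file names are not reproduced here.

```python
# Minimal sketch: compare a downloaded file against an expected MD5 checksum.
# Usage: python check_md5.py <path-to-file> <expected-md5>
import hashlib
import sys


def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large archives need not fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    path, expected = sys.argv[1], sys.argv[2]
    actual = md5sum(path)
    print("OK" if actual == expected else f"MISMATCH: got {actual}")
```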
References

Journal reference (paper associated with this data record):
Marcel F. Langer et al. 2024 Mach. Learn.: Sci. Technol. 5 04LT01. DOI: 10.1088/2632-2153/ad86a0