Cross-functional transferability in universal machine learning interatomic potentials

Kavli Affiliate: Kristin A. Persson

| First 5 Authors: Xu Huang, Bowen Deng, Peichen Zhong, Aaron D. Kaplan, Kristin A. Persson

| Summary:

The rapid development of universal machine learning interatomic potentials
(uMLIPs) has demonstrated the feasibility of generalizable learning of the
universal potential energy surface. In principle, the accuracy of uMLIPs can be
further improved by bridging a model trained on lower-fidelity datasets to
high-fidelity ones. In this work, we analyze the challenges of this transfer
learning problem within the CHGNet framework. We show that significant energy
scale shifts and poor correlation between GGA and r$^2$SCAN energies pose
challenges to cross-functional data transferability in uMLIPs. By benchmarking
different transfer learning approaches on the MP-r$^2$SCAN dataset of 0.24
million structures, we demonstrate the importance of elemental energy
referencing in the transfer learning of uMLIPs. By comparing the scaling laws
with and without pre-training on a low-fidelity dataset, we show that
substantial data efficiency can still be achieved through transfer learning,
even with a target dataset of sub-million structures. We highlight the
importance of proper transfer learning and multi-fidelity learning in creating
next-generation uMLIPs on high-fidelity data.
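
To make the elemental energy referencing step concrete, here is a minimal sketch of one common recipe: fit one reference energy per element by least squares against a composition matrix, so that total energies from the high-fidelity functional are shifted onto the energy scale the pretrained model expects. The function name, toy compositions, and energies below are illustrative placeholders, not taken from the paper or the CHGNet codebase.

```python
# Minimal sketch of elemental energy referencing: solve a least-squares
# problem for per-element reference energies mu, then subtract the
# elemental contribution from each structure's total energy.
import numpy as np

def fit_elemental_references(compositions: np.ndarray,
                             energies: np.ndarray) -> np.ndarray:
    """Solve min_mu || energies - compositions @ mu ||^2.

    compositions: (n_structures, n_elements) matrix of atom counts.
    energies:     (n_structures,) total energies in eV.
    Returns per-element reference energies mu, shape (n_elements,).
    """
    mu, *_ = np.linalg.lstsq(compositions, energies, rcond=None)
    return mu

# Toy example with two elements (e.g. Li, O) and three structures.
comps = np.array([[2.0, 1.0],   # Li2O
                  [1.0, 1.0],   # LiO (hypothetical)
                  [0.0, 2.0]])  # O2
e_tot = np.array([-14.2, -9.8, -9.9])  # placeholder totals, eV

mu = fit_elemental_references(comps, e_tot)
e_referenced = e_tot - comps @ mu  # energies with elemental offsets removed
print(mu, e_referenced)
```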
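
One transfer learning approach of the kind benchmarked in such studies is to initialize from the low-fidelity checkpoint and continue training on the high-fidelity data at a reduced learning rate. The PyTorch sketch below shows a single fine-tuning step with a stand-in two-layer network; the checkpoint path, features, and targets are placeholders, not the actual CHGNet model or MP-r$^2$SCAN data.

```python
# Generic PyTorch sketch of fine-tuning a pretrained model on
# high-fidelity data. The tiny network stands in for a real uMLIP.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 64), nn.SiLU(), nn.Linear(64, 1))
# model.load_state_dict(torch.load("gga_pretrained.pt"))  # hypothetical GGA checkpoint

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # smaller lr than pretraining
loss_fn = nn.MSELoss()

# One toy fine-tuning step on random stand-in features and energies.
x = torch.randn(8, 64)  # per-structure features (placeholder)
y = torch.randn(8, 1)   # referenced high-fidelity energies (placeholder)
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
print(float(loss))
```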
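
The scaling-law comparison can be summarized by fitting a power law, err ≈ a·N^(−b), to validation error versus training-set size N for both the from-scratch and pretrained models; a lower curve at fixed N reflects the data efficiency gained from pre-training. The sketch below shows how such a fit might be done; the error values are toy placeholders, not results from the paper.

```python
# Illustrative power-law fit of validation error vs. training-set size,
# comparing training from scratch against transfer learning.
import numpy as np

def fit_power_law(n_train, err):
    """Fit log(err) = log(a) - b * log(n) and return (a, b)."""
    slope, log_a = np.polyfit(np.log(n_train), np.log(err), 1)
    return np.exp(log_a), -slope

n = np.array([1e3, 1e4, 1e5, 2.4e5])
err_scratch = np.array([120.0, 60.0, 32.0, 25.0])   # meV/atom, toy values
err_transfer = np.array([60.0, 35.0, 20.0, 16.0])   # meV/atom, toy values

for label, err in [("scratch", err_scratch), ("transfer", err_transfer)]:
    a, b = fit_power_law(n, err)
    print(f"{label}: err ~ {a:.1f} * N^(-{b:.2f})")
```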

| Search Query: ArXiv Query: search_query=au:"Kristin A. Persson"&id_list=&start=0&max_results=3
