Large Language Models Are Innate Crystal Structure Generators

Kavli Affiliate: Kristin A. Persson

| First 5 Authors: Jingru Gan, Peichen Zhong, Yuanqi Du, Yanqiao Zhu, Chenru Duan

| Summary:

Crystal structure generation is fundamental to materials discovery, enabling
the prediction of novel materials with desired properties. While existing
approaches leverage Large Language Models (LLMs) through extensive fine-tuning
on materials databases, we show that pre-trained LLMs can inherently generate
stable crystal structures without additional training. Our novel framework
MatLLMSearch integrates pre-trained LLMs with evolutionary search algorithms,
achieving a 78.38% metastable rate validated by machine learning interatomic
potentials and 31.7% DFT-verified stability via quantum mechanical
calculations, outperforming specialized models such as CrystalTextLLM. Beyond
crystal structure generation, we further demonstrate that our framework can be
readily adapted to diverse materials design tasks, including crystal structure
prediction and multi-objective optimization of properties such as deformation
energy and bulk modulus, all without fine-tuning. These results establish
pre-trained LLMs as versatile and effective tools for materials discovery,
opening new avenues for crystal structure generation with reduced
computational overhead and broader accessibility.
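The core idea described above, an evolutionary loop in which a pre-trained LLM proposes candidate structures and a stability score drives selection, can be sketched in minimal form. This is an illustrative sketch only: the `llm_propose` stub, the toy lattice-parameter representation, and the mock `stability` fitness are all hypothetical stand-ins, not the MatLLMSearch implementation (which would prompt an actual LLM and score candidates with ML interatomic potentials or DFT).

```python
import random

def llm_propose(parents, n_children):
    """Hypothetical stand-in for a pre-trained LLM prompted with parent
    structures. Here we simply perturb parent 'lattice parameters';
    MatLLMSearch would instead have the LLM generate structures as text."""
    children = []
    for _ in range(n_children):
        p = random.choice(parents)
        children.append([x + random.gauss(0, 0.1) for x in p])
    return children

def stability(struct):
    """Mock fitness: negative squared distance from an assumed 'stable'
    parameter set. A real pipeline would use an ML interatomic potential
    or DFT energy-above-hull instead."""
    target = [3.0, 3.0, 3.0]
    return -sum((a - b) ** 2 for a, b in zip(struct, target))

def evolutionary_search(pop_size=8, generations=20, seed=0):
    random.seed(seed)
    # Random initial population of toy 3-parameter "structures".
    population = [[random.uniform(2.0, 4.0) for _ in range(3)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=stability, reverse=True)
        parents = population[: pop_size // 2]        # elitist selection
        children = llm_propose(parents, pop_size - len(parents))
        population = parents + children              # next generation
    return max(population, key=stability)

best = evolutionary_search()
print(best, stability(best))
```

The loop keeps the fittest half of each generation and asks the proposal model for replacements, so fitness is non-decreasing; swapping the stub for real LLM calls and a real stability oracle recovers the framework's overall shape.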

| Search Query: ArXiv Query: search_query=au:"Kristin A. Persson"&id_list=&start=0&max_results=3