LLM and GNN are Complementary: Distilling LLM for Multimodal Graph Learning

Kavli Affiliate: Xiang Zhang

| First 5 Authors: Junjie Xu, Zongyu Wu, Minhua Lin, Xiang Zhang, Suhang Wang

| Summary:

Recent progress in Graph Neural Networks (GNNs) has greatly enhanced the
ability to model complex molecular structures for predicting properties.
Nevertheless, molecular data encompasses more than just graph structures,
including textual and visual information that GNNs do not handle well. To
bridge this gap, we present an innovative framework that utilizes multimodal
molecular data to extract insights from Large Language Models (LLMs). We
introduce GALLON (Graph Learning from Large Language Model Distillation), a
framework that synergizes the capabilities of LLMs and GNNs by distilling
multimodal knowledge into a unified Multilayer Perceptron (MLP). This method
integrates the rich textual and visual data of molecules with the structural
analysis power of GNNs. Extensive experiments reveal that our distilled MLP
model notably improves the accuracy and efficiency of molecular property
predictions.
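The core idea described above — distilling the combined knowledge of a GNN and an LLM into a lightweight MLP — can be sketched roughly as follows. This is not the authors' code; the mixing weight `alpha`, temperature `T`, and the choice of KL divergence as the matching loss are illustrative assumptions about how such a distillation objective is commonly formed.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, gnn_logits, llm_logits, alpha=0.5, T=2.0):
    """Hypothetical sketch of multimodal distillation.

    The teacher distribution is a convex mix of the GNN's (structure) and
    the LLM's (text/image) predictions; the student MLP is trained to match
    it by minimizing the KL divergence KL(teacher || student).
    """
    teacher = alpha * softmax(gnn_logits, T) + (1 - alpha) * softmax(llm_logits, T)
    student = softmax(student_logits, T)
    eps = 1e-12  # avoid log(0)
    kl = np.sum(teacher * (np.log(teacher + eps) - np.log(student + eps)), axis=-1)
    return float(np.mean(kl))
```

At inference time only the distilled MLP would be evaluated, which is where the efficiency gain over running a GNN (or querying an LLM) comes from; when the student's logits already agree with both teachers, the loss is near zero.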

| Search Query: ArXiv Query: search_query=au:"Xiang Zhang"&id_list=&start=0&max_results=3