MoDULA: Mixture of Domain-Specific and Universal LoRA for Multi-Task Learning

Kavli Affiliate: Ran Wang | First 5 Authors: Yufei Ma, Zihan Liang, Huangyu Dai, Ben Chen, Dehong Gao | Summary: The growing demand for larger-scale models in the development of Large Language Models (LLMs) poses challenges for efficient training within limited computational resources. Traditional fine-tuning methods often exhibit instability in multi-task learning and rely heavily […]



Relativistic Mott transition in strongly correlated artificial graphene

Kavli Affiliate: Jie Shan | First 5 Authors: Liguo Ma, Raghav Chaturvedi, Phuong X. Nguyen, Kenji Watanabe, Takashi Taniguchi | Summary: The realization of graphene has provided a bench-top laboratory for quantum electrodynamics. The low-energy excitations of graphene are two-dimensional massless Dirac fermions with opposite chiralities at the $\pm$K valleys of the graphene Brillouin zone. […]

