EACO-RAG: Edge-Assisted and Collaborative RAG with Adaptive Knowledge Update

Kavli Affiliate: Feng Wang | First 5 Authors: Jiaxing Li, Chi Xu, Lianchen Jia, Feng Wang, Cong Zhang | Summary: Large Language Models are revolutionizing Web, mobile, and Web of Things systems, driving intelligent and scalable solutions. However, as Retrieval-Augmented Generation (RAG) systems expand, they encounter significant challenges related to scalability, including increased delay and […]


EACO-RAG: Towards Distributed Tiered LLM Deployment using Edge-Assisted and Collaborative RAG with Adaptive Knowledge Update

Kavli Affiliate: Feng Wang | First 5 Authors: Jiaxing Li, Chi Xu, Lianchen Jia, Feng Wang, Cong Zhang | Summary: Large language models (LLMs) have demonstrated impressive capabilities in language tasks, but they require high computing power and rely on static knowledge. To overcome these limitations, Retrieval-Augmented Generation (RAG) incorporates up-to-date external information into LLMs […]


Counting Ability of Large Language Models and Impact of Tokenization

Kavli Affiliate: Xiang Zhang | First 5 Authors: Xiang Zhang, Juntai Cao, Chenyu You | Summary: Transformers, the backbone of modern large language models (LLMs), face inherent architectural limitations that impede their reasoning capabilities. Unlike recurrent networks, Transformers lack recurrent connections, confining them to constant-depth computation. This restriction places them in the complexity class […]



Modeling the Superlattice Phase Diagram of Transition Metal Intercalation in Bilayer 2H-TaS$_2$

Kavli Affiliate: David T. Limmer | First 5 Authors: Isaac M. Craig, B. Junsuh Kim, David T. Limmer, D. Kwabena Bediako, Sinéad M. Griffin | Summary: Van der Waals hosts intercalated with transition metal (TM) ions exhibit a range of magnetic properties strongly influenced by the structural order of the intercalants. However, predictive computational models […]


Semi-supervised Chinese Poem-to-Painting Generation via Cycle-consistent Adversarial Networks

Kavli Affiliate: Feng Wang | First 5 Authors: Zhengyang Lu, Tianhao Guo, Feng Wang | Summary: Classical Chinese poetry and painting represent the epitome of artistic expression, but the abstract and symbolic nature of their relationship poses a significant challenge for computational translation. Most existing methods rely on large-scale paired datasets, which are scarce […]


Humanizing the Machine: Proxy Attacks to Mislead LLM Detectors

Kavli Affiliate: Xiang Zhang | First 5 Authors: Tianchun Wang, Yuanzhou Chen, Zichuan Liu, Zhanwen Chen, Haifeng Chen | Summary: The advent of large language models (LLMs) has revolutionized the field of text generation, producing outputs that closely mimic human-like writing. Although academic and industrial institutions have developed detectors to prevent the malicious usage of […]


Foundation Models in Electrocardiogram: A Review

Kavli Affiliate: Xiang Zhang | First 5 Authors: Yu Han, Xiaofeng Liu, Xiang Zhang, Cheng Ding | Summary: The electrocardiogram (ECG) is ubiquitous across various healthcare domains, such as cardiac arrhythmia detection and sleep monitoring, making ECG analysis critically essential. Traditional deep learning models for ECG are task-specific, with a narrow scope of functionality and […]


Atomistic understanding of hydrogen coverage on RuO2(110) surface under electrochemical conditions from ab initio statistical thermodynamics

Kavli Affiliate: Darrell G. Schlom | First 5 Authors: Lei Zhang, Jan Kloppenburg, Chia-Yi Lin, Luka Mitrovic, Simon Gelin | Summary: Understanding the dehydrogenation of transition metal oxide surfaces under electrochemical potential is critical to the control of important chemical processes such as the oxygen evolution reaction (OER). Using first principles computations, we model the […]


Atomistic understanding of hydrogen coverage on RuO2(110) surface under electrochemical conditions from ab initio statistical thermodynamics

Kavli Affiliate: Jin Suntivich | First 5 Authors: Lei Zhang, Jan Kloppenburg, Chia-Yi Lin, Luka Mitrovic, Simon Gelin | Summary: Understanding the dehydrogenation of transition metal oxide surfaces under electrochemical potential is critical to the control of important chemical processes such as the oxygen evolution reaction (OER). Using first principles computations, we model the thermodynamic […]

