YuLan: An Open-source Large Language Model

Kavli Affiliate: Feng Wang | First 5 Authors: Yutao Zhu, Kun Zhou, Kelong Mao, Wentong Chen, Yiding Sun | Summary: Large language models (LLMs) have become the foundation of many applications, leveraging their extensive capabilities in processing and understanding natural language. While many open-source LLMs have been released with technical reports, the lack of training […]


Rateless Stochastic Coding for Delay-constrained Semantic Communication

Kavli Affiliate: Cheng Peng | First 5 Authors: Cheng Peng, Rulong Wang, Yong Xiao | Summary: We consider the problem of joint source-channel coding with distortion and perception constraints from a rateless perspective, the purpose of which is to strike a balance between reliability (distortion/perception) and effectiveness (rate) of transmission over uncertain channels. We […]
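
To make the reliability-versus-rate tradeoff referenced above concrete, here is a minimal sketch (not the paper's rateless stochastic coding scheme, and ignoring the perception constraint) that evaluates the textbook rate-distortion function R(D) = 1 - H_b(D) of a fair binary source under Hamming distortion: allowing more distortion lowers the rate needed for reliable reconstruction.

```python
# Generic illustration only: rate-distortion function R(D) = 1 - H_b(D) for a
# Bernoulli(1/2) source with Hamming distortion (a standard textbook result),
# showing the reliability (distortion) vs. effectiveness (rate) tradeoff.
# This is NOT the paper's rateless stochastic coding scheme.
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion(D):
    """R(D) for a fair binary source under Hamming distortion."""
    if D >= 0.5:
        return 0.0               # distortion 1/2 is achievable at zero rate
    return 1.0 - binary_entropy(D)

for D in [0.0, 0.05, 0.1, 0.2, 0.3, 0.5]:
    print(f"D = {D:.2f}  ->  R(D) = {rate_distortion(D):.3f} bits/symbol")
```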


Epistasis between N-terminal and receptor-binding domains drives cell entry in a bat coronavirus spike

Kavli Affiliate: Charles M. Rice | Authors: Alexandra L. Tse, Cory M. Acreman, Inna Ricardo-Lax, Jacob Berrigan, Gorka Lasso, Toheeb Balogun, Fiona L. Kearns, Lorenzo Casalino, Georgia L. McClain, Amartya Mudry Chandran, Charlotte Lemeunier, Rommie E. Amaro, Charles M. Rice, Rohit K. Jangra, Jason S. McLellan, Kartik Chandran and Emily Happy Miller | Summary: The […]


Structure of the human K2P13.1 (THIK-1) channel reveals a novel hydrophilic pore restriction and lipid cofactor site

Kavli Affiliate: Daniel Minor | Authors: Shatabdi Roy-Chowdhury, Seil Jang, Fayal Abderemane-Ali, Fiona Naughton, Michael Grabe and Daniel L Minor, Jr. | Summary: The halothane-inhibited K2P leak potassium channel K2P13.1 (THIK-1) [1–3] is found in diverse cells [1,4], including neurons [1,5] and microglia [6–8], where it affects surveillance [6], synaptic pruning [7], phagocytosis [7], and inflammasome-mediated interleukin-1β release [6,8,9]. As with many K2Ps [1,5,10–14] […]


Dataless Quadratic Neural Networks for the Maximum Independent Set Problem

Kavli Affiliate: Jia Liu | First 5 Authors: Ismail Alkhouri, Cedric Le Denmat, Yingjie Li, Cunxi Yu, Jia Liu | Summary: Combinatorial Optimization (CO) addresses many important problems, including the challenging Maximum Independent Set (MIS) problem. Alongside exact and heuristic solvers, differentiable approaches have emerged, often using continuous relaxations of ReLU-based or quadratic objectives. Noting […]
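
As a rough illustration of the quadratic-relaxation idea the summary alludes to, the sketch below runs projected gradient ascent on a generic continuous MIS objective, sum(x) - gamma * sum over edges of x_i x_j on [0, 1]^n. The toy graph, penalty weight, and update rule are illustrative assumptions, not the paper's dataless network or its exact objective.

```python
# Minimal sketch (NumPy) of a generic quadratic relaxation of Maximum Independent
# Set, optimized by projected gradient ascent. Illustrates the general idea of
# differentiable CO solvers; it is NOT the paper's objective or architecture.
import numpy as np

def mis_quadratic_relaxation(n, edges, gamma=2.0, lr=0.05, steps=2000, seed=0):
    """Maximize sum(x) - gamma * sum_{(i,j) in E} x_i x_j over x in [0, 1]^n."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.2, 0.8, size=n)           # relaxed node indicators
    A = np.zeros((n, n))
    for i, j in edges:                           # symmetric adjacency matrix
        A[i, j] = A[j, i] = 1.0
    for _ in range(steps):
        grad = 1.0 - gamma * (A @ x)             # gradient of the edge-penalized objective
        x = np.clip(x + lr * grad, 0.0, 1.0)     # projected gradient ascent step
    return (x > 0.5).astype(int)                 # round to a candidate independent set

# Toy usage: a 5-cycle, whose maximum independent set has size 2.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
x = mis_quadratic_relaxation(5, edges)
print("rounded x:", x, "| independent:", all(x[i] + x[j] <= 1 for i, j in edges))
```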


On the minimum number of radiation field parameters to specify gas cooling and heating functions

Kavli Affiliate: Nickolay Y. Gnedin | First 5 Authors: David Robinson, Camille Avestruz, Nickolay Y. Gnedin | Summary: Fast and accurate approximations of gas cooling and heating functions are needed for hydrodynamic galaxy simulations. We use machine learning to analyze atomic gas cooling and heating functions in the presence of a generalized incident local […]
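
As a loose, purely illustrative analogy for counting how many parameters a family of functions really depends on, the sketch below applies PCA to a synthetic table of curves and reports how many components reach 99% explained variance. The synthetic data and the PCA criterion are assumptions for illustration; they are not the paper's machine-learning analysis or its radiation-field parameterization.

```python
# Illustrative only: estimate how many latent parameters describe a family of
# tabulated curves by counting PCA components reaching 99% explained variance.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)

# Synthetic curves driven by 2 hidden parameters, even though each curve is
# stored as 200 tabulated values.
a = rng.uniform(0.5, 2.0, size=500)
b = rng.uniform(0.0, 1.0, size=500)
curves = a[:, None] * np.exp(-x[None, :] / 0.3) + b[:, None] * x[None, :] ** 2

# PCA via SVD of the mean-centered table.
centered = curves - curves.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = np.cumsum(s ** 2) / np.sum(s ** 2)
n_components = int(np.searchsorted(explained, 0.99) + 1)
print(f"{n_components} components explain 99% of the variance")  # expect ~2
```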


The Remarkable Robustness of LLMs: Stages of Inference?

Kavli Affiliate: Max Tegmark | First 5 Authors: Vedang Lad, Wes Gurnee, Max Tegmark | Summary: We demonstrate and investigate the remarkable robustness of Large Language Models by deleting and swapping adjacent layers. We find that deleting and swapping interventions retain 72-95% of the original model’s prediction accuracy without fine-tuning, whereas models with more […]
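
Here is a minimal sketch of what the layer-deletion and adjacent-layer-swap interventions could look like on a GPT-2-style Hugging Face model, whose decoder blocks live in model.transformer.h. The model choice, layer indices, and single-prompt check are illustrative assumptions, not the paper's models or evaluation protocol.

```python
# Illustrative sketch: delete one decoder block or swap two adjacent blocks in a
# GPT-2-style model (blocks live in model.transformer.h). Not the paper's setup.
import copy
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

def delete_layer(model, idx):
    """Return a copy of `model` with decoder block `idx` removed."""
    m = copy.deepcopy(model)
    blocks = [b for i, b in enumerate(m.transformer.h) if i != idx]
    m.transformer.h = torch.nn.ModuleList(blocks)
    m.config.n_layer = len(blocks)
    return m

def swap_adjacent_layers(model, idx):
    """Return a copy of `model` with blocks `idx` and `idx + 1` swapped."""
    m = copy.deepcopy(model)
    blocks = list(m.transformer.h)
    blocks[idx], blocks[idx + 1] = blocks[idx + 1], blocks[idx]
    m.transformer.h = torch.nn.ModuleList(blocks)
    return m

tok = GPT2TokenizerFast.from_pretrained("gpt2")
base = GPT2LMHeadModel.from_pretrained("gpt2").eval()
ids = tok("The capital of France is", return_tensors="pt").input_ids

with torch.no_grad():
    for name, m in [("base", base),
                    ("delete layer 5", delete_layer(base, 5)),
                    ("swap layers 5/6", swap_adjacent_layers(base, 5))]:
        next_id = m(ids, use_cache=False).logits[0, -1].argmax().item()
        print(f"{name:16s} -> next token: {tok.decode(next_id)!r}")
```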


SimTxtSeg: Weakly-Supervised Medical Image Segmentation with Simple Text Cues

Kavli Affiliate: Yi Zhou | First 5 Authors: Yuxin Xie, Tao Zhou, Yi Zhou, Geng Chen | Summary: Weakly-supervised medical image segmentation is a challenging task that aims to reduce the annotation cost while keeping the segmentation performance. In this paper, we present a novel framework, SimTxtSeg, that leverages simple text cues to generate high-quality […]


DIM: Dynamic Integration of Multimodal Entity Linking with Large Language Model

Kavli Affiliate: Zhuo Li | First 5 Authors: Shezheng Song, Shasha Li, Jie Yu, Shan Zhao, Xiaopeng Li | Summary: Our study delves into Multimodal Entity Linking, aligning mentions in multimodal information with entities in a knowledge base. Existing methods still face challenges such as ambiguous entity representations and limited use of image information. Thus, we […]

