Multi-source Unsupervised Domain Adaptation on Graphs with Transferability Modeling

Kavli Affiliate: Xiang Zhang | First 5 Authors: Tianxiang Zhao, Dongsheng Luo, Xiang Zhang, Suhang Wang | Summary: In this paper, we tackle a new problem of multi-source unsupervised domain adaptation (MSUDA) for graphs, where models trained on annotated source domains need to be transferred to the unsupervised target graph for node classification. Due to […]
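
The abstract cuts off, but the core setting is clear: several classifiers trained on labeled source graphs must be combined on an unlabeled target graph, with more weight given to sources estimated to transfer well. A minimal sketch of that generic idea follows; the softmax weighting, score inputs, and function names are illustrative assumptions, not the paper's actual transferability model.

```python
# Hypothetical sketch: fuse K source-trained node classifiers on a target
# graph, weighting each source by an estimated transferability score.
# This is NOT the paper's method, only the generic MSUDA weighting idea.
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def fuse_sources(source_logits, transfer_scores):
    """source_logits: (K, N, C) logits from K source models on N target
    nodes; transfer_scores: (K,) higher = source deemed more transferable
    (e.g., a negative domain-discrepancy proxy)."""
    w = softmax(np.asarray(transfer_scores, dtype=float))
    return np.tensordot(w, source_logits, axes=1)   # (N, C) fused logits

# Toy usage: 3 sources, 5 target nodes, 4 classes.
rng = np.random.default_rng(0)
fused = fuse_sources(rng.normal(size=(3, 5, 4)), [0.9, 0.2, 0.5])
print(fused.argmax(axis=1))   # pseudo-labels for the 5 target nodes
```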

Early Developmental Origins of Cortical Disorders Modeled in Human Neural Stem Cells

Kavli Affiliate: Flora Vaccarino | Authors: Xoel Mato-Blanco, Suel-Kee Kim, Alexandre Jourdon, Shaojie Ma, Andrew T.N. Tebbenkamp, Fuchen Liu, Alvaro Duque, Flora M. Vaccarino, Nenad Sestan, Carlo Colantuoni, Pasko Rakic, Gabriel Santpere and Nicola Micali | Summary: The implications of the early phases of human telencephalic development, involving neural stem cells (NSCs), in the etiology […]

Byzantine-Robust Decentralized Federated Learning

Kavli Affiliate: Jia Liu | First 5 Authors: Minghong Fang, Zifan Zhang, Hairi, Prashant Khanduri, Jia Liu | Summary: Federated learning (FL) enables multiple clients to collaboratively train machine learning models without revealing their private training data. In conventional FL, the system follows the server-assisted architecture (server-assisted FL), where the training process is coordinated by […]
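
The summary contrasts decentralized FL with the conventional server-assisted architecture. As a baseline for that contrast, here is a minimal FedAvg-style server-assisted loop on a toy linear model with synthetic client data (both are assumptions for illustration); a Byzantine-robust decentralized scheme would replace the central averaging step with peer-to-peer robust aggregation.

```python
# Minimal server-assisted FL sketch (FedAvg-style) on a toy linear model.
# Client data and hyperparameters are synthetic assumptions.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local gradient steps on squared loss."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

rng = np.random.default_rng(1)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
w_global = np.zeros(3)
for _ in range(10):                           # server-coordinated rounds
    local_models = [local_update(w_global, X, y) for X, y in clients]
    w_global = np.mean(local_models, axis=0)  # server averages client models
```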

GA-NIFS: JWST/NIRSpec IFS view of the z~3.5 galaxy GS5001 and its close environment at the core of a large-scale overdensity

Kavli Affiliate: Roberto Maiolino | First 5 Authors: Isabella Lamperti, Santiago Arribas, Michele Perna, Bruno Rodríguez Del Pino, Chiara Circosta | Summary: We present JWST NIRSpec observations in IFS mode of the galaxy GS5001 at redshift z=3.47, the brightest member of a candidate protocluster in the GOODS-S field. The data cover a field of view […]
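
At z = 3.47 the familiar rest-frame optical emission lines shift into the near-infrared, which is what makes NIRSpec IFS the right instrument here. A quick back-of-the-envelope check (standard rest wavelengths; the line list is illustrative):

```python
# Where rest-frame optical lines land at z = 3.47 (observed = rest * (1+z)).
z = 3.47
rest_angstrom = {"Hbeta": 4861.0, "[OIII]": 5007.0, "Halpha": 6563.0}
for name, lam in rest_angstrom.items():
    print(f"{name}: {lam * (1 + z) / 1e4:.2f} micron observed")
# All three fall within the NIRSpec 0.6-5.3 micron range.
```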

Bursty Star Formation in Dwarfs is Sensitive to Numerical Choices in Supernova Feedback Models

Kavli Affiliate: Mark Vogelsberger | First 5 Authors: Eric Zhang, Laura V Sales, Federico Marinacci, Paul Torrey, Mark Vogelsberger | Summary: Simulations of galaxy formation are mostly unable to resolve the energy-conserving phase of individual supernova events, having to resort to subgrid models to distribute the energy and momentum resulting from stellar feedback. However, the […]
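
To make the subgrid step concrete, here is a hypothetical sketch of one common scheme: deposit the canonical 1e51 erg of a supernova over neighbouring gas cells with kernel weights, splitting it into thermal energy and a radial momentum kick capped at a terminal momentum. The thermal/kinetic split, the weights, and the cap value are illustrative assumptions, not the numerical choices tested in the paper.

```python
# Hypothetical subgrid SN-feedback injection step (cgs units throughout).
import numpy as np

E_SN = 1.0e51                         # erg per supernova (canonical value)
MSUN, KMS = 1.989e33, 1.0e5           # g, cm/s
P_TERMINAL = 3.0e5 * MSUN * KMS       # ~3e5 Msun km/s, assumed cap value

def inject_sn(cells, weights, f_kin=0.3):
    """cells: dicts with 'mass' (g), 'e_therm' (erg), 'p_radial' (g cm/s);
    weights: kernel weights over the neighbour cells, summing to 1."""
    for c, w in zip(cells, weights):
        c["e_therm"] += (1.0 - f_kin) * w * E_SN
        p_energy = np.sqrt(2.0 * c["mass"] * f_kin * w * E_SN)
        c["p_radial"] += min(p_energy, w * P_TERMINAL)  # cap when unresolved

# Toy usage: four equal-mass neighbour cells of 100 Msun each.
cells = [{"mass": 100 * MSUN, "e_therm": 0.0, "p_radial": 0.0}
         for _ in range(4)]
inject_sn(cells, weights=[0.25] * 4)
```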

In-situ aligned all-polarization-maintaining Er-doped fiber laser mode-locked by a nonlinear amplifying loop mirror

Kavli Affiliate: Xiang Zhang | First 5 Authors: Xiang Zhang, Kangrui Chang, Haobin Zheng, Yongzhuang Zhou, Yong Shen | Summary: Despite the wide applications of high-repetition-rate mode-locked fiber lasers, challenges persist in shortening the cavity length and coupling the fiber collimators with most existing techniques. Here, we introduce a novel collimator alignment method and demonstrate […]
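
The premium on a short cavity follows directly from the round-trip time: the pulse repetition rate of a mode-locked laser is f_rep = c / (n * L_roundtrip). A quick illustration with typical silica-fiber numbers (values are approximate):

```python
# Repetition rate vs. cavity length: f_rep = c / (n * L_roundtrip).
c, n = 2.998e8, 1.45                      # m/s; approx. index of silica fiber
for L in (2.0, 1.0, 0.5):                 # round-trip lengths in metres
    print(f"L = {L:>3} m  ->  f_rep = {c / (n * L) / 1e6:.0f} MHz")
# Halving the fiber length doubles f_rep, so compact cavities (and hence
# precise collimator alignment) are the route to high repetition rates.
```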

BLEnD: A Benchmark for LLMs on Everyday Knowledge in Diverse Cultures and Languages

Kavli Affiliate: Yi Zhou | First 5 Authors: Junho Myung, Nayeon Lee, Yi Zhou, Jiho Jin, Rifki Afina Putri | Summary: Large language models (LLMs) often lack culture-specific knowledge of daily life, especially across diverse regions and non-English languages. Existing benchmarks for evaluating LLMs’ cultural sensitivities are limited to a single language or collected from […]
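
To give a sense of what scoring such a benchmark involves, here is a hypothetical per-language short-answer accuracy loop; the field names and data format are assumptions, not BLEnD's actual schema.

```python
# Hypothetical scoring loop for a multilingual short-answer benchmark:
# per-language accuracy against a set of accepted answers.
from collections import defaultdict

def per_language_accuracy(items, predict):
    """items: dicts with 'lang', 'question', 'answers' (accepted strings);
    predict: callable mapping a question string to a model answer string."""
    hit, tot = defaultdict(int), defaultdict(int)
    for it in items:
        tot[it["lang"]] += 1
        answer = predict(it["question"]).strip().lower()
        if answer in {a.lower() for a in it["answers"]}:
            hit[it["lang"]] += 1
    return {lang: hit[lang] / tot[lang] for lang in tot}

# Toy usage with a dummy "model" that always answers the same thing.
items = [{"lang": "ko", "question": "Q1", "answers": ["kimchi"]},
         {"lang": "en", "question": "Q2", "answers": ["tea", "coffee"]}]
print(per_language_accuracy(items, predict=lambda q: "Kimchi "))
```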