Hot electron diffusion, microwave noise, and piezoresistivity in Si from first principles

Kavli Affiliate: Austin J. Minnich | First 5 Authors: Benjamin Hatanpää, Austin J. Minnich | Summary: Ab-initio calculations of charge transport properties in materials without adjustable parameters have provided microscopic insights into the electron-phonon interactions that govern charge transport. Other transport properties such as the diffusion coefficient provide additional microscopic information and are […]
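
The diffusion coefficient mentioned in the summary can be pictured with a simple kinetic estimate. The sketch below is not the paper's ab-initio workflow; it assumes a relaxation-time picture in which D is the occupation-weighted average of v_x² τ over electronic states, with hypothetical random arrays standing in for the velocities, lifetimes, and energies that a first-principles calculation would supply.

```python
import numpy as np

# Minimal sketch (not the paper's method): relaxation-time estimate of the
# diffusion coefficient D = <v_x^2 * tau>, averaged over an assumed
# nondegenerate (Boltzmann-weighted) carrier distribution. All inputs are
# hypothetical stand-ins for first-principles quantities.

rng = np.random.default_rng(0)
n_states = 1000
v_x = rng.normal(0.0, 1.0e5, n_states)          # band velocities along x (m/s)
tau = rng.uniform(0.1e-12, 1.0e-12, n_states)   # state-resolved lifetimes (s)
energy = rng.uniform(0.0, 0.3, n_states)        # state energies above the CBM (eV)

k_B_T = 0.0259                                   # thermal energy at 300 K (eV)
weights = np.exp(-energy / k_B_T)                # Boltzmann occupation weights

D = np.sum(weights * v_x**2 * tau) / np.sum(weights)
print(f"Estimated diffusion coefficient: {D * 1e4:.1f} cm^2/s")
```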



Towards LLM-based Fact Verification on News Claims with a Hierarchical Step-by-Step Prompting Method

Kavli Affiliate: Wei Gao | First 5 Authors: Xuan Zhang, Wei Gao | Summary: While large pre-trained language models (LLMs) have shown their impressive capabilities in various NLP tasks, they are still under-explored in the misinformation domain. In this paper, we examine LLMs with in-context learning (ICL) for news claim verification, and find […]
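
The hierarchical step-by-step idea named in the title can be pictured as decomposing a claim into sub-questions and verifying them in sequence before issuing a verdict. The sketch below is a generic illustration of that pattern, not the paper's actual prompts or pipeline; `ask_llm` is a hypothetical placeholder for whatever chat-completion call is available.

```python
# Illustrative sketch of hierarchical, step-by-step claim verification with an
# LLM. This is a generic pattern, not the paper's prompting method.
# `ask_llm` is a hypothetical placeholder for any completion API.

from typing import Callable, List

def verify_claim(claim: str, ask_llm: Callable[[str], str]) -> str:
    # Step 1: decompose the claim into simpler, checkable sub-questions.
    decomposition = ask_llm(
        "Break the following news claim into short, checkable sub-questions, "
        f"one per line:\n{claim}"
    )
    sub_questions: List[str] = [q.strip() for q in decomposition.splitlines() if q.strip()]

    # Step 2: answer each sub-question, carrying earlier answers as context.
    evidence: List[str] = []
    for question in sub_questions:
        context = "\n".join(evidence)
        answer = ask_llm(f"Known so far:\n{context}\n\nAnswer briefly: {question}")
        evidence.append(f"Q: {question}\nA: {answer}")

    # Step 3: aggregate the intermediate answers into a final verdict.
    verdict = ask_llm(
        "Given the question-answer pairs below, label the original claim as "
        "TRUE, FALSE, or UNVERIFIABLE and justify in one sentence.\n\n"
        f"Claim: {claim}\n\n" + "\n\n".join(evidence)
    )
    return verdict
```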



Noise Reduction Methods for Large-scale Intensity-mapping Measurements with Infrared Detector Arrays

Kavli Affiliate: James Bock | First 5 Authors: Grigory Heaton, Walter Cook, James Bock, Jill Burnham, Sam Condon | Summary: Intensity mapping observations measure galaxy clustering fluctuations from spectral-spatial maps, requiring stable noise properties on large angular scales. We have developed specialized readouts and analysis methods for achieving large-scale noise stability with Teledyne 2048$\times$2048 H2RG […]
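
Large-scale noise stability in H2RG readouts is commonly pursued by exploiting the detector's non-light-sensitive reference pixels. The sketch below shows a standard per-output-channel reference-pixel subtraction, offered only as a generic illustration of the kind of readout correction involved, not the specialized readouts and analysis methods developed in the paper.

```python
import numpy as np

# Generic sketch of reference-pixel correction for a 2048x2048 H2RG frame.
# The outer 4 rows/columns of an H2RG are reference pixels that see readout
# drifts but no light; subtracting their per-output-channel average removes
# common-mode offsets. This is a baseline illustration only.

N = 2048
N_CHANNELS = 32                      # H2RG frames are typically read through 32 outputs
CH_WIDTH = N // N_CHANNELS           # 64 columns per output channel

def reference_correct(frame: np.ndarray) -> np.ndarray:
    corrected = frame.astype(float).copy()
    for ch in range(N_CHANNELS):
        cols = slice(ch * CH_WIDTH, (ch + 1) * CH_WIDTH)
        # Top and bottom 4 rows within this channel are reference pixels.
        ref = np.concatenate([frame[:4, cols].ravel(), frame[-4:, cols].ravel()])
        corrected[:, cols] -= ref.mean()
    return corrected

frame = np.random.normal(1000.0, 15.0, (N, N))   # fake frame with a common offset
print(reference_correct(frame).mean())
```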



Tackling the Unlimited Staleness in Federated Learning with Intertwined Data and Device Heterogeneities

Kavli Affiliate: Wei Gao | First 5 Authors: Haoming Wang, Wei Gao | Summary: The efficiency of Federated Learning (FL) is often affected by both data and device heterogeneities. Data heterogeneity is defined as the heterogeneity of data distributions on different clients. Device heterogeneity is defined as the clients’ varying latencies in uploading […]




Tackling the Unlimited Staleness in Federated Learning with Intertwined Data and Device Heterogeneities

Kavli Affiliate: Wei Gao | First 5 Authors: Haoming Wang, Wei Gao | Summary: Federated Learning (FL) can be affected by data and device heterogeneities, caused by clients’ different local data distributions and latencies in uploading model updates (i.e., staleness). Traditional schemes consider these heterogeneities as two separate and independent aspects, but this […]



Tackling Intertwined Data and Device Heterogeneities in Federated Learning with Unlimited Staleness

Kavli Affiliate: Wei Gao | First 5 Authors: Haoming Wang, Wei Gao | Summary: Federated Learning (FL) can be affected by data and device heterogeneities, caused by clients’ different local data distributions and latencies in uploading model updates (i.e., staleness). Traditional schemes consider these heterogeneities as two separate and independent aspects, but this […]
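
As a point of reference for the staleness issue described above, the sketch below shows a common asynchronous-FL baseline in which the server simply down-weights stale client updates by their delay. It is a generic scheme for illustration only, not the method proposed in the paper, which treats data and device heterogeneities jointly rather than independently.

```python
import numpy as np

# Generic staleness-weighted asynchronous aggregation (a common baseline,
# not the paper's method). Each incoming client update is mixed into the
# global model with a weight that decays with its staleness, i.e. how many
# server versions have elapsed since the client downloaded the model.

def staleness_weight(staleness: int, alpha: float = 0.6) -> float:
    # Polynomial decay: fresh updates (staleness 0) get the full weight alpha.
    return alpha / (1.0 + staleness) ** 0.5

def server_update(global_model: np.ndarray,
                  client_model: np.ndarray,
                  client_version: int,
                  server_version: int) -> np.ndarray:
    staleness = server_version - client_version
    w = staleness_weight(staleness)
    return (1.0 - w) * global_model + w * client_model

# Example: a very stale update barely moves the global model.
g = np.zeros(4)
print(server_update(g, np.ones(4), client_version=3, server_version=5))
print(server_update(g, np.ones(4), client_version=0, server_version=50))
```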



Towards Green AI in Fine-tuning Large Language Models via Adaptive Backpropagation

Kavli Affiliate: Wei Gao | First 5 Authors: Kai Huang, Hanyun Yin, Heng Huang, Wei Gao | Summary: Fine-tuning is the most effective way of adapting pre-trained large language models (LLMs) to downstream applications. With the fast growth of LLM-enabled AI applications and democratization of open-sourced LLMs, fine-tuning has become possible for non-expert individuals, but […]
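
Cutting fine-tuning cost by adapting which parts of the model participate in backpropagation can be pictured with a simple layer-freezing sketch in PyTorch. This is a generic illustration under assumed settings, not the paper's adaptive backpropagation algorithm; the toy model and the `trainable_fraction` knob are hypothetical.

```python
import torch
from torch import nn

# Toy illustration of reducing backpropagation cost by freezing most layers
# and training only the last few. This is a generic energy-saving pattern,
# not the adaptive selection scheme proposed in the paper.

model = nn.Sequential(*[nn.Linear(64, 64) for _ in range(12)], nn.Linear(64, 2))

def freeze_for_budget(model: nn.Sequential, trainable_fraction: float) -> None:
    layers = list(model)
    n_trainable = max(1, int(len(layers) * trainable_fraction))
    for i, layer in enumerate(layers):
        requires_grad = i >= len(layers) - n_trainable
        for p in layer.parameters():
            p.requires_grad_(requires_grad)

freeze_for_budget(model, trainable_fraction=0.25)
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)

x, y = torch.randn(8, 64), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()     # gradients are built only from the first unfrozen layer onward
optimizer.step()
```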




Deep Learning with Photonic Neural Cellular Automata

Kavli Affiliate: Alireza Marandi | First 5 Authors: Gordon H. Y. Li, Christian R. Leefmans, James Williams, Robert M. Gray, Midya Parto | Summary: Rapid advancements in deep learning over the past decade have fueled an insatiable demand for efficient and scalable hardware. Photonics offers a promising solution by leveraging the unique properties of light. […]
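
A neural cellular automaton updates every cell from its local neighborhood with a small learned rule. The sketch below is a minimal NumPy version of that update, intended only to convey the computational pattern; it is not the photonic hardware implementation or trained model described in the paper, and the weights shown are random placeholders.

```python
import numpy as np

# Minimal neural cellular automaton step (generic, software-only sketch; the
# paper realizes this kind of local, recurrent update in photonic hardware).
# Each cell's next state is a learned function of its 3x3 neighborhood.

rng = np.random.default_rng(0)
H, W, C, HIDDEN = 32, 32, 4, 16
state = rng.normal(size=(H, W, C))

# Randomly initialized "rule" weights; in practice these would be trained.
w1 = rng.normal(scale=0.1, size=(3 * 3 * C, HIDDEN))
w2 = rng.normal(scale=0.1, size=(HIDDEN, C))

def nca_step(state: np.ndarray) -> np.ndarray:
    padded = np.pad(state, ((1, 1), (1, 1), (0, 0)), mode="wrap")
    new_state = np.empty_like(state)
    for i in range(H):
        for j in range(W):
            patch = padded[i:i + 3, j:j + 3, :].reshape(-1)   # 3x3 neighborhood
            hidden = np.tanh(patch @ w1)                       # small learned rule
            new_state[i, j] = state[i, j] + hidden @ w2        # residual update
    return new_state

state = nca_step(state)
print(state.shape)
```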

