Imaging the Meissner effect and flux trapping in a hydride superconductor at megabar pressures using a nanoscale quantum sensor

Kavli Affiliate: Joel E. Moore | First 5 Authors: Prabudhya Bhattacharyya, Wuhao Chen, Xiaoli Huang, Shubhayu Chatterjee, Benchen Huang | Summary: By directly altering microscopic interactions, pressure provides a powerful tuning knob for the exploration of condensed phases and geophysical phenomena. The megabar regime represents an exciting frontier, where recent discoveries include novel high-temperature superconductors, […]


Measurement of $C\!P$ asymmetries and branching-fraction ratios for $B^\pm \to DK^\pm$ and $D\pi^\pm$ with $D\to K^0_{\rm S} K^\pm\pi^\mp$ using Belle and Belle II data

Kavli Affiliate: T. Higuchi | First 5 Authors: Belle and Belle II Collaborations, I. Adachi, L. Aggarwal | Summary: We measure $C\!P$ asymmetries and branching-fraction ratios for $B^\pm \to DK^\pm$ and $D\pi^\pm$ decays with $D\to K^0_{\rm S} K^\pm\pi^\mp$, where $D$ is a superposition of $D^0$ and $\bar{D}^0$. We use the full data set of the […]
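As context for the quantities being measured, here is a minimal sketch of how a CP asymmetry and a DK/Dπ branching-fraction ratio are typically formed from charge-separated, efficiency-corrected signal yields. The yield numbers and efficiencies below are hypothetical placeholders, not values from the Belle/Belle II analysis.

```python
# Illustrative only: generic definitions of a CP asymmetry and a
# branching-fraction ratio from signal yields. All numbers are
# hypothetical placeholders, not results of the Belle / Belle II analysis.

def cp_asymmetry(n_minus, n_plus):
    """A_CP = (N(B-) - N(B+)) / (N(B-) + N(B+)) from charge-separated yields."""
    return (n_minus - n_plus) / (n_minus + n_plus)

def ratio_dk_over_dpi(n_dk, n_dpi, eff_dk=1.0, eff_dpi=1.0):
    """R = [N(DK)/eff(DK)] / [N(Dpi)/eff(Dpi)], an efficiency-corrected yield ratio."""
    return (n_dk / eff_dk) / (n_dpi / eff_dpi)

if __name__ == "__main__":
    # Hypothetical background-subtracted yields for B- and B+ candidates.
    print("A_CP     =", round(cp_asymmetry(n_minus=520.0, n_plus=480.0), 3))
    print("R_DK/Dpi =", round(ratio_dk_over_dpi(n_dk=100.0, n_dpi=1300.0), 4))
```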


Search for a long-lived spin-0 mediator in $b\to s$ transitions at the Belle II experiment

Kavli Affiliate: T. Higuchi | First 5 Authors: Belle II Collaboration, I. Adachi, K. Adamczyk, L. Aggarwal, H. Aihara | Summary: Additional spin-0 particles appear in many extensions of the standard model. We search for long-lived spin-0 particles $S$ in $B$-meson decays mediated by a $b\to s$ quark transition in $e^+e^-$ collisions at the $\Upsilon(4S)$ […]
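For a long-lived-particle search of this kind, the acceptance depends on the probability that $S$ decays inside a fiducial region of the detector; the sketch below shows only the standard exponential decay-length relation. The lifetime, kinematics, and boundary radii are made-up illustration values, not Belle II geometry or results.

```python
# Illustrative only: probability that a long-lived particle with proper lifetime
# tau decays between radii r_in and r_out from its production point.
# The kinematics and boundaries below are invented, not Belle II geometry.
import math

C_LIGHT = 29.9792458  # cm / ns

def decay_probability(tau_ns, p_gev, mass_gev, r_in_cm, r_out_cm):
    """P(decay in [r_in, r_out]) = exp(-r_in/L) - exp(-r_out/L), with L = beta*gamma*c*tau."""
    beta_gamma = p_gev / mass_gev                  # p/m = beta*gamma
    decay_length = beta_gamma * C_LIGHT * tau_ns   # lab-frame mean decay length in cm
    return math.exp(-r_in_cm / decay_length) - math.exp(-r_out_cm / decay_length)

if __name__ == "__main__":
    # Hypothetical mediator: m = 1 GeV, p = 2 GeV, tau = 1 ns, decay required
    # between 1 cm and 80 cm of the interaction point.
    print(decay_probability(tau_ns=1.0, p_gev=2.0, mass_gev=1.0, r_in_cm=1.0, r_out_cm=80.0))
```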


Early Rumor Detection Using Neural Hawkes Process with a New Benchmark Dataset

Kavli Affiliate: Wei Gao | First 5 Authors: Fengzhu Zeng, Wei Gao | Summary: Little attention has been paid to EArly Rumor Detection (EARD), and EARD performance has been evaluated inappropriately on a few datasets where the actual early-stage information is largely missing. To reverse this situation, we construct BEARD, a new Benchmark dataset […]
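To make the "Neural Hawkes Process" ingredient concrete, here is a minimal sketch of the classical self-exciting Hawkes intensity that such models generalize; a neural Hawkes process replaces the fixed exponential kernel with a learned recurrent state. The parameters and timestamps are invented for illustration and are not from the paper or the BEARD dataset.

```python
# Illustrative only: a classical Hawkes-process intensity with an exponential
# kernel, lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
# A *neural* Hawkes process learns this excitation with a recurrent hidden
# state instead; parameters and timestamps here are invented.
import math

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.8, beta=1.0):
    """Self-exciting intensity at time t given past event (e.g., repost) times."""
    excitation = sum(math.exp(-beta * (t - ti)) for ti in event_times if ti < t)
    return mu + alpha * excitation

if __name__ == "__main__":
    reposts = [0.1, 0.3, 0.35, 1.2]   # hypothetical early repost times (hours)
    for t in (0.5, 1.0, 2.0, 5.0):
        print(f"lambda({t}) = {hawkes_intensity(t, reposts):.3f}")
```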


MotionTrack: Learning Motion Predictor for Multiple Object Tracking

Kavli Affiliate: Xiang Zhang | First 5 Authors: Changcheng Xiao, Qiong Cao, Yujie Zhong, Long Lan, Xiang Zhang | Summary: Significant progress has been achieved in multi-object tracking (MOT) through the evolution of detection and re-identification (ReID) techniques. Despite these advancements, accurately tracking objects in scenarios with homogeneous appearance and heterogeneous motion remains a challenge. […]
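As a point of reference for what a "motion predictor" does in MOT, below is a minimal constant-velocity predictor of the kind that learned predictors such as MotionTrack aim to improve on in scenes with heterogeneous motion. The box format and update rule are generic illustration choices, not the paper's architecture.

```python
# Illustrative only: a constant-velocity motion predictor for MOT, the simple
# baseline that learned predictors aim to improve on. Box format (cx, cy, w, h)
# and the update rule are generic choices, not MotionTrack's design.

class ConstantVelocityPredictor:
    def __init__(self):
        self.prev_box = None
        self.velocity = (0.0, 0.0, 0.0, 0.0)

    def update(self, box):
        """Store the latest observed box and refresh the per-component velocity."""
        if self.prev_box is not None:
            self.velocity = tuple(b - p for b, p in zip(box, self.prev_box))
        self.prev_box = box

    def predict(self):
        """Extrapolate the next-frame box by one velocity step."""
        return tuple(p + v for p, v in zip(self.prev_box, self.velocity))

if __name__ == "__main__":
    tracker = ConstantVelocityPredictor()
    tracker.update((100.0, 50.0, 20.0, 40.0))   # frame t-1 (hypothetical detection)
    tracker.update((104.0, 52.0, 20.0, 40.0))   # frame t
    print(tracker.predict())                    # expected box at frame t+1
```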


Prompt to be Consistent is Better than Self-Consistent? Few-Shot and Zero-Shot Fact Verification with Pre-trained Language Models

Kavli Affiliate: Wei Gao | First 5 Authors: Fengzhu Zeng, Wei Gao | Summary: Few-shot or zero-shot fact verification relies on only a few or no labeled training examples. In this paper, we propose a novel method called ProToCo, to Prompt pre-trained language models (PLMs) To be Consistent, for improving the factuality assessment […]
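To illustrate the general idea of prompting for consistency in fact verification, here is a hedged sketch that queries a model with several prompt variants of the same (claim, evidence) pair and keeps the majority label. The `query_model` function is a hypothetical stand-in for a PLM call, and this majority-vote scheme is only a crude stand-in for ProToCo's consistency objective, which the truncated summary does not spell out.

```python
# Illustrative only: aggregate verdicts over several prompt variants of the same
# (claim, evidence) pair and keep the majority label. `query_model` is a trivial
# stub standing in for a real PLM; this is not ProToCo's actual training objective.
from collections import Counter

PROMPT_VARIANTS = [
    "Evidence: {evidence}\nClaim: {claim}\nIs the claim supported, refuted, or unverifiable?",
    "Based on: {evidence}\nIs it true that {claim}? Answer supported, refuted, or unverifiable.",
    "Claim: {claim}\nEvidence: {evidence}\nVerdict (supported / refuted / unverifiable):",
]

def query_model(prompt: str) -> str:
    """Stand-in for a real PLM call; a fixed answer so the sketch executes."""
    return "supported"

def verify(claim: str, evidence: str) -> str:
    """Majority vote over prompt variants as a crude consistency check."""
    votes = [query_model(p.format(claim=claim, evidence=evidence)) for p in PROMPT_VARIANTS]
    return Counter(votes).most_common(1)[0][0]

if __name__ == "__main__":
    print(verify(claim="The Eiffel Tower is in Paris.",
                 evidence="The Eiffel Tower is a landmark on the Champ de Mars in Paris."))
```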


Judging the difficulty of perceptual decisions

Kavli Affiliate: Michael Shadlen | Authors: Anne Löffler, Ariel Zylberberg, Michael N. Shadlen and Daniel M. Wolpert | Summary: Deciding how difficult it is going to be to perform a task allows us to choose between tasks, allocate appropriate resources, and predict future performance. To be useful for planning, difficulty judgments should not require completion […]


Random noise promotes slow heterogeneous synaptic dynamics important for robust working memory computation

Kavli Affiliate: Terrence Sejnowski | Authors: Nuttida Rungratsameetaweemana, Robert Kim, Thiparat Chotibut and Terrence Sejnowski | Summary: Recurrent neural networks (RNNs) based on model neurons that communicate via continuous signals have been widely used to study how cortical neurons perform cognitive tasks. Training such networks to perform tasks that require information maintenance over a brief […]
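As a generic illustration of the class of model this study concerns, here is one Euler step of a continuous-rate RNN with random noise added to the recurrent input. The network size, time constant, and noise level are arbitrary illustration values, and the slow heterogeneous synaptic dynamics the paper actually analyzes are not reproduced here.

```python
# Illustrative only: one Euler step of a continuous-rate RNN with additive noise
# in the recurrent input, tau * dx/dt = -x + W_rec r + W_in u + noise, r = tanh(x).
# Sizes, tau, and noise level are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(0)
N, N_IN = 64, 2                     # recurrent units, input channels
TAU, DT, SIGMA = 100.0, 5.0, 0.1    # time constant (ms), step (ms), noise std

W_rec = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
W_in = rng.normal(0.0, 1.0, size=(N, N_IN))

def step(x, u):
    """Advance the network state x by one Euler step given input u."""
    r = np.tanh(x)
    noise = SIGMA * rng.normal(size=N)
    dx = (-x + W_rec @ r + W_in @ u + noise) * (DT / TAU)
    return x + dx

x = np.zeros(N)
for t in range(200):                # brief stimulus, then a delay period
    u = np.array([1.0, 0.0]) if t < 20 else np.zeros(N_IN)
    x = step(x, u)
print("mean rate after delay:", float(np.tanh(x).mean()))
```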

