Kavli Affiliate: Wei Gao | First 5 Authors: Fengzhu Zeng, Wei Gao | Summary: Few-shot or zero-shot fact verification relies on only a few or no labeled training examples. In this paper, we propose a novel method called ProToCo, to Prompt pre-trained language models (PLMs) To be Consistent, for improving the factuality assessment […]
Title: Prompt to be Consistent is Better than Self-Consistent? Few-Shot and Zero-Shot Fact Verification with Pre-trained Language Models