Improved Paraphrase Generation via Controllable Latent Diffusion

Kavli Affiliate: Jia Liu

| First 5 Authors: Wei Zou, Ziyuan Zhuang, Xiang Geng, Shujian Huang, Jia Liu

| Summary:

Paraphrase generation strives to generate high-quality and diverse
expressions of a given text, a domain where diffusion models excel. Though SOTA
diffusion generation reconciles generation quality and diversity, textual
diffusion suffers from a truncation issue that hinders efficiency and quality
control. In this work, we propose the Latent Diffusion Paraphraser (LDP), a novel
paraphrase generation method that models a controllable diffusion process over a
learned latent space. LDP achieves superior generation efficiency compared to its
diffusion counterparts. It can further use only input segments to ensure paraphrase
semantics, improving the results without external features. Experiments show that
LDP reconciles paraphrase generation quality and diversity better than baselines.
Further analysis shows that our method also benefits similar text generation tasks
and domain adaptation.
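
The summary describes the recipe only at a high level: run the diffusion process in a learned latent space and condition it on the input to control paraphrase semantics. The sketch below illustrates that general idea in PyTorch; it is not the paper's actual LDP implementation, and every name in it (LatentParaphraser, q_sample, the GRU conditioning encoder, the epsilon-prediction loss) is an illustrative assumption.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentParaphraser(nn.Module):
    """Toy latent-space diffusion model for paraphrasing (epsilon-prediction)."""
    def __init__(self, vocab_size=32000, dim=256, steps=100):
        super().__init__()
        self.steps = steps
        self.embed = nn.Embedding(vocab_size, dim)          # token -> latent embedding
        self.encoder = nn.GRU(dim, dim, batch_first=True)   # encodes the source as conditioning
        self.denoiser = nn.Sequential(                      # predicts the injected noise
            nn.Linear(dim * 2 + 1, dim), nn.GELU(), nn.Linear(dim, dim))
        self.decoder = nn.Linear(dim, vocab_size)           # latent -> token logits (used at inference)
        # Linear noise schedule; alphas_cumprod controls how corrupted z_t is at step t.
        betas = torch.linspace(1e-4, 0.02, steps)
        self.register_buffer("alphas_cumprod", torch.cumprod(1.0 - betas, dim=0))

    def q_sample(self, z0, t, noise):
        """Forward diffusion: corrupt clean latents z0 at timestep t."""
        a = self.alphas_cumprod[t].view(-1, 1, 1)
        return a.sqrt() * z0 + (1.0 - a).sqrt() * noise

    def forward(self, src_ids, tgt_ids):
        """Training loss: predict the noise added to target latents, conditioned on the source."""
        cond, _ = self.encoder(self.embed(src_ids))          # (B, T, dim) source condition
        z0 = self.embed(tgt_ids)                             # (B, T, dim) clean target latents
        t = torch.randint(0, self.steps, (src_ids.size(0),), device=src_ids.device)
        noise = torch.randn_like(z0)
        zt = self.q_sample(z0, t, noise)
        t_feat = (t.float() / self.steps).view(-1, 1, 1).expand(-1, zt.size(1), 1)
        pred = self.denoiser(torch.cat([zt, cond, t_feat], dim=-1))
        return F.mse_loss(pred, noise)

# Toy usage: batch of 2 random "sentences" of length 16; in this sketch, source and
# target are assumed to be padded to the same length so their latents can be concatenated.
model = LatentParaphraser()
src = torch.randint(0, 32000, (2, 16))
tgt = torch.randint(0, 32000, (2, 16))
print(model(src, tgt).item())
```

The key design point this sketch tries to convey is that the denoising happens over continuous latents rather than tokens, with the source latents supplied as conditioning, which is what makes it possible to steer the process toward paraphrases of the input.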

| Search Query: ArXiv Query: search_query=au:"Jia Liu"&id_list=&start=0&max_results=3
