Enhancing Paraphrase Generation via Controllable Latent Diffusion

Kavli Affiliate: Jia Liu

| First 5 Authors: Wei Zou, Ziyuan Zhuang, Shujian Huang, Jia Liu, Jiajun Chen

| Summary:

Paraphrase generation aims to produce high-quality and diverse utterances of
a given text. Though state-of-the-art generation via the diffusion model
reconciles generation quality and diversity, textual diffusion suffers from a
truncation issue that hinders efficiency and quality control. In this work, we
propose Latent Diffusion Paraphraser (LDP), a novel paraphrase
generation method that models a controllable diffusion process over a
learned latent space. LDP achieves superior generation efficiency compared to
its diffusion counterparts. It requires only input segments to enforce
paraphrase semantics, which further improves results without external
features. Experiments show that LDP achieves improved and diverse paraphrase
generation compared to baselines. Further analysis shows that our method is
also helpful for similar text generation tasks and domain adaptation. Our code
and data are available at https://github.com/NIL-zhuang/ld4pg.
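
Below is a minimal sketch of the core idea the summary describes: running a Gaussian diffusion process over a learned latent space rather than over token sequences. It is illustrative only and assumes plain PyTorch with a toy denoiser and random stand-in latents; the paper's encoder-decoder latent space, truncation-free decoding, and segment-based semantic control are not reproduced here.

```python
# Illustrative sketch of diffusion in a latent space (not the authors' implementation).
# In LDP, z0 would come from a learned text encoder and the denoised latent would be
# decoded back into a paraphrase; both ends are replaced by stand-ins below.
import torch
import torch.nn as nn

T = 100                                   # number of diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)     # standard linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

class Denoiser(nn.Module):
    """Predicts the noise that was added to a latent z_t at step t."""
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 256), nn.SiLU(), nn.Linear(256, dim))

    def forward(self, z_t, t):
        t_feat = (t.float() / T).unsqueeze(-1)   # simple scalar time conditioning
        return self.net(torch.cat([z_t, t_feat], dim=-1))

def q_sample(z0, t, noise):
    """Forward process: z_t = sqrt(a_bar_t) * z0 + sqrt(1 - a_bar_t) * noise."""
    a_bar = alphas_bar[t].unsqueeze(-1)
    return a_bar.sqrt() * z0 + (1.0 - a_bar).sqrt() * noise

# One training step: z0 stands in for sentence latents produced by a text encoder.
model = Denoiser()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
z0 = torch.randn(8, 64)                   # batch of 8 toy latents of width 64
t = torch.randint(0, T, (8,))
noise = torch.randn_like(z0)
loss = ((model(q_sample(z0, t, noise), t) - noise) ** 2).mean()
loss.backward()
opt.step()
```

Working in a continuous latent space is what the summary credits for avoiding the token-level truncation issue of textual diffusion: generation reduces to iteratively denoising a latent, which is decoded into text once at the end.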

| Search Query: ArXiv Query: search_query=au:"Jia Liu"&id_list=&start=0&max_results=3
