Diversified Paraphrase Generation with Commonsense Knowledge Graph

Abstract

Paraphrases are texts with different expressions that convey the same meaning, and paraphrase generation is usually modeled as a sequence-to-sequence (Seq2Seq) learning problem. Traditional Seq2Seq models mainly concentrate on fidelity while ignoring the diversity of paraphrases. Although recent studies have begun to focus on the diversity of generated paraphrases, they either adopt inflexible control mechanisms or are restricted to synonym and topic knowledge. In this paper, we propose the KnowledgE-Enhanced Paraphraser (KEEP) for diversified paraphrase generation, which leverages a commonsense knowledge graph to explicitly enrich the expressions of paraphrases. Specifically, KEEP retrieves word-level and phrase-level knowledge from an external knowledge graph and learns to choose the more relevant pieces with a graph attention mechanism. Extensive experiments on paraphrase generation benchmarks demonstrate the strengths of our proposed model, especially in diversity, compared with several strong baselines.
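The abstract describes selecting relevant retrieved knowledge with a graph attention mechanism. As a rough illustration only (the paper's actual architecture is not reproduced here), the sketch below shows how retrieved knowledge embeddings can be scored against a decoder state and pooled into a context vector. The class name `KnowledgeAttention`, the dimensions, and the simple dot-product scoring are all hypothetical assumptions, not KEEP's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KnowledgeAttention(nn.Module):
    """Hypothetical sketch: attend over retrieved knowledge-graph items."""

    def __init__(self, hidden_dim: int, knowledge_dim: int):
        super().__init__()
        # Project knowledge embeddings into the decoder's hidden space.
        self.proj = nn.Linear(knowledge_dim, hidden_dim, bias=False)

    def forward(self, decoder_state, knowledge_embs, mask=None):
        # decoder_state:  (batch, hidden_dim)
        # knowledge_embs: (batch, num_items, knowledge_dim) retrieved word/phrase knowledge
        keys = self.proj(knowledge_embs)                                     # (batch, num_items, hidden_dim)
        scores = torch.bmm(keys, decoder_state.unsqueeze(-1)).squeeze(-1)    # (batch, num_items)
        if mask is not None:
            # Mask out padded knowledge slots before normalization.
            scores = scores.masked_fill(~mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)                                  # relevance of each knowledge item
        context = torch.bmm(weights.unsqueeze(1), keys).squeeze(1)           # (batch, hidden_dim)
        return context, weights


# Toy usage with made-up sizes: 2 sentences, 8 retrieved items each.
attn = KnowledgeAttention(hidden_dim=512, knowledge_dim=300)
state = torch.randn(2, 512)
kg = torch.randn(2, 8, 300)
ctx, w = attn(state, kg)   # ctx: (2, 512), w: (2, 8)
```

In the actual model, such a context vector would typically be fused with the decoder state when generating each token, so that retrieved knowledge can surface alternative wordings; the sketch only covers the scoring-and-pooling step.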

Type: Publication
Publication: The 10th CCF International Conference on Natural Language Processing and Chinese Computing (NLPCC 2021), oral
Jiangjie Chen
Researcher

His research interests mainly include large models and their reasoning and planning abilities.