Theme II - Towards Factual, Controllable and Versatile Text Generation 🤖

The success of deep text generation is held back by its lack of factuality and its difficulty to control. How can we unleash the power of text generation models while keeping them from producing gibberish? In this research theme, we explore the potential of text generation w.r.t. factuality and controllability.

One key approach to these goals is to incorporate various forms of prior knowledge into neural models, such as logical rules, templates, and external knowledge bases. Exemplar papers in this theme include:

Jiangjie Chen
Researcher

His research interests mainly focus on large models and their reasoning and planning abilities.