Authors
Chen Xing, Wei Wu, Yu Wu, Jie Liu, Yalou Huang, Ming Zhou, Wei-Ying Ma
Publication date
2017
Conference
Proc. AAAI
Pages
3351-3357
Publisher
AAAI
Description
We consider incorporating topic information into a sequence-to-sequence framework to generate informative and interesting responses for chatbots. To this end, we propose a topic-aware sequence-to-sequence (TA-Seq2Seq) model. The model uses topics to simulate the prior knowledge that guides people to form informative and interesting responses in conversation, and leverages topic information in generation through a joint attention mechanism and a biased generation probability. The joint attention mechanism summarizes the hidden vectors of an input message into context vectors via message attention, and synthesizes topic vectors via topic attention over the message's topic words obtained from a pre-trained LDA model; these vectors jointly affect word generation during decoding. To increase the chance of topic words appearing in responses, the model modifies the generation probability of topic words by adding an extra probability term that biases the overall distribution. Empirical studies with both automatic evaluation metrics and human annotations show that TA-Seq2Seq generates more informative and interesting responses, significantly outperforming state-of-the-art response generation models.
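The description mentions two mechanisms: joint attention (message attention plus topic attention) and a biased generation probability for topic words. The sketch below is a minimal, hypothetical illustration of a single decoding step in that spirit, written in PyTorch; it is not the authors' implementation, and all module and parameter names (e.g. JointAttentionDecoderStep, topic_out) are assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class JointAttentionDecoderStep(nn.Module):
    """One decoding step of a TA-Seq2Seq-style decoder (illustrative sketch only).

    Combines a message context vector (attention over encoder hidden states)
    with a topic vector (attention over embeddings of LDA topic words), then
    adds extra probability mass to topic words in the output distribution.
    """

    def __init__(self, hidden_size, vocab_size):
        super().__init__()
        self.msg_attn = nn.Linear(hidden_size * 2, 1)    # scores message positions
        self.topic_attn = nn.Linear(hidden_size * 2, 1)  # scores topic words
        # Decoder cell consumes previous word embedding + context + topic vector.
        self.cell = nn.GRUCell(hidden_size * 3, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)        # standard vocabulary logits
        self.topic_out = nn.Linear(hidden_size, vocab_size)  # extra logits, used for topic words only

    def attend(self, scorer, query, keys):
        # query: (B, H), keys: (B, T, H) -> attention-weighted sum of keys, shape (B, H)
        q = query.unsqueeze(1).expand(-1, keys.size(1), -1)
        scores = scorer(torch.cat([q, keys], dim=-1)).squeeze(-1)  # (B, T)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights.unsqueeze(1), keys).squeeze(1)

    def forward(self, prev_emb, state, enc_states, topic_embs, topic_mask):
        # prev_emb: (B, H) embedding of previous output word
        # enc_states: (B, T_msg, H) encoder hidden states; topic_embs: (B, T_topic, H)
        # topic_mask: (B, vocab_size) bool, True where the word is a topic word
        context = self.attend(self.msg_attn, state, enc_states)      # message attention
        topic_vec = self.attend(self.topic_attn, state, topic_embs)  # topic attention
        state = self.cell(torch.cat([prev_emb, context, topic_vec], dim=-1), state)

        # Biased generation: standard distribution plus an extra distribution
        # restricted to topic words (assumes each example has >= 1 topic word).
        logits = self.out(state)
        bias = self.topic_out(state).masked_fill(~topic_mask, float("-inf"))
        probs = F.softmax(logits, dim=-1) + F.softmax(bias, dim=-1)
        return probs / probs.sum(dim=-1, keepdim=True), state
```

In this sketch the extra term is simply renormalized into the final distribution, which raises the probability of topic words relative to the rest of the vocabulary; the exact form of the bias in the paper may differ.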
Total citations
[Per-year citation counts, 2016–2024]
Scholar articles
C Xing, W Wu, Y Wu, J Liu, Y Huang, M Zhou, WY Ma - Proceedings of the AAAI conference on artificial …, 2017
C Xing, W Wu, Y Wu, J Liu, Y Huang, M Zhou, WY Ma - arXiv preprint arXiv:1606.08340, 2016