Monday, December 21, 2015

Backbone Language Modeling for Constrained Natural Language Generation. (arXiv:1512.06612v1 [cs.CL])

Recent language models, especially those based on recurrent neural networks (RNNs), make it possible to generate natural language from a learned probability distribution. Language generation has wide applications, including machine translation, summarization, question answering, and conversation systems. Existing methods typically learn a joint probability of words conditioned on additional information, which is fed (either statically or dynamically) into the RNN's hidden layer. In many applications, however, we want to impose hard constraints on the generated text, such as requiring that a particular word appear in the sentence. Unfortunately, existing methods cannot handle such constraints. In this paper, we propose a backbone language model (backbone LM) for constrained language generation. Given a specific word, our model generates the preceding and following words simultaneously, so the given word can appear at any position in the sentence. Experimental results show that the generated texts are coherent and fluent.
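
To make the "generate outward from a given word" idea concrete, here is a minimal Python sketch. It is not the paper's backbone LM: instead of an RNN it uses tiny hand-written bigram tables (FORWARD and BACKWARD are illustrative placeholders, not trained parameters), but it shows the same mechanism of sampling the prefix with a backward model and the suffix with a forward model around a constrained word.

import random

# Toy forward bigram model: P(next_word | current_word).
# Illustrative placeholder probabilities, not trained values.
FORWARD = {
    "<s>":     {"the": 0.6, "a": 0.4},
    "the":     {"cat": 0.5, "dog": 0.5},
    "a":       {"cat": 0.5, "dog": 0.5},
    "cat":     {"sat": 0.6, "ran": 0.4},
    "dog":     {"sat": 0.4, "ran": 0.6},
    "sat":     {"quietly": 0.5, "</s>": 0.5},
    "ran":     {"quickly": 0.5, "</s>": 0.5},
    "quietly": {"</s>": 1.0},
    "quickly": {"</s>": 1.0},
}

# Toy backward bigram model: P(previous_word | current_word).
BACKWARD = {
    "quietly": {"sat": 1.0},
    "quickly": {"ran": 1.0},
    "sat":     {"cat": 0.6, "dog": 0.4},
    "ran":     {"cat": 0.4, "dog": 0.6},
    "cat":     {"the": 0.5, "a": 0.5},
    "dog":     {"the": 0.5, "a": 0.5},
    "the":     {"<s>": 1.0},
    "a":       {"<s>": 1.0},
}

def sample(dist):
    """Draw one word from a {word: probability} table."""
    words, probs = zip(*dist.items())
    return random.choices(words, weights=probs, k=1)[0]

def generate_with_backbone(word, max_len=10):
    """Grow a sentence outward from the constrained word:
    the backward model proposes the prefix, the forward model the suffix."""
    prefix, cur = [], word
    while cur != "<s>" and len(prefix) < max_len:
        cur = sample(BACKWARD[cur])
        if cur != "<s>":
            prefix.append(cur)
    suffix, cur = [], word
    while cur != "</s>" and len(suffix) < max_len:
        cur = sample(FORWARD[cur])
        if cur != "</s>":
            suffix.append(cur)
    return list(reversed(prefix)) + [word] + suffix

print(" ".join(generate_with_backbone("cat")))  # e.g. "the cat sat quietly"

Because the sentence is built from the constrained word outward rather than left to right, the word's final position depends only on how many prefix words the backward pass produces, which is the property the abstract highlights.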


from cs.AI updates on arXiv.org http://ift.tt/1TdPAqC
via IFTTT
