Sunday, October 16, 2016

Language as a Latent Variable: Discrete Generative Models for Sentence Compression. (arXiv:1609.07317v2 [cs.CL] UPDATED)

In this work we explore deep generative models of text in which the latent representation of a document is itself drawn from a discrete language model distribution. We formulate a variational auto-encoder for inference in this model and apply it to the task of compressing sentences. In this application the generative model first draws a latent summary sentence from a background language model, and then draws the observed sentence conditioned on this latent summary. In our empirical evaluation we show that generative formulations of both abstractive and extractive compression yield state-of-the-art results when trained on a large amount of supervised data. Further, we explore semi-supervised compression scenarios, showing that performance competitive with previously proposed supervised models can be achieved while training on only a fraction of the supervised data.
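
To make the generative story concrete, below is a minimal PyTorch sketch (not the authors' code) of the three components it implies: a language-model prior p(c) over latent summaries, a reconstruction model p(s|c) that expands a summary back into a sentence, and an inference model q(c|s) that compresses a sentence. Because the latent summary c is a discrete token sequence, the ELBO cannot be reparameterised, so the inference network is trained with score-function (REINFORCE) gradients, as the paper does; the module names, sizes, and the single-sample, baseline-free estimator here are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID = 10_000, 128, 256   # illustrative sizes
BOS = 1                              # assumed begin-of-sequence token id


class LanguageModelPrior(nn.Module):
    """p(c): autoregressive language model scoring a latent summary c."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def log_prob(self, c):
        # Teacher-forced log p(c): condition each token on its prefix.
        inp = torch.cat([torch.full_like(c[:, :1], BOS), c[:, :-1]], dim=1)
        h, _ = self.rnn(self.emb(inp))
        logp = F.log_softmax(self.out(h), dim=-1)
        return logp.gather(-1, c.unsqueeze(-1)).squeeze(-1).sum(-1)


class Seq2Seq(nn.Module):
    """Encoder-decoder, reused for q(c|s) and for p(s|c)."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.enc = nn.GRU(EMB, HID, batch_first=True)
        self.dec = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def log_prob(self, src, tgt):
        # log p(tgt | src) under teacher forcing.
        _, h = self.enc(self.emb(src))
        inp = torch.cat([torch.full_like(tgt[:, :1], BOS), tgt[:, :-1]], dim=1)
        o, _ = self.dec(self.emb(inp), h)
        logp = F.log_softmax(self.out(o), dim=-1)
        return logp.gather(-1, tgt.unsqueeze(-1)).squeeze(-1).sum(-1)

    def sample(self, src, steps):
        # Ancestral sampling of a discrete output sequence, keeping its
        # log-probability so the score-function gradient can flow to q.
        _, h = self.enc(self.emb(src))
        tok = torch.full((src.size(0), 1), BOS, dtype=torch.long,
                         device=src.device)
        toks, logps = [], []
        for _ in range(steps):
            o, h = self.dec(self.emb(tok), h)
            dist = torch.distributions.Categorical(logits=self.out(o[:, -1]))
            sample = dist.sample()
            logps.append(dist.log_prob(sample))
            tok = sample.unsqueeze(1)
            toks.append(tok)
        return torch.cat(toks, dim=1), torch.stack(logps, dim=1).sum(-1)


prior = LanguageModelPrior()   # the background LM; the paper pretrains it
q_compress = Seq2Seq()         # inference model q(c | s): compresses s
p_reconstruct = Seq2Seq()      # generative model p(s | c): expands c back


def neg_elbo(sentence, summary_len=8):
    # Single-sample REINFORCE estimate of -ELBO (no baseline, for brevity).
    c, logq = q_compress.sample(sentence, summary_len)   # c ~ q(c | s)
    logp = p_reconstruct.log_prob(c, sentence) + prior.log_prob(c)
    reward = (logp - logq).detach()     # ELBO sample as learning signal
    # reward * logq trains q via REINFORCE; logp trains decoder and prior.
    return -(reward * logq + logp).mean()

As a usage sketch, neg_elbo(torch.randint(2, VOCAB, (4, 20))).backward() runs one unsupervised step on a random batch. In practice the paper adds variance-reduction baselines to the REINFORCE estimator, and in the semi-supervised setting combines this objective with a standard supervised loss on labelled sentence-summary pairs.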



from cs.AI updates on arXiv.org http://ift.tt/2d2aJ9M
via IFTTT