
Tuesday, November 8, 2016

A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks. (arXiv:1611.01587v1 [cs.CL])

Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks. Ideally, the linguistic levels of morphology, syntax and semantics would benefit each other by being trained in a single model. We introduce such a joint many-task model together with a strategy for successively growing its depth to solve increasingly complex tasks. All layers include shortcut connections to both word representations and lower-level task predictions. We use a simple regularization term that allows optimizing all model weights to improve one task's loss without catastrophic interference with the other tasks. Our single end-to-end trainable model obtains state-of-the-art results on chunking, dependency parsing, semantic relatedness and textual entailment. It also performs competitively on POS tagging. Our dependency parsing layer relies only on a single feed-forward pass and does not require a beam search.
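The "simple regularization term" refers to a successive-regularization penalty: before training a higher-level task, snapshot the shared lower-layer weights, then penalize their squared drift from that snapshot. Below is a minimal PyTorch sketch of that idea; the function name, the `delta` value, and the commented model variables are illustrative assumptions, not the paper's exact setup.

```python
# Sketch of successive regularization: task_loss + delta * ||theta - theta_snapshot||^2
# over the shared lower-layer parameters, so improving one task's loss does not
# catastrophically interfere with tasks already learned at lower layers.
import torch


def successive_reg_loss(task_loss, shared_module, snapshot, delta=1e-2):
    """Add a squared-drift penalty on shared params toward a pre-epoch snapshot.

    task_loss     -- scalar loss tensor for the task currently being trained
    shared_module -- nn.Module holding the shared lower-layer parameters
    snapshot      -- list of detached parameter copies taken before this epoch
    delta         -- penalty strength (an assumed value, not from the paper)
    """
    penalty = 0.0
    for p, p_old in zip(shared_module.parameters(), snapshot):
        penalty = penalty + ((p - p_old) ** 2).sum()
    return task_loss + delta * penalty


# Hypothetical usage: snapshot shared weights, train one task, regularize drift.
# shared = jmt_model.lower_layers                       # assumed attribute
# snapshot = [p.detach().clone() for p in shared.parameters()]
# loss = successive_reg_loss(chunking_loss, shared, snapshot)
# loss.backward()
```

Keeping the snapshot detached is what makes this a pure anchor: gradients flow only through the live parameters, pulling them back toward where they were before the current task's epoch.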



from cs.AI updates on arXiv.org http://ift.tt/2fVB5xW
via IFTTT
