Unsupervised Pre-training With Seq2Seq Reconstruction Loss for Deep Relation Extraction Models
Published in ALTA, 2016
Recommended citation: Li, Z., Qu, L., Xu, Q., & Johnson, M. (2016). Unsupervised Pre-training With Seq2Seq Reconstruction Loss for Deep Relation Extraction Models. In Proceedings of the Australasian Language Technology Association Workshop 2016. https://www.aclweb.org/anthology/U16-1006/
Relation extraction models based on deep learning have attracted a lot of attention recently, yet little research has been carried out on reducing their need for labeled training data. In this work, we propose an unsupervised pre-training method for deep relation extraction models based on the sequence-to-sequence model. The pre-trained models need only half, or even less, of the training data to achieve performance equivalent to that of the same models without pre-training.
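To illustrate the general idea, the sketch below shows seq2seq reconstruction pre-training on unlabeled sentences: an encoder reads a sentence and a decoder is trained to reconstruct it, so the encoder can later be reused in a supervised relation extraction model. This is a minimal sketch assuming a PyTorch LSTM encoder-decoder; the module names and hyperparameters are illustrative, not the exact architecture from the paper.

import torch
import torch.nn as nn

VOCAB, EMB, HID = 10000, 128, 256

class Seq2SeqAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.LSTM(EMB, HID, batch_first=True)
        self.decoder = nn.LSTM(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, tokens):
        # Encode the unlabeled sentence, then decode it back from the final
        # encoder state; the reconstruction loss requires no relation labels.
        emb = self.embed(tokens)
        _, state = self.encoder(emb)
        dec_out, _ = self.decoder(emb, state)  # teacher forcing on the same tokens
        return self.out(dec_out)

model = Seq2SeqAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One pre-training step on a batch of unlabeled token ids (batch, seq_len);
# random ids stand in for real corpus data here.
tokens = torch.randint(0, VOCAB, (32, 20))
logits = model(tokens)
loss = criterion(logits.view(-1, VOCAB), tokens.view(-1))
loss.backward()
optimizer.step()

# After pre-training, model.embed and model.encoder would be copied into the
# relation extraction model and fine-tuned on the (smaller) labeled data set.

In this setup only the encoder (and embeddings) carry over to the downstream model; the decoder exists solely to provide the unsupervised reconstruction signal.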
Citation:
@inproceedings{li-etal-2016-unsupervised,
    title = "Unsupervised Pre-training With {S}eq2{S}eq Reconstruction Loss for Deep Relation Extraction Models",
    author = "Li, Zhuang and
      Qu, Lizhen and
      Xu, Qiongkai and
      Johnson, Mark",
    booktitle = "Proceedings of the Australasian Language Technology Association Workshop 2016",
    month = dec,
    year = "2016",
    address = "Melbourne, Australia",
    url = "https://www.aclweb.org/anthology/U16-1006",
    pages = "54--64",
}