Multi-level Attention-Based Neural Networks for Distant Supervised Relation Extraction

Files in This Item: insight_publication.pdf (1.34 MB, Adobe PDF)
Title: Multi-level Attention-Based Neural Networks for Distant Supervised Relation Extraction
Authors: Yang, Linyi
Ng, Tin Lok James
Mooney, Catherine
Dong, Ruihai
Permanent link:
Date: 8-Dec-2017
Abstract: We propose a multi-level attention-based neural network for relation extraction based on the work of Lin et al. to alleviate the problem of wrong labelling in distant supervision. In this paper, we first adopt gated recurrent units to represent the semantic information. Then, we introduce a customized multi-level attention mechanism, which is expected to reduce the weights of noisy words and sentences. Experimental results on a real-world dataset show that our model achieves significant improvement on relation extraction tasks compared to both traditional feature-based models and existing neural network-based methods.
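The multi-level attention described in the abstract can be sketched in two stages: word-level attention pools the GRU hidden states of each sentence into a sentence vector, and sentence-level attention pools the sentence vectors of an entity-pair bag into a single bag representation, down-weighting noisy words and sentences. The sketch below is a minimal numpy illustration of that two-stage pooling; the query vectors `q_w` and `q_s`, the dimensions, and the random hidden states are all assumptions standing in for the paper's trained GRU encoder and learned attention parameters, not the authors' actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, q_w):
    # H: (T, d) hidden states of one sentence (here random, standing in
    # for GRU outputs); q_w: (d,) word-level attention query (assumed).
    alpha = softmax(H @ q_w)      # (T,) word weights, sum to 1
    return alpha @ H              # (d,) attention-pooled sentence vector

def sentence_attention(S, q_s):
    # S: (n, d) sentence vectors of one bag; q_s: (d,) sentence-level
    # query (assumed), playing the role of Lin et al.'s relation query.
    beta = softmax(S @ q_s)       # (n,) sentence weights, sum to 1
    return beta @ S               # (d,) bag representation

rng = np.random.default_rng(0)
d = 8
# A "bag" of three sentences mentioning the same entity pair,
# with 5, 7, and 4 tokens respectively.
bag = [rng.normal(size=(t, d)) for t in (5, 7, 4)]
q_w, q_s = rng.normal(size=d), rng.normal(size=d)

S = np.stack([word_attention(H, q_w) for H in bag])  # (3, d)
bag_vec = sentence_attention(S, q_s)                 # (d,)
print(bag_vec.shape)  # (8,)
```

In a full model, `bag_vec` would be fed to a softmax classifier over relation labels, and the attention queries would be learned jointly with the GRU encoder.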
Funding Details: Science Foundation Ireland
Type of material: Conference Publication
Publisher: Insight Centre
Keywords: Relation extraction;Distant supervision;Word-level attention
Language: en
Status of Item: Peer reviewed
Conference Details: 25th Irish Conference on Artificial Intelligence and Cognitive Science, Dublin, Ireland, 7-8 December 2017
Appears in Collections:Insight Research Collection

This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland. No item may be reproduced for commercial purposes. For other possible restrictions on use please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.