Multi-level Attention-Based Neural Networks for Distant Supervised Relation Extraction
Date Issued
2017-12-08
Date Available
2018-04-09T09:26:20Z
Abstract
We propose a multi-level attention-based neural network for relation extraction based on the work of Lin et al. to alleviate the problem of wrong labelling in distant supervision. In this paper, we first adopt gated recurrent units to represent the semantic information. Then, we introduce a customized multi-level attention mechanism, which is expected to reduce the weights of noisy words and sentences. Experimental results on a real-world dataset show that our model achieves significant improvement on relation extraction tasks compared to both traditional feature-based models and existing neural network-based methods.
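The abstract describes encoding each sentence with gated recurrent units and then applying attention at two levels: over words within a sentence and over sentences within an entity-pair bag. A minimal numpy sketch of the two attention levels is given below; all names, query vectors, and dimensions are illustrative assumptions, not the paper's actual implementation, and the GRU encoder is stubbed with random token vectors.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, q_w):
    # H: (T, d) token representations (stand-in for GRU outputs);
    # q_w: (d,) word-level query. Returns a (d,) sentence vector.
    weights = softmax(H @ q_w)   # (T,) attention over words
    return weights @ H

def sentence_attention(S, q_s):
    # S: (n, d) sentence vectors for one entity-pair bag;
    # q_s: (d,) sentence-level (relation) query. Returns a (d,) bag vector.
    weights = softmax(S @ q_s)   # (n,) attention over sentences
    return weights @ S

# Toy bag: 3 sentences of 5 tokens each, dimension 8 (hypothetical sizes).
rng = np.random.default_rng(0)
d, T, n = 8, 5, 3
sentences = [rng.normal(size=(T, d)) for _ in range(n)]
q_w, q_s = rng.normal(size=d), rng.normal(size=d)

S = np.stack([word_attention(H, q_w) for H in sentences])
bag = sentence_attention(S, q_s)  # would be fed to a relation classifier
```

The intended effect is that low-weighted words and sentences (the noisy ones introduced by distant supervision's wrong-label problem) contribute less to the final bag representation.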
Sponsorship
Science Foundation Ireland
Type of Material
Conference Publication
Publisher
Insight Centre
Language
English
Status of Item
Peer reviewed
Conference Details
25th Irish Conference on Artificial Intelligence and Cognitive Science, Dublin, Ireland, 7-8 December 2017
This item is made available under a Creative Commons License
File(s)
Name
insight_publication.pdf
Size
1.31 MB
Format
Adobe PDF
Checksum (MD5)
7436981228d3b6f70630c15973f48175