Distant Supervised Relation Extraction via DiSAN-2CNN on a Feature Level

Xueqiang Lv, Huixin Hou, Xindong You, Xiaopeng Zhang, Junmei Han
Copyright: © 2020 | Pages: 17
DOI: 10.4018/IJSWIS.2020040101

Abstract

At present, the mainstream distant supervised relation extraction methods suffer from several problems: coarse granularity in encoding context feature information, difficulty in capturing long-term dependencies within a sentence, and difficulty in encoding prior structural knowledge. To address these problems, we propose a distant supervised relation extraction model via DiSAN-2CNN on the feature level, in which a multi-dimensional self-attention mechanism is utilized to encode word features and DiSAN-2CNN is used to encode the sentence, capturing the long-term dependencies, prior structural knowledge, temporal order, and entity dependencies in the sentence. Experiments conducted on the NYT-Freebase benchmark dataset demonstrate that the proposed DiSAN-2CNN on a feature level model achieves better performance than two current state-of-the-art distant supervised relation extraction models, PCNN+ATT and ResCNN-9, and that it has good generalization ability with minimal manual feature engineering.
Article Preview
Introduction

Relation extraction is one of the most important Natural Language Processing (NLP) tasks. It plays an important role in many NLP-related fields, such as building knowledge graphs and improving the accuracy and readability of machine translation, automatic question answering, and so on. Many experts and scholars have contributed to improving the efficiency of entity relation extraction. Most supervised methods for relation extraction require large amounts of manually labeled data, which demands not only substantial manpower and material resources but also the participation of domain experts. Recently, deep learning technology has been successfully applied to many NLP tasks, including machine translation, automatic question answering, part-of-speech tagging, sentiment analysis, and semantic analysis. In the past several years, distant supervision has been one of the mainstream approaches to relation extraction. Distant supervised relation extraction was first proposed by Mintz et al. at ACL 2009. Since then, many experts have conducted in-depth research in this area. Zeng et al. (2015) exploited Piecewise Convolutional Neural Networks (PCNN) with multi-instance learning (MIL) to automatically extract features from sentences and applied MIL to select the most important sentence. Lin et al. (2016) applied an attention mechanism to alleviate the interference of noise. Quirk and Poon (2016) proposed the first approach for applying distant supervision to cross-sentence relation extraction. Qu et al. (2017) presented the REPEL model to improve the performance of relation extraction. Many researchers utilize deep neural networks to automatically learn features for the relation extraction task in distant supervised methods.
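As a concrete illustration of the selective-attention idea from Lin et al. (2016) mentioned above, the following is a minimal numpy sketch, not the authors' implementation: sentence vectors, assumed to come from an encoder such as a PCNN, are weighted by their similarity to a learned relation query and pooled into a single bag representation, so that noisy sentences receive small weights.

    import numpy as np

    def selective_attention(bag, relation_query):
        """Pool a bag of sentence encodings into one vector, weighting each
        sentence by its match with a relation query (Lin et al., 2016 idea).

        bag:            (n_sentences, dim) sentence vectors, e.g. from a PCNN
        relation_query: (dim,) learned embedding of the candidate relation
        """
        scores = bag @ relation_query              # one scalar per sentence
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                   # softmax over the bag
        return weights @ bag                       # weighted pooling -> (dim,)

    # Toy usage: a bag of three (noisy) sentences mentioning one entity pair.
    rng = np.random.default_rng(0)
    bag = rng.normal(size=(3, 8))
    relation_query = rng.normal(size=8)
    print(selective_attention(bag, relation_query).shape)   # (8,)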

However, the current mainstream distant supervised relation extraction methods exhibit the following problems:

  • The granularity at which context feature information is encoded is coarse, usually just at the word level (the sketch after this list illustrates the contrast);

  • Only word distance is considered, which leads to difficulty in capturing long-term dependencies within the sentence and in encoding prior structural knowledge. For example, the experiments of Lin et al. (2016) consider only positional information.
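To make the first problem concrete: conventional attention assigns a single scalar weight to each word, so every feature dimension of that word is scaled identically, whereas feature-level (multi-dimensional) attention assigns a separate weight to each dimension of each word. Below is a minimal numpy sketch of the contrast; the scoring functions are purely illustrative:

    import numpy as np

    def softmax(a, axis):
        e = np.exp(a - a.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    rng = np.random.default_rng(1)
    tokens = rng.normal(size=(6, 4))           # 6 words, 4 feature dimensions

    # Word-level attention: one scalar weight per word (coarse granularity).
    word_weights = softmax(tokens.sum(axis=1), axis=0)     # shape (6,)
    pooled_coarse = word_weights @ tokens      # all 4 features of a word
                                               # share one weight

    # Feature-level attention: one weight per word *per* feature dimension.
    feature_weights = softmax(tokens, axis=0)              # shape (6, 4)
    pooled_fine = (feature_weights * tokens).sum(axis=0)   # each dimension
                                               # weighted independently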

This paper aims to design better models that overcome the shortcomings of the current distant supervised models and ultimately enhance performance on relation extraction tasks. Building on an in-depth study of convolutional neural networks and the Directional Self-Attention Network (DiSAN) model, we modified the structure of the Convolutional Neural Network (CNN) convolution-pooling layer to reduce the dimension of the feature vectors for different sentences and propose a distant supervised neural relation extraction method via DiSAN-2CNN on the feature level. The proposed DiSAN-2CNN model can not only encode sentence sequences on the feature level through the multi-dimensional self-attention mechanism, but also encode temporal order, entity dependency relationships, and prior knowledge, which is important for improving extraction efficiency in the field of relation extraction. In addition, inspired by the idea of the bidirectional Long Short-Term Memory (LSTM) network, we address the long-term dependency problem by adjusting the threshold of a fusion gate to control the flow of information. Experimental results on the NYT-Freebase benchmark dataset (Riedel et al., 2010) show that the proposed distant supervised relation extraction method via DiSAN-2CNN on the feature level is superior to two current state-of-the-art distant supervised methods: the 9-layer deep residual convolutional neural network (ResCNN-9) and Piecewise Convolutional Neural Networks with Attention (PCNN+ATT).
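The two mechanisms named above, multi-dimensional self-attention and the fusion gate, can be sketched compactly. The following is a minimal PyTorch sketch that follows the general DiSAN design (Shen et al., 2018); the layer sizes, parameterization, and masking convention here are illustrative assumptions, not the exact configuration of the proposed model:

    import torch
    import torch.nn as nn

    class DirectionalSelfAttention(nn.Module):
        """Illustrative DiSAN-style block: multi-dimensional alignment scores
        (one per feature dimension), a forward directional mask, and a fusion
        gate that blends each token with its attended context."""

        def __init__(self, dim):
            super().__init__()
            self.w_query = nn.Linear(dim, dim)
            self.w_key = nn.Linear(dim, dim)
            self.w_gate = nn.Linear(2 * dim, dim)

        def forward(self, x):
            # x: (seq_len, dim) word features of one sentence
            n, _ = x.shape
            # Multi-dimensional alignment: scores[i, j, k] says how strongly
            # feature k of token i attends to token j -- a vector, not a scalar.
            scores = torch.tanh(self.w_query(x).unsqueeze(1) +
                                self.w_key(x).unsqueeze(0))      # (n, n, dim)
            # Forward directional mask: token i attends only to positions
            # j <= i; the backward direction would use the mirrored mask.
            idx = torch.arange(n)
            future = idx.unsqueeze(1) < idx.unsqueeze(0)         # True for j > i
            scores = scores.masked_fill(future.unsqueeze(-1), float("-inf"))
            attn = torch.softmax(scores, dim=1)   # normalize over source tokens
            context = (attn * x.unsqueeze(0)).sum(dim=1)         # (n, dim)
            # Fusion gate (cf. LSTM gating): per feature, a sigmoid gate decides
            # how much of the original token vs. the context flows onward.
            gate = torch.sigmoid(self.w_gate(torch.cat([x, context], dim=-1)))
            return gate * x + (1 - gate) * context

    # Toy usage on a 5-word sentence with 16-dimensional word features.
    layer = DirectionalSelfAttention(16)
    print(layer(torch.randn(5, 16)).shape)        # torch.Size([5, 16])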
