Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/72915
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Fengzhu Xu (en_US)
dc.contributor.author: Meijing Li (en_US)
dc.contributor.author: Keun Ho Ryu (en_US)
dc.date.accessioned: 2022-05-27T08:31:51Z
dc.date.available: 2022-05-27T08:31:51Z
dc.date.issued: 2022-01-01 (en_US)
dc.identifier.issn: 18761119 (en_US)
dc.identifier.issn: 18761100 (en_US)
dc.identifier.other: 2-s2.0-85123293152 (en_US)
dc.identifier.other: 10.1007/978-981-16-8430-2_18 (en_US)
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85123293152&origin=inward (en_US)
dc.identifier.uri: http://cmuir.cmu.ac.th/jspui/handle/6653943832/72915
dc.description.abstract: In distantly supervised relation extraction, the piecewise convolutional neural network (PCNN) is commonly used to capture local sentence features and has achieved good results. However, existing PCNN-based methods cannot capture long-distance dependencies within sentences, nor can they distinguish the influence of the three PCNN segments on relation classification, so vital information may be missed. To address these two issues, we propose a new model with multi-head self-attention and a gate mechanism for distantly supervised relation extraction. First, to capture long-distance dependencies within sentences, we add an internal multi-head self-attention layer to the PCNN, which gathers information from different representation subspaces. Second, to distinguish the influence of the three segments of the piecewise max-pooling output on relation classification, a gate mechanism assigns a different weight to each segment and highlights the important ones. Experiments show that the proposed model outperforms previous approaches on every evaluation criterion. (en_US)
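The gating idea described in the abstract, weighting the three segment vectors that piecewise max-pooling produces before combining them, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimension `d`, the scoring matrix `W`, and the choice of softmax-normalized per-segment scores are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

# Piecewise max-pooling in a PCNN splits each sentence (relative to the two
# entity mentions) into three segments and max-pools each one separately,
# yielding three feature vectors of dimension d.
d = 8
segments = [rng.standard_normal(d) for _ in range(3)]  # stand-ins for s1, s2, s3

# Gate mechanism (illustrative): score each segment from its own features,
# normalize the scores into weights, and take the weighted sum so that the
# more informative segments dominate the sentence representation.
W = rng.standard_normal((3, d))  # one scoring row per segment (hypothetical shape)
scores = np.array([W[i] @ segments[i] for i in range(3)])
weights = softmax(scores)        # three non-negative weights summing to 1
sentence_repr = sum(w * s for w, s in zip(weights, segments))

print(weights)
print(sentence_repr.shape)
```

A learned gate like this replaces the plain concatenation of the three pooled segments that a standard PCNN uses, which is what lets the model down-weight uninformative segments.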
dc.subject: Engineering (en_US)
dc.title: Distantly Supervised Relation Extraction Based on Multi-head Self-attention and Gate Mechanism (en_US)
dc.type: Book Series (en_US)
article.title.sourcetitle: Lecture Notes in Electrical Engineering (en_US)
article.volume: 833 LNEE (en_US)
article.stream.affiliations: Ton-Duc-Thang University (en_US)
article.stream.affiliations: Shanghai Maritime University (en_US)
article.stream.affiliations: Chungbuk National University (en_US)
article.stream.affiliations: Chiang Mai University (en_US)
Appears in Collections: CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.