Span-based relational graph transformer network for aspect–opinion pair extraction

Bibliographic Details
Published in: Knowledge and Information Systems, Vol. 64, No. 5, pp. 1305–1322
Main authors: Li, You; Wang, Chaoqiang; Lin, Yuming; Lin, Yongdong; Chang, Liang
Format: Journal Article
Language: English
Published: London: Springer London; Springer Nature B.V., 01.05.2022
ISSN: 0219-1377, 0219-3116
Online access: Full text
Description
Abstract: Aspect extraction and opinion extraction are two fundamental subtasks in aspect-based sentiment analysis. Many methods extract aspect terms or opinion terms but ignore the relationships between them. However, such relationships are crucial for downstream tasks, such as sentiment classification and commodity recommendation. Recently, methods have been proposed to extract both terms jointly; however, they fail to extract them as pairs. In this paper, we explore the aspect–opinion pair extraction task, which aims to extract aspect and opinion terms in pairs. To carry out this task, we propose a span-based relational graph transformer network that consists of a span generator, a span classifier, and a relation detector. The span generator enumerates all possible spans as candidates for aspect and opinion terms, the span classifier filters out non-aspect and non-opinion terms, and the relation detector extracts aspect–opinion pairs. We propose a relational graph convolutional network to capture the dependency relationships between aspect and opinion terms. Extensive experiments show that the proposed model achieves state-of-the-art performance on four benchmark datasets.
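To make the pipeline described in the abstract concrete, the following is a minimal, illustrative sketch in PyTorch of the span-based stages it names: enumerating candidate spans, classifying each span as aspect, opinion, or neither, and scoring candidate aspect–opinion pairs. This is not the authors' implementation; the mean-pooled span representation, the label set, the pair scorer, and all dimensions are assumptions made for illustration only, and the relational graph convolution over dependency relations is omitted.

# Minimal sketch (not the authors' code) of a span-based aspect-opinion
# pair extractor: enumerate candidate spans, classify them as
# aspect / opinion / none, then score aspect-opinion span pairs.
# Module names, dimensions, and the mean-pooling span encoder are
# assumptions for illustration; the relational GCN step is omitted.
import torch
import torch.nn as nn


def enumerate_spans(seq_len: int, max_span_len: int = 4):
    """Return all (start, end) spans (inclusive) up to max_span_len tokens."""
    spans = []
    for start in range(seq_len):
        for end in range(start, min(start + max_span_len, seq_len)):
            spans.append((start, end))
    return spans


class SpanPairExtractor(nn.Module):
    """Toy span classifier plus pair scorer over contextual token embeddings."""

    def __init__(self, hidden_dim: int = 64, num_span_labels: int = 3):
        super().__init__()
        # Span labels: 0 = none, 1 = aspect, 2 = opinion (an assumption).
        self.span_classifier = nn.Linear(hidden_dim, num_span_labels)
        # Pair scorer sees concatenated aspect and opinion span vectors.
        self.pair_scorer = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def span_repr(self, token_emb: torch.Tensor, spans):
        # Mean-pool token embeddings inside each span (simplest choice).
        return torch.stack(
            [token_emb[s:e + 1].mean(dim=0) for s, e in spans]
        )

    def forward(self, token_emb: torch.Tensor, spans):
        span_vecs = self.span_repr(token_emb, spans)      # (num_spans, H)
        span_logits = self.span_classifier(span_vecs)     # (num_spans, 3)
        # Score every candidate (aspect, opinion) span combination.
        n = span_vecs.size(0)
        left = span_vecs.unsqueeze(1).expand(n, n, -1)
        right = span_vecs.unsqueeze(0).expand(n, n, -1)
        pair_logits = self.pair_scorer(
            torch.cat([left, right], dim=-1)
        ).squeeze(-1)                                      # (num_spans, num_spans)
        return span_logits, pair_logits


if __name__ == "__main__":
    tokens = 8
    token_emb = torch.randn(tokens, 64)   # stand-in for an encoder's output
    spans = enumerate_spans(tokens, max_span_len=3)
    model = SpanPairExtractor()
    span_logits, pair_logits = model(token_emb, spans)
    print(len(spans), span_logits.shape, pair_logits.shape)

In a full system, spans whose classifier label is aspect or opinion would be kept, and high-scoring (aspect, opinion) pairs in pair_logits would be emitted as extracted aspect–opinion pairs.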
DOI: 10.1007/s10115-022-01675-8