Aspect-Based Sentiment Analysis Using Local Context Focus Mechanism with DeBERTa
| Published in: | 2023 5th International Conference on Data-driven Optimization of Complex Systems (DOCS), pp. 1 - 6 |
|---|---|
| Main Authors: | , , , |
| Format: | Conference Paper |
| Language: | English |
| Published: | IEEE, 22.09.2023 |
| Subjects: | |
| Online Access: | Get full text |
| Summary: | Text sentiment analysis, often termed opinion mining, quantifies individuals' opinions, evaluations, attitudes, and emotions conveyed about entities. Sentiment analysis of text can be categorized into text-level, sentence-level, and aspect-level analyses. Aspect-Based Sentiment Analysis (ABSA) is a fine-grained sub-discipline of sentiment analysis whose primary goal is to ascertain the sentiment polarity of specific aspects. Research on pre-trained neural models has significantly improved the performance of many natural language processing tasks, and in recent years pre-trained models (PTMs) have been applied to ABSA. This raises the question of whether PTMs contain sufficient syntactic information for ABSA. In this paper, we explore the recent DeBERTa model (Decoding-enhanced BERT with disentangled attention) to solve the Aspect-Based Sentiment Analysis problem. DeBERTa is a Transformer-based neural language model that uses self-supervised learning to pre-train on large corpora of raw text. Based on the Local Context Focus (LCF) mechanism and integrating the DeBERTa model, we propose a multi-task learning model for aspect-based sentiment analysis. Experimental results on the commonly used laptop and restaurant datasets of SemEval-2014 and the ACL Twitter dataset show that the LCF mechanism with DeBERTa achieves significant improvement. |
|---|---|
| DOI: | 10.1109/DOCS60977.2023.10294548 |
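
The summary describes the architecture only at a high level: DeBERTa token representations combined through a Local Context Focus mechanism to classify aspect sentiment polarity. As an illustration of that general idea, and not the authors' exact model, the following is a minimal sketch of an LCF-style classifier over DeBERTa using the Hugging Face `transformers` library; the checkpoint name, the semantic-relative-distance threshold `srd`, the context-dynamic-mask pooling, and all class and parameter names are assumptions made for this sketch.

```python
# Minimal sketch of a Local Context Focus (LCF)-style ABSA classifier on top of
# DeBERTa, assuming the Hugging Face `transformers` library. The context-dynamic
# mask around the aspect span, the `srd` threshold, and all names are assumptions
# made for illustration; this is not the authors' exact architecture.
import torch
import torch.nn as nn
from transformers import AutoModel


class LCFDebertaSketch(nn.Module):
    def __init__(self, model_name="microsoft/deberta-base", srd=3, num_polarities=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.srd = srd  # semantic-relative-distance threshold (assumed value)
        self.classifier = nn.Linear(hidden * 2, num_polarities)

    def forward(self, input_ids, attention_mask, aspect_start, aspect_end):
        # Global context: full-sentence DeBERTa token representations.
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        seq_len = hidden_states.size(1)
        pad_mask = attention_mask.unsqueeze(-1).float()

        # Local context: keep only tokens within `srd` positions of the aspect
        # span (a context-dynamic-mask variant of LCF), zero out the rest.
        positions = torch.arange(seq_len, device=input_ids.device).unsqueeze(0)
        dist = torch.where(
            positions < aspect_start.unsqueeze(1),
            aspect_start.unsqueeze(1) - positions,
            torch.clamp(positions - aspect_end.unsqueeze(1), min=0),
        )
        local_mask = (dist <= self.srd).float().unsqueeze(-1) * pad_mask

        # Mean-pool the global and local views, concatenate, classify polarity.
        global_pooled = (hidden_states * pad_mask).sum(1) / pad_mask.sum(1).clamp(min=1)
        local_pooled = (hidden_states * local_mask).sum(1) / local_mask.sum(1).clamp(min=1)
        return self.classifier(torch.cat([global_pooled, local_pooled], dim=-1))
```

The abstract mentions a multi-task learning setup; for brevity this sketch keeps a single polarity-classification head over the concatenated global and local representations.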