A hybrid model utilizing transfer learning for legal citation linking


Bibliographic Details
Published in: International journal of information technology (Singapore. Online), Vol. 15, No. 5, pp. 2783-2792
Authors: Sheik, Reshma; Parida, Swati Sampada; Nirmala, S. Jaya
Format: Journal Article
Language: English
Published: Singapore: Springer Nature Singapore, 01.06.2023
Springer Nature B.V
ISSN: 2511-2104, 2511-2112
Online access: Full text
Description
Abstract: The advent of transfer learning and its applications in Natural Language Processing (NLP) has opened the path for legal text processing tasks. Identifying and citing the relevant laws and forums for a legal document is challenging for lawyers and other legal professionals. The main aim of this work is to link paragraphs from US Supreme Court cases to sections of the US Constitution. Linking amendments or relevant statutes to a legal text offers enormous opportunities in assistive legal writing. The paper proposes a neural network architecture that exploits transfer learning, using the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model together with a Bidirectional Gated Recurrent Unit (BiGRU) to effectively capture long-term dependencies. Various hybrid models were then implemented, combining a linear classifier, a naive rule-based classifier, and the neural network model. Experiments on the existing dataset demonstrate that the proposed hybrid models learn contextual information well and produce the best overall results, with a 12% increase in F1 score over state-of-the-art baseline models.
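The hybrid strategy described in the abstract (a naive rule-based classifier backed by a learned model) can be sketched as follows. This is an illustrative sketch, not the authors' code: the section names, keyword sets, and the `neural_scores` stand-in (a crude keyword-overlap scorer in place of the BERT + BiGRU model) are all hypothetical, chosen only to show how explicit rule matches can take precedence over model scores.

```python
# Illustrative sketch of a rule-based + model-score hybrid linker.
# NOTE: neural_scores() is a keyword-overlap stand-in for the paper's
# BERT + BiGRU classifier; sections and keywords are invented examples.
import re

SECTIONS = ["First Amendment", "Fourth Amendment", "Fourteenth Amendment"]

# Hypothetical cue words per section, standing in for learned knowledge.
KEYWORDS = {
    "First Amendment": {"speech", "press", "religion"},
    "Fourth Amendment": {"search", "seizure", "warrant"},
    "Fourteenth Amendment": {"equal", "protection", "due", "process"},
}

def rule_based_link(paragraph: str) -> list[str]:
    """Naive rule: return sections explicitly named in the paragraph."""
    return [s for s in SECTIONS if re.search(s, paragraph, re.IGNORECASE)]

def neural_scores(paragraph: str) -> dict[str, int]:
    """Stand-in for the BERT + BiGRU model: score by keyword overlap."""
    words = set(re.findall(r"[a-z]+", paragraph.lower()))
    return {s: len(words & kw) for s, kw in KEYWORDS.items()}

def hybrid_link(paragraph: str) -> str:
    """Hybrid: trust an explicit mention if present; otherwise fall back
    to the highest-scoring section from the (stand-in) model."""
    hits = rule_based_link(paragraph)
    if hits:
        return hits[0]
    scores = neural_scores(paragraph)
    return max(scores, key=scores.get)

print(hybrid_link("The warrantless search and seizure violated his rights."))
# -> Fourth Amendment (no explicit mention, so the scorer decides)
```

The design point is the fallback order: an explicit citation in the text is unambiguous and overrides the model, while ambiguous paragraphs are left to the learned component.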
DOI: 10.1007/s41870-023-01323-6