Interactive Attention Model Explorer for Natural Language Processing Tasks with Unbalanced Data Sizes

Bibliographic Details
Published in: IEEE Pacific Visualization Symposium, pp. 46-50
Main authors: Dong, Zhihang; Wu, Tongshuang; Song, Sicheng; Zhang, Mingrui
Format: Conference paper
Language: English
Published: IEEE, 01.06.2020
ISSN: 2165-8773
Online access: Full text
Description
Abstract: Conventional attention visualization tools compromise either the readability or the information conveyed when documents are lengthy, especially when these documents have imbalanced sizes. Our work strives toward a more intuitive visualization for a subset of Natural Language Processing tasks, where attention is mapped between documents with imbalanced sizes. We extend the flow map visualization to enhance the readability of the attention-augmented documents. Through interaction, our design enables semantic filtering that helps users prioritize important tokens and meaningful matching for an in-depth exploration. Case studies and informal user studies in machine comprehension prove that our visualization effectively helps users gain initial understandings about what their models are "paying attention to." We discuss how the work can be extended to other domains, as well as being plugged into more end-to-end systems for model error analysis.
DOI: 10.1109/PacificVis48177.2020.1031
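
To make the abstract's setting concrete, below is a minimal, self-contained sketch (not the authors' implementation) of the kind of preprocessing such a tool needs: a short question attended over a much longer passage, reduced to the few strongest links per question token, which is the sparse edge set a flow-map-style view would draw. The token lists, the random stand-in attention matrix, and the top_k_links helper are all illustrative assumptions; in practice the attention weights would come from the model under inspection.

import numpy as np

# Hypothetical inputs: a short question and a much longer passage,
# illustrating the "imbalanced sizes" setting the paper targets.
question = ["which", "team", "won", "the", "cup", "?"]
passage = ["the", "cup", "final", "was", "played", "in", "july", "and",
           "the", "home", "team", "won", "after", "extra", "time", "."]

rng = np.random.default_rng(0)
# Stand-in attention matrix (question tokens x passage tokens), normalized
# so each question token's weights over the passage sum to 1.
attention = rng.random((len(question), len(passage)))
attention /= attention.sum(axis=1, keepdims=True)

def top_k_links(att, k=3):
    """Keep only the k strongest passage tokens per question token,
    yielding the sparse edges a flow-map-style view would render."""
    links = []
    for i, row in enumerate(att):
        for j in np.argsort(row)[::-1][:k]:
            links.append((i, int(j), float(row[j])))
    return links

for qi, pj, w in top_k_links(attention):
    print(f"{question[qi]:>6} -> {passage[pj]:<8} weight={w:.2f}")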