Interactive Attention Model Explorer for Natural Language Processing Tasks with Unbalanced Data Sizes

Detailed bibliography
Published in: IEEE Pacific Visualization Symposium, pp. 46-50
Main authors: Dong, Zhihang; Wu, Tongshuang; Song, Sicheng; Zhang, Mingrui
Format: Conference paper
Language: English
Published: IEEE, 01.06.2020
ISSN:2165-8773
Description
Summary: Conventional attention visualization tools compromise either the readability or the information conveyed when documents are lengthy, especially when these documents have imbalanced sizes. Our work strives toward a more intuitive visualization for a subset of Natural Language Processing tasks, where attention is mapped between documents with imbalanced sizes. We extend the flow map visualization to enhance the readability of the attention-augmented documents. Through interaction, our design enables semantic filtering that helps users prioritize important tokens and meaningful matching for an in-depth exploration. Case studies and informal user studies in machine comprehension show that our visualization effectively helps users gain an initial understanding of what their models are "paying attention to." We discuss how the work can be extended to other domains, as well as plugged into more end-to-end systems for model error analysis.
DOI:10.1109/PacificVis48177.2020.1031
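
Note: the record above only summarizes the approach, and the paper's own implementation is not reproduced here. As a rough illustration of the kind of data the visualization operates on, the sketch below builds a toy question-to-passage attention matrix for two documents of very different lengths and keeps only the strongest links, loosely mirroring the "semantic filtering" of important tokens described in the summary. Everything in it (the top_links helper, the threshold value, the random weights) is a hypothetical stand-in, not the authors' method.

    # Toy sketch: filter a question->passage attention matrix to its strongest links.
    # The weights are random placeholders, not output from any real model.
    import numpy as np

    rng = np.random.default_rng(0)
    question = ["which", "city", "hosted", "the", "games"]          # short document
    passage = ["the", "2008", "olympic", "games", "were", "hosted",
               "by", "beijing", "in", "china", "during", "august"]  # longer document

    # Hypothetical attention weights: rows = question tokens, cols = passage tokens.
    attention = rng.random((len(question), len(passage)))
    attention /= attention.sum(axis=1, keepdims=True)               # row-normalize

    def top_links(attn, threshold=0.15):
        """Keep only question->passage links whose weight passes the threshold."""
        links = []
        for qi, row in enumerate(attn):
            for pi, w in enumerate(row):
                if w >= threshold:
                    links.append((question[qi], passage[pi], round(float(w), 3)))
        return sorted(links, key=lambda t: -t[2])

    # Print the surviving links, strongest first; a flow-map view would draw
    # one curve per remaining (question token, passage token) pair.
    for q_tok, p_tok, w in top_links(attention):
        print(f"{q_tok:>8} -> {p_tok:<8} {w}")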