Visual Boosting in Pixel-based Visualizations

Bibliographic Details
Published in: Computer Graphics Forum, Volume 30, Issue 3, pp. 871-880
Main Authors: Oelke, Daniela; Janetzko, Halldor; Simon, Svenja; Neuhaus, Klaus; Keim, Daniel A.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd, 01.06.2011
ISSN: 0167-7055, 1467-8659
Description
Summary: Pixel-based visualizations have become popular because they are capable of displaying large amounts of data while at the same time providing many details. However, pixel-based visualizations are only effective if the data set is not sparse and the data distribution is not random. Single pixels – no matter whether they lie in an empty area or in the middle of a large area of differently colored pixels – are perceptually difficult to discern and may therefore easily be missed. Furthermore, trends and interesting passages may be camouflaged in the sea of details. In this paper we compare different approaches for visual boosting in pixel-based visualizations. Several boosting techniques, such as halos, background coloring, distortion, and hatching, are discussed and assessed with respect to their effectiveness in boosting single pixels, trends, and interesting passages. Application examples from three different domains (document analysis, genome analysis, and geospatial analysis) show the general applicability of the techniques and the derived guidelines.
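One of the boosting techniques named in the abstract, halos, can be illustrated as drawing a contrasting ring around each marked pixel so that isolated pixels stand out from their surroundings. The following NumPy sketch is a hypothetical minimal illustration of that idea, not the authors' implementation; the function name and parameters are assumptions.

```python
import numpy as np

def add_halos(img, marked, halo_value=0.5, radius=1):
    """Surround each marked pixel with a ring of contrasting value.

    img:    2D float array of pixel values
    marked: 2D boolean array flagging the pixels to boost
    Returns a copy of img with halo rings painted around marked pixels;
    marked pixels themselves are left unchanged.
    """
    out = img.copy()
    h, w = img.shape
    ys, xs = np.nonzero(marked)
    for y, x in zip(ys, xs):
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                if dy == 0 and dx == 0:
                    continue  # keep the marked pixel itself
                ny, nx = y + dy, x + dx
                # paint the halo only inside the image and over unmarked pixels
                if 0 <= ny < h and 0 <= nx < w and not marked[ny, nx]:
                    out[ny, nx] = halo_value
    return out
```

A contrasting halo color (here a fixed gray value) is one design choice; the paper compares such choices against alternatives like background coloring and hatching.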
DOI: 10.1111/j.1467-8659.2011.01936.x