Visual Boosting in Pixel-based Visualizations


Bibliographic Details
Published in: Computer Graphics Forum, Vol. 30, No. 3, pp. 871–880
Main Authors: Oelke, Daniela, Janetzko, Halldor, Simon, Svenja, Neuhaus, Klaus, Keim, Daniel A.
Format: Journal Article
Language:English
Published: Oxford, UK: Blackwell Publishing Ltd, 1 June 2011
ISSN:0167-7055, 1467-8659
Description
Summary: Pixel‐based visualizations have become popular because they can display large amounts of data while providing many details. However, pixel‐based visualizations are effective only if the data set is not sparse and the data distribution is not random. Single pixels – whether in an empty area or in the middle of a large area of differently colored pixels – are perceptually difficult to discern and may therefore easily be missed. Furthermore, trends and interesting passages may be camouflaged in the sea of details. In this paper we compare different approaches to visual boosting in pixel‐based visualizations. Several boosting techniques, such as halos, background coloring, distortion, and hatching, are discussed and assessed with respect to their effectiveness in boosting single pixels, trends, and interesting passages. Application examples from three domains (document analysis, genome analysis, and geospatial analysis) show the general applicability of the techniques and the derived guidelines.
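The paper's own implementations are not reproduced in this record. As an illustration of one of the boosting techniques named in the abstract – halos around single pixels – the following is a minimal NumPy sketch. The function name `add_halos` and its parameters are hypothetical, not taken from the paper; it simply brightens a one-pixel ring around each marked pixel so an isolated pixel stands out from its surroundings.

```python
import numpy as np

def add_halos(image, mask, halo_value=1.0, radius=1):
    """Draw a halo ring of `halo_value` around each pixel marked True in `mask`.

    `image` is a 2-D grayscale array; marked pixels themselves are left unchanged.
    (Illustrative sketch only; names and defaults are assumptions.)
    """
    h, w = mask.shape
    boosted = image.copy()
    halo = np.zeros_like(mask, dtype=bool)
    # Mark a (2*radius+1)-square neighborhood around every flagged pixel.
    for y, x in zip(*np.nonzero(mask)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        halo[y0:y1, x0:x1] = True
    halo &= ~mask            # halo surrounds, but does not overwrite, the pixel
    boosted[halo] = halo_value
    return boosted

# Demo: one isolated pixel in a uniform 5x5 field gets an 8-pixel halo.
img = np.full((5, 5), 0.5)
mark = np.zeros((5, 5), dtype=bool)
mark[2, 2] = True
out = add_halos(img, mark)
```

In this sketch the halo is a flat ring; the paper compares halos against alternatives such as background coloring and hatching, which would modify the surrounding area differently.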
Bibliography:ark:/67375/WNG-6KWCCMLV-S
istex:B45AB28DA3F203049E8DC982409FBEE38A701539
ArticleID:CGF1936
DOI:10.1111/j.1467-8659.2011.01936.x