Modeling Luminance Perception at Absolute Threshold

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 34, No. 4, pp. 155-164
Main Authors: Kellnhofer, Petr, Ritschel, Tobias, Myszkowski, Karol, Eisemann, Elmar, Seidel, Hans-Peter
Format: Journal Article
Language:English
Published: Oxford: Blackwell Publishing Ltd, 01.07.2015
ISSN:0167-7055, 1467-8659
Description
Summary:When human luminance perception operates close to its absolute threshold, i.e., the lowest perceivable absolute values, appearance changes substantially compared to common photopic or scotopic vision. In particular, most observers report perceiving temporally‐varying noise. Two causes are physiologically plausible: quantum noise (due to the low absolute number of photons) and spontaneous photochemical reactions. Previously, static noise with a normal distribution and no dependence on absolute luminance was combined with a blue hue shift and blur to simulate scotopic appearance on a photopic display for movies and interactive applications (e.g., games). We present a computational model that reproduces the specific distribution and dynamics of “scotopic noise” for specific absolute luminance values. It automatically introduces a perceptually‐calibrated amount of noise for a given luminance level and supports animated imagery. Our simulation runs in milliseconds at HD resolution using graphics hardware and compares favorably to simpler alternatives in a perceptual experiment.
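The abstract contrasts quantum noise, whose magnitude depends on the absolute photon count, with the luminance-independent Gaussian noise of earlier work. A minimal sketch of that distinction, not the paper's calibrated model: it assumes a hypothetical `photons_per_unit` scale mapping relative luminance to an expected per-pixel photon count and draws Poisson-distributed arrivals, so darker pixels get a lower signal-to-noise ratio.

```python
import numpy as np

def simulate_photon_noise(image, photons_per_unit=25.0, seed=None):
    """Add quantum (photon-count) noise to a low-luminance image.

    `image` holds relative luminance in [0, 1]; `photons_per_unit` is a
    hypothetical scale (not from the paper) mapping luminance 1.0 to an
    expected photon count per pixel and frame.
    """
    rng = np.random.default_rng(seed)
    expected = np.clip(image, 0.0, None) * photons_per_unit
    # Poisson arrivals: variance equals the mean, so relative noise
    # grows as luminance falls toward the absolute threshold.
    counts = rng.poisson(expected)
    return counts / photons_per_unit

frame = np.full((64, 64), 0.1)            # dim, uniform field
noisy = simulate_photon_noise(frame, seed=0)
```

Re-sampling `counts` every frame yields the temporally varying character the abstract describes, unlike a static Gaussian noise layer.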
Bibliography:ark:/67375/WNG-W09S0Q4N-G
ArticleID:CGF12687
DOI:10.1111/cgf.12687