ManyLoDs: Parallel Many-View Level-of-Detail Selection for Real-Time Global Illumination

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 30, No. 4, pp. 1233-1240
Main Authors: Hollander, Matthias; Ritschel, Tobias; Eisemann, Elmar; Boubekeur, Tamy
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd, 01.06.2011
ISSN: 0167-7055, 1467-8659
Description
Summary: Level‐of‐Detail structures are a key component for scalable rendering. Built from raw 3D data, these structures are often defined as Bounding Volume Hierarchies, providing coarse‐to‐fine adaptive approximations that are well‐adapted for many‐view rasterization. Here, the total number of pixels in each view is usually low, while the cost of choosing the appropriate LoD for each view is high. This task represents a challenge for existing GPU algorithms. We propose ManyLoDs, a new GPU algorithm to efficiently compute many LoDs from a Bounding Volume Hierarchy in parallel by balancing the workload within and among LoDs. Our approach is not specific to a particular rendering technique, can be used on lazy representations such as polygon soups, and can handle dynamic scenes. We apply our method to various many‐view rasterization applications, including Instant Radiosity, Point‐Based Global Illumination, and reflection/refraction mapping. For each of these, we achieve real‐time performance in complex scenes at high resolutions.
DOI: 10.1111/j.1467-8659.2011.01982.x
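
Note: The summary above describes selecting a per-view cut through a Bounding Volume Hierarchy in parallel on the GPU. The CUDA kernel below is a minimal, hypothetical sketch of one such refinement pass, not the authors' implementation: every struct and function name (BvhNode, View, WorkItem, refineCut) and the projected-size criterion are assumptions made for illustration only.

// Hypothetical sketch (not the paper's code): one pass of parallel per-view
// BVH cut refinement. Each thread handles one (view, node) pair; a node whose
// bounding sphere projects to roughly less than a pixel becomes the LoD for
// that view, while a larger node enqueues its children for the next pass.
#include <cuda_runtime.h>

struct BvhNode {
    float cx, cy, cz;   // bounding-sphere center
    float radius;       // bounding-sphere radius
    int firstChild;     // index of the first child node, -1 for a leaf
    int childCount;     // number of children
};

struct View {
    float ox, oy, oz;        // view origin
    float pixelSolidAngle;   // approximate solid angle covered by one pixel
};

struct WorkItem { int view; int node; };

__global__ void refineCut(const BvhNode* nodes, const View* views,
                          const WorkItem* inQueue, int inCount,
                          WorkItem* outQueue, int* outCount,
                          WorkItem* cut, int* cutCount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= inCount) return;

    WorkItem w = inQueue[i];
    BvhNode n = nodes[w.node];
    View v = views[w.view];

    float dx = n.cx - v.ox, dy = n.cy - v.oy, dz = n.cz - v.oz;
    float dist2 = dx * dx + dy * dy + dz * dz;

    // Crude projected solid angle of the node's bounding sphere.
    float solidAngle = 3.14159265f * n.radius * n.radius / fmaxf(dist2, 1e-6f);

    if (n.firstChild < 0 || solidAngle < v.pixelSolidAngle) {
        // Fine enough (or a leaf): emit this node as the LoD for this view.
        cut[atomicAdd(cutCount, 1)] = w;
    } else {
        // Too coarse for this view: schedule all children for the next pass.
        int base = atomicAdd(outCount, n.childCount);
        for (int c = 0; c < n.childCount; ++c) {
            WorkItem child; child.view = w.view; child.node = n.firstChild + c;
            outQueue[base + c] = child;
        }
    }
}

On the host, such a kernel would be launched repeatedly with the input and output queues swapped until no work items remain. The paper's actual contribution, balancing the workload within and among LoDs, is not captured by this naive breadth-first expansion.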