ManyLoDs: Parallel Many-View Level-of-Detail Selection for Real-Time Global Illumination

Detailed bibliography
Published in: Computer Graphics Forum, Volume 30, Issue 4, pp. 1233–1240
Main authors: Hollander, Matthias; Ritschel, Tobias; Eisemann, Elmar; Boubekeur, Tamy
Medium: Journal Article
Language: English
Publication details: Oxford, UK: Blackwell Publishing Ltd, 01.06.2011
ISSN: 0167-7055, 1467-8659
Description
Summary: Level‐of‐Detail structures are a key component for scalable rendering. Built from raw 3D data, these structures are often defined as Bounding Volume Hierarchies, providing coarse‐to‐fine adaptive approximations that are well‐adapted for many‐view rasterization. Here, the total number of pixels in each view is usually low, while the cost of choosing the appropriate LoD for each view is high. This task represents a challenge for existing GPU algorithms. We propose ManyLoDs, a new GPU algorithm to efficiently compute many LoDs from a Bounding Volume Hierarchy in parallel by balancing the workload within and among LoDs. Our approach is not specific to a particular rendering technique, can be used on lazy representations such as polygon soups, and can handle dynamic scenes. We apply our method to various many‐view rasterization applications, including Instant Radiosity, Point‐Based Global Illumination, and reflection/refraction mapping. For each of these, we achieve real‐time performance in complex scenes at high resolutions.
DOI: 10.1111/j.1467-8659.2011.01982.x
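
The core idea in the summary, computing one LoD cut per view in parallel over a shared Bounding Volume Hierarchy, can be illustrated with a minimal CUDA sketch. This is not the authors' implementation: the Node and View layouts, the pixel-coverage estimate (bounding-sphere radius over distance), the one-pixel refinement threshold, and the fixed queue capacity are all illustrative assumptions. Each breadth-first round drains a single queue of (view, node) work items; a node that projects to at most about one pixel (or is a leaf) is emitted into its view's cut, otherwise its children are pushed for the next round.

// Minimal CUDA sketch of per-view LoD cut selection over a BVH, in the spirit
// of the abstract. Node/View layout, the coverage estimate, and queue sizes
// are illustrative assumptions, not the paper's actual implementation.
#include <cstdio>
#include <cuda_runtime.h>

struct Node { float3 center; float radius; int left, right; }; // left < 0 => leaf
struct View { float3 pos; float pixelsPerUnit; };              // simplified camera
struct WorkItem { int view, node; };                           // one cut candidate

// Crude screen-coverage proxy: bounding-sphere radius over distance, in pixels.
__device__ float projectedPixels(Node n, View v) {
    float dx = n.center.x - v.pos.x, dy = n.center.y - v.pos.y, dz = n.center.z - v.pos.z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz) + 1e-6f;
    return v.pixelsPerUnit * n.radius / dist;
}

// One breadth-first round: each thread takes one (view, node) pair and either
// emits the node into that view's cut or pushes both children for the next round.
__global__ void expandCuts(const Node* bvh, const View* views,
                           const WorkItem* in, int inCount,
                           WorkItem* out, int* outCount,
                           WorkItem* cut, int* cutCount) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= inCount) return;
    WorkItem w = in[i];
    Node n = bvh[w.node];
    if (n.left < 0 || projectedPixels(n, views[w.view]) <= 1.0f) {
        cut[atomicAdd(cutCount, 1)] = w;            // fine enough: part of the LoD cut
    } else {
        int base = atomicAdd(outCount, 2);          // too coarse: refine into children
        out[base]     = WorkItem{ w.view, n.left  };
        out[base + 1] = WorkItem{ w.view, n.right };
    }
}

int main() {
    // Tiny hand-built BVH (root + two leaves) and two views at different distances.
    Node h_bvh[] = { { {0,0,0}, 2.f, 1, 2 }, { {-1,0,0}, 1.f, -1, -1 }, { {1,0,0}, 1.f, -1, -1 } };
    View h_views[] = { { {0,0,4}, 8.f }, { {0,0,40}, 8.f } }; // near view refines, far view keeps root
    WorkItem h_seed[] = { {0, 0}, {1, 0} };                   // every view starts at the root
    const int CAP = 64;

    Node* d_bvh; View* d_views; WorkItem *d_in, *d_out, *d_cut; int *d_outCount, *d_cutCount;
    cudaMalloc(&d_bvh, sizeof h_bvh);     cudaMemcpy(d_bvh, h_bvh, sizeof h_bvh, cudaMemcpyHostToDevice);
    cudaMalloc(&d_views, sizeof h_views); cudaMemcpy(d_views, h_views, sizeof h_views, cudaMemcpyHostToDevice);
    cudaMalloc(&d_in,  CAP * sizeof(WorkItem));
    cudaMalloc(&d_out, CAP * sizeof(WorkItem));
    cudaMalloc(&d_cut, CAP * sizeof(WorkItem));
    cudaMalloc(&d_outCount, sizeof(int)); cudaMalloc(&d_cutCount, sizeof(int));
    cudaMemcpy(d_in, h_seed, sizeof h_seed, cudaMemcpyHostToDevice);
    cudaMemset(d_cutCount, 0, sizeof(int));

    int inCount = 2;
    while (inCount > 0) {                           // rounds until no node needs refining
        cudaMemset(d_outCount, 0, sizeof(int));
        expandCuts<<<(inCount + 63) / 64, 64>>>(d_bvh, d_views, d_in, inCount,
                                                d_out, d_outCount, d_cut, d_cutCount);
        cudaMemcpy(&inCount, d_outCount, sizeof(int), cudaMemcpyDeviceToHost);
        WorkItem* t = d_in; d_in = d_out; d_out = t; // ping-pong the work queues
    }

    int cutCount = 0;
    cudaMemcpy(&cutCount, d_cutCount, sizeof(int), cudaMemcpyDeviceToHost);
    WorkItem h_cut[CAP];
    cudaMemcpy(h_cut, d_cut, cutCount * sizeof(WorkItem), cudaMemcpyDeviceToHost);
    for (int i = 0; i < cutCount; ++i)
        printf("view %d -> LoD cut node %d\n", h_cut[i].view, h_cut[i].node);
    return 0;
}

Compiled with nvcc, this toy example emits the root node as the cut for the far view and the two leaves for the near view. Keeping work items from all views interleaved in one queue is one simple way to spread GPU load among views, in the spirit of the abstract's balancing of workload within and among LoDs; the paper's actual balancing strategy may differ.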