Fast Photo-Realistic Rendering of Trees in Daylight


Bibliographic Details
Published in: Computer Graphics Forum, Vol. 22, No. 3, pp. 243-252
Main Authors: Qin, Xueying, Nakamae, Eihachiro, Tadamura, Katsumi, Nagai, Yasuo
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing, 01.09.2003
ISSN: 0167-7055, 1467-8659
Description
Summary: We propose a fast approach for photo-realistic rendering of trees under various kinds of daylight, which is particularly useful for the environmental assessment of landscapes. In our approach the 3D tree models are transformed to a quasi-3D tree database registering geometrical and shading information of tree surfaces, i.e. their normal vectors, relative depth, and shadowing of direct sunlight and skylight, by using a combination of 2D buffers. Thus the rendering speed of quasi-3D trees depends on their display sizes only, regardless of the complexity of their original 3D tree models. By utilizing a two-step shadowing algorithm, our proposed method can create high-quality forest scenes illuminated by both sunlight and skylight at a low cost. It can generate both umbrae and penumbrae on a tree cast by other trees and any other objects such as buildings or clouds. Transparency, specular reflection and inter-reflection of leaves, which influence the delicate shading effects of trees, can also be simulated with verisimilitude. Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism
Bibliography: ArticleID: CGF671
istex:1F0B07D57D81E23CA68E862D5950BACD6323626B
ark:/67375/WNG-V70H4KCG-9
DOI: 10.1111/1467-8659.00671