Motion based Painterly Rendering


Detailed bibliography
Published in: Computer Graphics Forum, Volume 28, Issue 4, pp. 1207-1215
Main authors: Lee, H., Lee, C. H., Yoon, K.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd, 01.06.2009
ISSN: 0167-7055, 1467-8659
Description
Summary: Previous painterly rendering techniques normally use image gradients to decide stroke orientations. Image gradients are good at expressing object shapes, but poorly suited to expressing the flow or movement of objects. In real painting, brush strokes that correspond to the actual movement of objects allow viewers to recognize the objects' motion better and thus to gain an impression of dynamism. In this paper, we propose a novel painterly rendering algorithm that expresses dynamic objects based on their motion information. We first extract motion information (magnitude, direction, standard deviation) of a scene from a sequence of consecutive images captured from the same view. The motion directions then determine stroke orientations in regions with significant motion, while image gradients determine stroke orientations where little motion is observed. Our algorithm is useful for representing moving objects realistically and dynamically. We have applied it to rendering landscapes: the scene is segmented into dynamic and static regions, and the actual movement of dynamic objects is expressed using motion-based strokes.
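The orientation-selection step described in the summary can be sketched roughly in Python. The sketch below is an illustrative approximation only, not the authors' implementation: it assumes OpenCV's Farnebäck optical flow as the motion estimator, Sobel filters for the image gradients, and a hypothetical motion_threshold parameter for separating dynamic from static regions.

import cv2
import numpy as np

def stroke_orientations(frames, motion_threshold=1.0):
    # frames: consecutive BGR images of the same scene from a fixed view.
    # Returns per-pixel stroke orientations (radians) and a dynamic-region mask.
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]

    # Average dense optical flow over consecutive frame pairs to estimate
    # per-pixel motion direction and magnitude.
    flow_sum = np.zeros((*grays[0].shape, 2), dtype=np.float32)
    for prev, curr in zip(grays[:-1], grays[1:]):
        flow_sum += cv2.calcOpticalFlowFarneback(
            prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    flow = flow_sum / max(len(grays) - 1, 1)
    mag, motion_ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Image-gradient orientation on the last frame; strokes follow edges,
    # i.e. run perpendicular to the gradient direction.
    gx = cv2.Sobel(grays[-1], cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(grays[-1], cv2.CV_32F, 0, 1, ksize=3)
    grad_ang = np.arctan2(gy, gx) + np.pi / 2

    # Dynamic regions take their stroke orientation from the motion field;
    # static regions fall back to image gradients.
    dynamic = mag > motion_threshold
    return np.where(dynamic, motion_ang, grad_ang), dynamic

The resulting orientation field would then drive stroke placement in a conventional stroke-based renderer; the motion statistics mentioned in the summary (e.g. standard deviation) could further modulate stroke attributes, but that is beyond this sketch.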
Article ID: CGF1498
DOI: 10.1111/j.1467-8659.2009.01498.x