Shared Sampling for Real-Time Alpha Matting


Bibliographic Details
Published in: Computer Graphics Forum, Vol. 29, No. 2, pp. 575-584
Main authors: Gastal, Eduardo S. L.; Oliveira, Manuel M.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd, 01.05.2010
ISSN: 0167-7055, 1467-8659
Online access: Full text
Description
Abstract: Image matting aims at extracting foreground elements from an image by means of color and opacity (alpha) estimation. While a lot of progress has been made in recent years on improving the accuracy of matting techniques, one common problem has persisted: the low speed of matte computation. We present the first real-time matting technique for natural images and videos. Our technique is based on the observation that, for small neighborhoods, pixels tend to share similar attributes. Therefore, independently treating each pixel in the unknown regions of a trimap results in a lot of redundant work. We show how this computation can be significantly and safely reduced by means of a careful selection of pairs of background and foreground samples. Our technique achieves speedups of up to two orders of magnitude compared to previous techniques, while producing high-quality alpha mattes. The quality of our results has been verified through an independent benchmark. The speed of our technique enables, for the first time, real-time alpha matting of videos, and has the potential to enable a new class of exciting applications.
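The sampling idea described in the abstract rests on estimating a pixel's alpha from a chosen pair of foreground and background color samples. Below is a minimal sketch of that per-pixel step, assuming NumPy and the standard closed-form estimate used in sampling-based matting (projecting the pixel color onto the line between the samples); the function name and interface are illustrative, not the authors' implementation.

```python
import numpy as np

def estimate_alpha(I, F, B, eps=1e-6):
    """Closed-form alpha estimate for pixel color I given a candidate
    foreground sample F and background sample B (RGB vectors in [0, 1])."""
    I, F, B = np.asarray(I, float), np.asarray(F, float), np.asarray(B, float)
    fb = F - B
    denom = np.dot(fb, fb)
    if denom < eps:
        # F and B are nearly identical: alpha is ill-defined, pick the closer sample
        return 1.0 if np.linalg.norm(I - F) < np.linalg.norm(I - B) else 0.0
    alpha = np.dot(I - B, fb) / denom  # projection of I onto the F-B color line
    return float(np.clip(alpha, 0.0, 1.0))

# Example: a pixel halfway between a red foreground and a blue background
print(estimate_alpha([0.5, 0.0, 0.5], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]))  # ~0.5
```

In the paper's setting this estimate is computed only for a reduced set of sample pairs shared among neighboring unknown-region pixels, which is where the reported speedup comes from.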
Bibliography: ArticleID:CGF1627
ark:/67375/WNG-18ZX09D0-J
istex:D87E44145F687B3DFF21CD114FFCBC806FF82877
DOI: 10.1111/j.1467-8659.2009.01627.x