Compositing for small cameras
To achieve a realistic integration of virtual and real imagery in video see-through augmented reality, the rendered images should have a similar appearance and quality to those captured by the video camera. This paper describes a compositing method which models the artefacts produced by a small low-cost camera, and adds these effects to an ideal pinhole image produced by conventional rendering methods. We attempt to model and simulate each step of the imaging process, including distortions, chromatic aberrations, blur, Bayer masking, noise and colour-space compression, all while requiring only an RGBA image and an estimate of camera velocity as inputs.
| Published in: | 2008 7th IEEE International Symposium on Mixed and Augmented Reality, pp. 57-60 |
|---|---|
| Main authors: | Georg Klein, David Murray |
| Format: | Conference paper |
| Language: | English |
| Published: | Washington, DC, USA: IEEE Computer Society / IEEE, 15.09.2008 |
| Series: | ACM Other Conferences |
| Topics: | Computing methodologies > Modeling and simulation > Model development and analysis > Model verification and validation; Computing methodologies > Modeling and simulation > Model development and analysis > Modeling methodologies; Human-centered computing > Human computer interaction (HCI) > Interaction paradigms > Mixed / augmented reality |
| ISBN: | 9781424428403, 1424428408 |
| DOI: | 10.1109/ISMAR.2008.4637324 |
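
The abstract describes the imaging chain stage by stage. Purely as an illustration of what such a chain can look like, the sketch below applies simplified stand-ins for each stage (radial distortion, channel-shift chromatic aberration, velocity-driven blur, RGGB Bayer mosaicking with naive demosaicing, additive noise, and coarse quantisation in place of colour-space compression) to an RGBA frame using NumPy. Every function, constant and model here is an assumption chosen for brevity, not the method or parameters from the paper.

```python
import numpy as np


def radial_distort(img, k1=0.08):
    """Warp an image with a one-parameter radial distortion (nearest-neighbour resampling)."""
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w].astype(np.float32)
    x = (xx - w / 2.0) / (w / 2.0)  # normalised coordinates around the image centre
    y = (yy - h / 2.0) / (h / 2.0)
    r2 = x * x + y * y
    xs = np.clip((x * (1 + k1 * r2) * (w / 2.0) + w / 2.0).round().astype(int), 0, w - 1)
    ys = np.clip((y * (1 + k1 * r2) * (h / 2.0) + h / 2.0).round().astype(int), 0, h - 1)
    return img[ys, xs]


def chromatic_aberration(img, shift=1):
    """Approximate lateral chromatic aberration by shifting the red and blue channels apart."""
    out = img.copy()
    out[..., 0] = np.roll(img[..., 0], shift, axis=1)
    out[..., 2] = np.roll(img[..., 2], -shift, axis=1)
    return out


def motion_blur(img, velocity_px):
    """Crude horizontal motion blur whose length is driven by the camera-velocity estimate."""
    n = max(1, int(velocity_px))
    acc = np.zeros_like(img, dtype=np.float32)
    for d in range(n):
        acc += np.roll(img, d, axis=1)
    return acc / n


def bayer_and_demosaic(img):
    """Sample an RGGB Bayer mosaic, then rebuild RGB by naive 2x2 replication/averaging."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2  # crop to even size
    img = img[:h, :w]
    up = lambda a: np.repeat(np.repeat(a, 2, axis=0), 2, axis=1)
    r = up(img[0::2, 0::2, 0])
    g = (up(img[0::2, 1::2, 1]) + up(img[1::2, 0::2, 1])) / 2.0
    b = up(img[1::2, 1::2, 2])
    return np.stack([r, g, b], axis=-1)


def composite(rgba, velocity_px=4, noise_sigma=2.0, seed=0):
    """Apply the simplified artefact chain to an ideal pinhole rendering (H x W x 4)."""
    rng = np.random.default_rng(seed)
    img = rgba[..., :3].astype(np.float32)  # alpha is ignored in this toy version
    img = radial_distort(img)
    img = chromatic_aberration(img)
    img = motion_blur(img, velocity_px)
    img = bayer_and_demosaic(img)
    img = img + rng.normal(0.0, noise_sigma, img.shape)  # sensor noise
    img = np.round(img / 8.0) * 8.0  # coarse quantisation standing in for colour-space compression
    return np.clip(img, 0, 255).astype(np.uint8)


if __name__ == "__main__":
    frame = (np.random.default_rng(1).random((120, 160, 4)) * 255).astype(np.uint8)
    print(composite(frame, velocity_px=6).shape)  # -> (120, 160, 3)
```

Running the script pushes a random 160x120 RGBA test frame through the chain; in an actual system each stage would be replaced by a model calibrated to the specific camera, which is what the paper sets out to do.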

