Extended Depth-of-Field Projector using Learned Diffractive Optics

Published in: Proceedings (IEEE Conference on Virtual Reality and 3D User Interfaces. Online), pp. 449-459
Main authors: Li, Yuqi; Fu, Qiang; Heidrich, Wolfgang
Format: Conference paper
Language: English
Publication details: IEEE, 2023
ISSN: 2642-5254
Description
Summary: Projector Depth-of-Field (DOF) refers to the depth range over which projected images remain in focus. It is a crucial property of projectors in spatial augmented reality (SAR) applications, since a wide projector DOF can increase the effective projection area on projection surfaces with large depth variation and thus reduce the number of projectors required. Existing state-of-the-art methods attempt to create all-in-focus displays by adopting either a deep deblurring network or light modulation. Unlike previous work that optimizes the deblurring model and the physical modulation separately, in this paper we propose an end-to-end joint optimization method that learns a diffractive optical element (DOE) placed in front of the projector lens together with a compensation network for deblurring. Given the desired image and the captured projection result, the compensation network directly outputs the compensated image for display. We evaluate the proposed method in physical simulation and with a real experimental prototype, showing that it extends the projector DOF with only a minor modification to the projector and is thus superior to normal projection with a shallow DOF. The compensation method is also compared with state-of-the-art methods and shows advantages in radiometric compensation in terms of computational efficiency and image quality.
DOI: 10.1109/VR55154.2023.00060
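
The abstract describes an end-to-end pipeline in which the diffractive optical element and the deblurring compensation network are optimized jointly. The following is a minimal, illustrative sketch of such a joint optimization loop, not the authors' implementation: it assumes a simplified defocus model in which the DOE's point spread function (PSF) is approximated by a single learnable blur kernel, and all network sizes, kernel sizes, and training settings are placeholder assumptions.

```python
# Illustrative sketch only: joint optimization of a DOE stand-in and a compensation CNN.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableDOE(nn.Module):
    """Stand-in for the learned DOE: a single learnable, normalized blur kernel (PSF)."""
    def __init__(self, kernel_size=15):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(1, 1, kernel_size, kernel_size))

    def forward(self, img):
        # Softmax keeps the simulated PSF non-negative and energy-preserving.
        psf = torch.softmax(self.logits.flatten(), dim=0).view_as(self.logits)
        weight = psf.expand(img.shape[1], 1, -1, -1).contiguous()
        # Depthwise convolution simulates the blurred projection for each color channel.
        return F.conv2d(img, weight, padding=psf.shape[-1] // 2, groups=img.shape[1])


class CompensationNet(nn.Module):
    """Small CNN mapping (desired image, simulated capture) -> compensated image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, desired, captured):
        return self.net(torch.cat([desired, captured], dim=1))


doe, comp = LearnableDOE(), CompensationNet()
optimizer = torch.optim.Adam(list(doe.parameters()) + list(comp.parameters()), lr=1e-3)

for step in range(100):                    # toy training loop on random target images
    desired = torch.rand(4, 3, 64, 64)     # desired display images
    captured = doe(desired)                # simulated (defocus-blurred) projection
    compensated = comp(desired, captured)  # network predicts the image to project
    projected = doe(compensated)           # project the compensated image
    loss = F.mse_loss(projected, desired)  # end-to-end reconstruction loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In the actual method the PSF would be derived from the learned DOE's phase profile via wave-optics propagation and the captured image would come from a real projector-camera pair; the learnable kernel here is only a differentiable placeholder so that the joint gradient flow through both the optics model and the compensation network is visible.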