Exploring 3D Interaction with Gaze Guidance in Augmented Reality

Detailed bibliography
Published in: Proceedings (IEEE Conference on Virtual Reality and 3D User Interfaces. Online), pp. 22-32
Main authors: Bao, Yiwei; Wang, Jiaxi; Wang, Zhimin; Lu, Feng
Format: Conference paper
Language: English
Publication details: IEEE, 01.03.2023
ISSN: 2642-5254
Description
Summary: Recent research based on hand-eye coordination has shown that gaze can improve the object selection and translation experience in certain AR scenarios. However, several limitations still exist. Specifically, we investigate whether gaze can help with object selection under heavy 3D occlusion and with 3D object translation in the depth dimension. In addition, we investigate the possibility of reducing the gaze calibration burden before use. We therefore develop new methods with proper gaze guidance for 3D interaction in AR, as well as an implicit online calibration method. We conduct two user studies to evaluate the different interaction methods; the results show that our methods not only improve the effectiveness of occluded object selection but also significantly alleviate arm fatigue in the depth translation task. We also evaluate the proposed implicit online calibration method and find its accuracy comparable to standard 9-point explicit calibration, a step towards practical use in the real world.
DOI: 10.1109/VR55154.2023.00018