Exploring 3D Interaction with Gaze Guidance in Augmented Reality

Bibliographic Details
Published in: Proceedings (IEEE Conference on Virtual Reality and 3D User Interfaces. Online), pp. 22-32
Main Authors: Bao, Yiwei; Wang, Jiaxi; Wang, Zhimin; Lu, Feng
Format: Conference Proceeding
Language: English
Published: IEEE, 01.03.2023
ISSN: 2642-5254
Description
Summary: Recent research on hand-eye coordination has shown that gaze can improve the object selection and translation experience in certain AR scenarios. However, several limitations remain. Specifically, we investigate whether gaze can help with object selection under heavy 3D occlusion and with 3D object translation along the depth dimension. We also investigate the possibility of reducing the gaze calibration burden before use. To this end, we develop new methods with proper gaze guidance for 3D interaction in AR, as well as an implicit online calibration method. We conduct two user studies to evaluate the different interaction methods; the results show that our methods not only improve the effectiveness of occluded object selection but also significantly alleviate arm fatigue in the depth translation task. We also evaluate the proposed implicit online calibration method and find its accuracy comparable to standard nine-point explicit calibration, taking a step toward practical use in the real world.
DOI: 10.1109/VR55154.2023.00018
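
The summary above mentions an implicit online calibration method whose accuracy is comparable to standard nine-point explicit calibration, but the record gives no implementation details. As a rough, illustrative sketch only (not the authors' method), the snippet below shows one common way such implicit calibration can be framed: pairs of raw gaze estimates and interaction-confirmed target positions are collected during ordinary selections and used to fit a least-squares affine correction. The class name, the affine model, and the normalized 2D coordinates are all assumptions made for this example.

```python
import numpy as np

class ImplicitGazeCalibrator:
    """Illustrative sketch of implicit online gaze calibration (assumed
    model, not the paper's algorithm): selection events provide implicit
    ground truth, and a 2D affine correction is refit by least squares."""

    def __init__(self, min_samples=6):
        self.min_samples = min_samples
        self.raw_points = []     # raw gaze estimates, normalized screen coords
        self.target_points = []  # centers of objects the user actually selected
        self.A = np.eye(2)       # corrected = A @ raw + b (identity until fitted)
        self.b = np.zeros(2)

    def add_sample(self, raw_gaze_xy, selected_target_xy):
        """Record a (raw gaze, confirmed target) pair from a successful selection."""
        self.raw_points.append(np.asarray(raw_gaze_xy, dtype=float))
        self.target_points.append(np.asarray(selected_target_xy, dtype=float))
        if len(self.raw_points) >= self.min_samples:
            self._refit()

    def _refit(self):
        """Least-squares fit of an affine map from raw gaze to selected targets."""
        X = np.asarray(self.raw_points)              # shape (n, 2)
        Y = np.asarray(self.target_points)           # shape (n, 2)
        X_h = np.hstack([X, np.ones((len(X), 1))])   # homogeneous coords, (n, 3)
        P, *_ = np.linalg.lstsq(X_h, Y, rcond=None)  # solve X_h @ P ≈ Y, P is (3, 2)
        self.A = P[:2].T
        self.b = P[2]

    def correct(self, raw_gaze_xy):
        """Apply the current correction to a raw gaze estimate."""
        return self.A @ np.asarray(raw_gaze_xy, dtype=float) + self.b


# Usage with simulated data: a systematic scale/offset error on the gaze signal.
calib = ImplicitGazeCalibrator()
rng = np.random.default_rng(0)
targets = rng.uniform(0.1, 0.9, size=(10, 2))        # where the user actually looked
raw = targets * 1.05 + 0.02                          # biased gaze estimates
for r, t in zip(raw, targets):
    calib.add_sample(r, t)
print(calib.correct(np.array([0.5, 0.5]) * 1.05 + 0.02))  # close to [0.5, 0.5]
```

In this toy setup the correction is refit after every selection once enough samples are available; a real system would likely weight recent samples, reject outliers, and handle 3D gaze rays rather than 2D screen points.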