Gaze-enabled activity recognition for augmented reality feedback

Bibliographic Details
Published in: Computers & Graphics, Volume 119, p. 103909
Main Authors: Bektaş, Kenan; Strecker, Jannis; Mayer, Simon; Garcia, Kimberly
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.04.2024
ISSN: 0097-8493, 1873-7684
Description
Summary: Head-mounted Augmented Reality (AR) displays overlay digital information on physical objects. Through eye tracking, they provide insights into user attention, intentions, and activities, and allow novel interaction methods based on this information. However, in physical environments, the implications of using gaze-enabled AR for human activity recognition have not been explored in detail. In an experimental study with the Microsoft HoloLens 2, we collected gaze data from 20 users while they performed three activities: Reading a text, Inspecting a device, and Searching for an object. We trained machine learning models (SVM, Random Forest, Extremely Randomized Trees) with extracted features and achieved up to 89.6% activity-recognition accuracy. Based on the recognized activity, our system—GEAR—then provides users with relevant AR feedback. Due to the sensitivity of the personal (gaze) data GEAR collects, the system further incorporates a novel solution based on the Solid specification for giving users fine-grained control over the sharing of their data. The provided code and anonymized datasets may be used to reproduce and extend our findings, and as teaching material.
DOI: 10.1016/j.cag.2024.103909
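
The abstract describes the recognition step only at a high level. The sketch below is not the authors' released pipeline; it is a minimal illustration assuming scikit-learn implementations of the three named classifiers and a hypothetical window-level gaze feature set (e.g. fixation and saccade statistics), showing how such models could be cross-validated on extracted features.

# Minimal sketch (not the GEAR code release): cross-validating the three
# classifier families named in the abstract on window-level gaze features.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: one row per gaze window, columns are assumed features
# (e.g. mean fixation duration, fixation count, mean saccade amplitude).
X = rng.normal(size=(600, 8))
# Labels for the three activities studied: Reading, Inspecting, Searching.
y = rng.integers(0, 3, size=600)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "RandomForest": RandomForestClassifier(n_estimators=200, random_state=0),
    "ExtraTrees": ExtraTreesClassifier(n_estimators=200, random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")

With real features in place of the placeholder arrays, the same loop compares the three model families under identical cross-validation folds; the actual feature extraction and evaluation protocol are in the authors' published code and datasets.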