gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers

Bibliographic Details
Title: gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers
Authors: Niehorster, Diederick C., Hessels, Roy S., Nyström, Marcus, Benjamins, Jeroen S., Hooge, Ignace T. C.
Other Authors: Lund University, Joint Faculties of Humanities and Theology, Units, Lund University Humanities Lab, Lunds universitet, Humanistiska och teologiska fakulteterna, Fakultetsgemensamma verksamheter, Humanistlaboratoriet, Originator, Lund University, Faculty of Social Sciences, Departments of Administrative, Economic and Social Sciences, Department of Psychology, Lunds universitet, Samhällsvetenskapliga fakulteten, Samhällsvetenskapliga institutioner och centrumbildningar, Institutionen för psykologi, Originator
Source: Behavior Research Methods, 57(7)
Subjects: Natural Sciences, Computer and Information Sciences, Human Computer Interaction, Naturvetenskap, Data- och informationsvetenskap (Datateknik), Människa-datorinteraktion (Interaktionsdesign), Social Sciences, Other Social Sciences, Other Social Sciences not elsewhere specified, Samhällsvetenskap, Annan samhällsvetenskap, Övrig annan samhällsvetenskap
Description: The problem: wearable eye trackers deliver eye-tracking data on a scene video that is acquired by a camera affixed to the participant’s head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods to map eye-tracking data to a world-centered reference frame (e.g., screens and tabletops) are available. These methods usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye tracker-specific. The solution: here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can: (1) transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, and (3) determine data quality measures, e.g., accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.
Access URL: https://doi.org/10.3758/s13428-025-02704-4
Database: SwePub
ISSN: 1554-3528
DOI: 10.3758/s13428-025-02704-4
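The abstract notes that gazeMapper can be used as a Python library directly, sidestepping the GUI. The sketch below is purely illustrative of what such scripted use might look like: the function names (import_recording, map_to_plane, compute_data_quality) and arguments are assumptions, not the documented gazeMapper API; consult the project's documentation (linked from the article above) for the actual entry points.

```python
# Hypothetical sketch of scripted (non-GUI) use of gazeMapper.
# Everything called on the package below is an assumed name, chosen only to
# mirror the three capabilities listed in the abstract: (1) mapping
# head-centered gaze to a world plane, (2) synchronizing recordings, and
# (3) computing data quality measures such as accuracy and precision.
import pathlib

import gazeMapper  # real package name; the calls below are not its real API

project = pathlib.Path("my_study")  # assumed project folder layout

# (1) Import a wearable eye-tracker recording and map gaze to a plane.
rec = gazeMapper.import_recording(project, source="tobii_glasses3")  # assumed
gazeMapper.map_to_plane(rec, plane="screen")                         # assumed

# (3) Compute data quality measures (e.g., accuracy and precision).
quality = gazeMapper.compute_data_quality(rec)                       # assumed
print(quality)
```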