Eye Tracking Using Artificial Neural Networks for Human Computer Interaction


Detailed Bibliography
Published in: Physiological Research, Vol. 60, No. 5, pp. 841-844
Main Authors: DEMJÉN, E., ABOŠI, V., TOMORI, Z.
Format: Journal Article
Language: English
Publication Details: Czech Republic: Institute of Physiology, 01.01.2011
ISSN: 0862-8408, 1802-9973
Description
Summary: This paper describes an ongoing project aimed at developing a low-cost application to replace the computer mouse for people with physical impairments. The application is based on an eye-tracking algorithm and assumes that both the camera and the head position are fixed. Color tracking and template matching methods are used for pupil detection. Calibration is performed both by neural networks and by parametric interpolation methods. The neural networks are trained with back-propagation, and the bipolar sigmoid is chosen as the activation function. The user's eye is captured by a simple web camera with backlight compensation, attached to a head-fixation device. Neural networks significantly outperform the parametric interpolation techniques: 1) the calibration procedure is faster, as they require fewer calibration marks, and 2) cursor control is more precise. At its current stage of development, the system can distinguish regions at least at the level of desktop icons. The main limitations of the proposed method are its lack of head-pose invariance and its relative sensitivity to illumination (especially to incidental pupil reflections).
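The calibration approach described in the summary (a back-propagation network with bipolar sigmoid activations that learns the mapping from pupil coordinates to screen coordinates) can be sketched roughly as follows. This is a minimal illustrative sketch: the network size, learning rate, and synthetic nine-mark calibration grid are assumptions for the example, not the paper's actual parameters.

```python
import numpy as np

def bipolar_sigmoid(x):
    # f(x) = 2 / (1 + exp(-x)) - 1, output range (-1, 1)
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

def bipolar_sigmoid_deriv(y):
    # derivative expressed through the output y = f(x): f'(x) = (1 - y^2) / 2
    return 0.5 * (1.0 - y * y)

rng = np.random.default_rng(0)

# Hypothetical calibration data: a 3x3 grid of nine calibration marks.
# Inputs: normalized pupil-centre coordinates in [-1, 1].
# Targets: normalized screen coordinates in [-1, 1].
pupil = np.array([[x, y] for y in (-1, 0, 1) for x in (-1, 0, 1)], dtype=float)
screen = 0.8 * pupil  # toy assumption: a roughly linear pupil-to-screen mapping

# Small 2-8-2 feedforward network.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 2)); b2 = np.zeros(2)
lr = 0.5  # illustrative learning rate

for _ in range(5000):
    # forward pass
    h = bipolar_sigmoid(pupil @ W1 + b1)   # hidden activations
    out = bipolar_sigmoid(h @ W2 + b2)     # predicted screen coordinates
    err = out - screen
    # back-propagate the error through both layers
    d_out = err * bipolar_sigmoid_deriv(out)
    d_h = (d_out @ W2.T) * bipolar_sigmoid_deriv(h)
    # batch gradient-descent updates, averaged over the calibration marks
    W2 -= lr * h.T @ d_out / len(pupil); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * pupil.T @ d_h / len(pupil); b1 -= lr * d_h.mean(axis=0)

mse = float((err ** 2).mean())  # should be small after training
```

After training on the calibration marks, the network generalizes to intermediate pupil positions, which is what allows cursor control between the marks; the abstract's observation that fewer marks suffice compared with parametric interpolation reflects this interpolation ability of the network.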
DOI:10.33549/physiolres.932117