Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz

Detailed bibliography
Published in: IEEE Transactions on Visualization and Computer Graphics, Volume 27, Issue 5, pp. 2577-2586
Main authors: Angelopoulos, Anastasios N.; Martel, Julien N.P.; Kohli, Amit P.; Conradt, Jorg; Wetzstein, Gordon
Medium: Journal Article
Language: English
Published: United States, IEEE, 01.05.2021
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 1077-2626 (print); 1941-0506 (electronic)
Abstract The cameras in modern gaze-tracking systems suffer from fundamental bandwidth and power limitations, constraining data acquisition speed to 300 Hz realistically. This obstructs the use of mobile eye trackers to perform, e.g., low latency predictive rendering, or to study quick and subtle eye motions like microsaccades using head-mounted devices in the wild. Here, we propose a hybrid frame-event-based near-eye gaze tracking system offering update rates beyond 10,000 Hz with an accuracy that matches that of high-end desktop-mounted commercial trackers when evaluated in the same conditions. Our system, previewed in Figure 1, builds on emerging event cameras that simultaneously acquire regularly sampled frames and adaptively sampled events. We develop an online 2D pupil fitting method that updates a parametric model every one or few events. Moreover, we propose a polynomial regressor for estimating the point of gaze from the parametric pupil model in real time. Using the first event-based gaze dataset, we demonstrate that our system achieves accuracies of 0.45°-1.75° for fields of view from 45° to 98°. With this technology, we hope to enable a new generation of ultra-low-latency gaze-contingent rendering and display techniques for virtual and augmented reality.
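The abstract outlines two algorithmic components: an online 2D pupil fit whose parametric model is updated with every one or few events, and a polynomial regressor that maps the fitted pupil model to a point of gaze in real time. The authors' implementation is not included in this record; the Python sketch below is a hypothetical illustration of that pipeline, assuming a conic (ellipse) pupil model updated by recursive least squares and an ordinary least-squares polynomial calibration. The names OnlinePupilEllipse, PolynomialGazeRegressor, and poly_features, and all parameter choices, are invented for this example.

# Hypothetical sketch (not the authors' released code): event-driven pupil
# fitting plus a polynomial gaze regressor, as described in the abstract.
import numpy as np

class OnlinePupilEllipse:
    """Recursive least-squares fit of a conic x^2 + Bxy + Cy^2 + Dx + Ey + F = 0,
    updated one event (pixel coordinate) at a time."""
    def __init__(self, forgetting=0.999):
        self.theta = np.zeros(5)       # conic coefficients [B, C, D, E, F]
        self.P = np.eye(5) * 1e3       # RLS covariance-like state
        self.lam = forgetting          # discounts stale events

    def update(self, x, y):
        # Each event contributes one linear constraint: phi . theta = -x^2.
        phi = np.array([x * y, y * y, x, y, 1.0])
        err = -x * x - phi @ self.theta
        Pphi = self.P @ phi
        gain = Pphi / (self.lam + phi @ Pphi)
        self.theta += gain * err
        self.P = (self.P - np.outer(gain, Pphi)) / self.lam

    def center(self):
        # Ellipse center from the fitted conic (leading coefficient fixed at 1).
        B, C, D, E, _ = self.theta
        return np.linalg.solve(np.array([[2.0, B], [B, 2.0 * C]]),
                               -np.array([D, E]))

def poly_features(center, degree=2):
    # All monomials x^i * y^j with i + j <= degree.
    x, y = center
    return np.array([x ** i * y ** j
                     for i in range(degree + 1)
                     for j in range(degree + 1 - i)])

class PolynomialGazeRegressor:
    """Least-squares map from pupil-center features to a 2D point of gaze."""
    def fit(self, centers, gaze_points, degree=2):
        X = np.stack([poly_features(c, degree) for c in centers])
        self.W, *_ = np.linalg.lstsq(X, np.asarray(gaze_points), rcond=None)
        self.degree = degree
        return self

    def predict(self, center):
        return poly_features(center, self.degree) @ self.W

# Toy check: events sampled from a synthetic ellipse centred at (160, 120).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = rng.uniform(0.0, 2.0 * np.pi, 500)
    pupil = OnlinePupilEllipse()
    for ex, ey in zip(160 + 40 * np.cos(t), 120 + 25 * np.sin(t)):
        pupil.update(ex, ey)       # in the real system this runs per event
    print(pupil.center())          # approximately [160, 120]

Each per-event update touches only a five-parameter state, which is the kind of constant-time work that makes update rates far above frame rate plausible; the actual system additionally fuses the regularly sampled frames and calibrates the regressor per user, which this sketch omits.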
Author Angelopoulos, Anastasios N. (University of California Berkeley, angelopoulos@berkeley.edu)
Martel, Julien N.P. (Stanford University, jnmartel@stanford.edu)
Kohli, Amit P. (University of California Berkeley, apkohli@berkeley.edu)
Conradt, Jorg (KTH Royal Institute of Technology, jconradt@kth.se)
Wetzstein, Gordon (Stanford University, gordonwz@stanford.edu)
BackLink https://www.ncbi.nlm.nih.gov/pubmed/33780340 (view this record in MEDLINE/PubMed)
https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298669 (view record from Swedish Publication Index, Kungliga Tekniska Högskolan)
CODEN ITVGEA
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021
DOI 10.1109/TVCG.2021.3067784
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
Computer and Information Systems Abstracts
Electronics & Communications Abstracts
Technology Research Database
ProQuest Computer Science Collection
Advanced Technologies Database with Aerospace
Computer and Information Systems Abstracts – Academic
Computer and Information Systems Abstracts Professional
MEDLINE - Academic
SwePub
SwePub Articles
SWEPUB Kungliga Tekniska Högskolan
Discipline Engineering
EISSN 1941-0506
EndPage 2586
ExternalDocumentID oai_DiVA_org_kth_298669
33780340
10_1109_TVCG_2021_3067784
9389490
Genre orig-research
Journal Article
GrantInformation NSF CAREER (grant 1553333); ARL; Swiss National Foundation (SNF) Fellowship (grant P2EZP2 181817); KAUST Office of Sponsored Research
ISICitedReferencesCount 59
ISSN 1077-2626
1941-0506
IsPeerReviewed true
IsScholarly true
Issue 5
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
https://doi.org/10.15223/policy-029
https://doi.org/10.15223/policy-037
PMID 33780340
PQID 2515738418
PQPubID 75741
PageCount 10
PublicationDate 2021-05-01
PublicationPlace United States; New York
PublicationTitle IEEE transactions on visualization and computer graphics
PublicationTitleAbbrev TVCG
PublicationTitleAlternate IEEE Trans Vis Comput Graph
PublicationYear 2021
Publisher The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 2577
SubjectTerms Augmented and virtual reality
Augmented reality
Bandwidth
Cameras
Event-based camera
Eye movements
Eye tracking
Gaze tracking
Polynomials
Pupils
Real-time systems
Rendering
Rendering (computer graphics)
Tracking
Tracking systems
Two dimensional models
Virtual reality
Title Event-Based Near-Eye Gaze Tracking Beyond 10,000 Hz
URI https://ieeexplore.ieee.org/document/9389490
https://www.ncbi.nlm.nih.gov/pubmed/33780340
https://www.proquest.com/docview/2515738418
https://www.proquest.com/docview/2507148158
https://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-298669
Volume 27