BENDR: Using Transformers and a Contrastive Self-Supervised Learning Task to Learn From Massive Amounts of EEG Data


Detailed bibliography
Published in: Frontiers in Human Neuroscience, Volume 15, article 653659
Main authors: Kostas, Demetres; Aroca-Ouellette, Stéphane; Rudzicz, Frank
Format: Journal Article
Language: English
Published: Frontiers Media S.A., 23 June 2021
Subjects:
ISSN: 1662-5161
Online access: Get full text
Description
Summary: Deep neural networks (DNNs) used for brain–computer interface (BCI) classification are commonly expected to learn general features when trained across a variety of contexts, such that these features could be fine-tuned to specific contexts. While some success is found in such an approach, we suggest that this interpretation is limited and an alternative would better leverage the newly (publicly) available massive electroencephalography (EEG) datasets. We consider how to adapt techniques and architectures used for language modeling (LM) that appear capable of ingesting awesome amounts of data toward the development of encephalography modeling with DNNs in the same vein. We specifically adapt an approach effectively used for automatic speech recognition, which similarly (to LMs) uses a self-supervised training objective to learn compressed representations of raw data signals. After adaptation to EEG, we find that a single pre-trained model is capable of modeling completely novel raw EEG sequences recorded with differing hardware, and different subjects performing different tasks. Furthermore, both the internal representations of this model and the entire architecture can be fine-tuned to a variety of downstream BCI and EEG classification tasks, outperforming prior work in more task-specific (sleep stage classification) self-supervision.
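The summary describes a wav2vec 2.0-style pipeline adapted to EEG: a convolutional encoder compresses raw signals into a sequence of latent vectors, a transformer contextualizes that sequence with some positions masked, and a contrastive objective asks the transformer output at each masked position to pick out the true latent among distractors. The following PyTorch sketch only illustrates that idea; the module sizes, masking scheme, and loss details are illustrative assumptions, not the authors' released implementation.

# Minimal sketch of a masked contrastive pre-training step for raw EEG.
# All hyperparameters and layer choices below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvEncoder(nn.Module):
    """Downsamples raw multichannel EEG into a sequence of latent vectors."""
    def __init__(self, in_channels=20, dim=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, dim, kernel_size=3, stride=3), nn.GELU(),
            nn.Conv1d(dim, dim, kernel_size=2, stride=2), nn.GELU(),
            nn.Conv1d(dim, dim, kernel_size=2, stride=2), nn.GELU(),
        )
    def forward(self, x):                    # x: (batch, channels, samples)
        return self.net(x).transpose(1, 2)   # -> (batch, seq, dim)

class Contextualizer(nn.Module):
    """Transformer that reconstructs masked latent positions from context."""
    def __init__(self, dim=512, heads=8, layers=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=layers)
        self.mask_token = nn.Parameter(torch.randn(dim))
    def forward(self, z, mask):              # z: (batch, seq, dim), mask: (batch, seq) bool
        z = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(z), z)
        return self.transformer(z)

def contrastive_loss(context, targets, mask, n_negatives=20, temperature=0.1):
    """At each masked position, score the true latent against distractors
    drawn from other positions of the same sequence (InfoNCE-style)."""
    b, t, d = targets.shape
    losses = []
    for i in range(b):
        for p in mask[i].nonzero(as_tuple=True)[0].tolist():
            neg_idx = torch.randint(0, t, (n_negatives,))
            candidates = torch.cat([targets[i, p:p+1], targets[i, neg_idx]], dim=0)
            logits = F.cosine_similarity(context[i, p:p+1], candidates) / temperature
            losses.append(F.cross_entropy(logits.unsqueeze(0),
                                          torch.zeros(1, dtype=torch.long)))
    return torch.stack(losses).mean()

# One illustrative pre-training step on a synthetic batch of raw EEG.
encoder, contextualizer = ConvEncoder(), Contextualizer()
eeg = torch.randn(2, 20, 1200)               # (batch, channels, time samples)
with torch.no_grad():
    targets = encoder(eeg)                   # latents used as contrastive targets
z = encoder(eeg)
mask = torch.rand(z.shape[:2]) < 0.15        # mask roughly 15% of latent positions
loss = contrastive_loss(contextualizer(z, mask), targets, mask)
loss.backward()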
This article was submitted to Brain-Computer Interfaces, a section of the journal Frontiers in Human Neuroscience
Edited by: Sung Chan Jun, Gwangju Institute of Science and Technology, South Korea
Reviewed by: Dalin Zhang, Aalborg University, Denmark; Tomasz Maciej Rutkowski, RIKEN Center for Advanced Intelligence Project (AIP), Japan
DOI: 10.3389/fnhum.2021.653659