Classification of Mental Tasks Using Fixed and Adaptive Autoregressive Models of EEG Signals

Bibliographic Details
Published in: International IEEE/EMBS Conference on Neural Engineering (Online), pp. 633-636
Main Authors: Nai-Jen Huan; Palaniappan, R.
Format: Conference Proceeding
Language: English
Published: IEEE, 2005
ISBN: 9780780387102, 0780387104
ISSN: 1948-3546
Description
Summary: Classification of EEG signals extracted during mental tasks is a technique for designing brain-computer interfaces (BCI). In this paper, we classify EEG signals that were extracted during mental tasks using fixed autoregressive (FAR) and adaptive AR (AAR) models. Five different mental tasks from 4 subjects were used in the experimental study, and combinations of 2 different mental tasks were studied for each subject. Four different feature extraction methods were used to extract features from these EEG signals: FAR coefficients computed with Burg's algorithm using 125 data points, without segmentation and with segmentation into 25-data-point segments, and AAR coefficients computed with the least-mean-square (LMS) algorithm using 125 data points, likewise without and with segmentation. A multilayer perceptron (MLP) neural network (NN) trained by the backpropagation (BP) algorithm was used to classify these features into the different categories representing the mental tasks. The best result for FAR was 92.70%, while the best for AAR was only 81.80%. The results obtained here indicate that FAR using 125 data points without segmentation gave better classification performance than AAR, with all other parameters held constant.
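The abstract names the two feature extractors (Burg FAR and LMS AAR) but gives no implementation details. As a rough sketch, the textbook Burg recursion and LMS adaptive-filter update can be written as follows; the model order, the step size `mu`, and the window handling are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def burg_ar(x, order):
    """Fixed AR (FAR) coefficients via Burg's method.

    Uses the convention x[t] + a[1]*x[t-1] + ... + a[p]*x[t-p] = e[t].
    """
    x = np.asarray(x, dtype=float)
    f = x.copy()          # forward prediction errors
    b = x.copy()          # backward prediction errors
    a = np.zeros(0)
    for _ in range(order):
        # Reflection coefficient minimising the summed forward and
        # backward prediction-error power.
        num = -2.0 * np.dot(b[:-1], f[1:])
        den = np.dot(f[1:], f[1:]) + np.dot(b[:-1], b[:-1])
        k = num / den
        # Levinson-style update of the AR coefficients.
        a = np.concatenate([a + k * a[::-1], [k]])
        # Update prediction errors; each pass shortens them by one sample.
        f, b = f[1:] + k * b[:-1], b[:-1] + k * f[1:]
    return a

def lms_aar(x, order, mu=0.005):
    """Adaptive AR (AAR) coefficients tracked sample-by-sample with LMS.

    mu is an assumed step size; stability depends on signal power.
    Returns one coefficient vector per sample after the first `order`.
    """
    x = np.asarray(x, dtype=float)
    w = np.zeros(order)
    history = []
    for t in range(order, len(x)):
        past = x[t - order:t][::-1]   # x[t-1], ..., x[t-order]
        e = x[t] - np.dot(w, past)    # one-step prediction error
        w = w + 2.0 * mu * e * past   # LMS gradient step
        history.append(w.copy())
    return np.array(history)
```

In the setup the abstract describes, `burg_ar` would be applied once to a 125-sample window (or to each 25-sample segment), giving one fixed coefficient vector per window, whereas `lms_aar` yields a time-varying coefficient trajectory over the window from which features would be summarised before classification by the MLP.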
DOI: 10.1109/CNE.2005.1419704