Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/58407
Title: Imaginary hand movement classification using electroencephalography
Authors: Pornwitcha Somsap
Nipon Theera-Umpon
Sansanee Auephanwiriyakul
Keywords: Chemical Engineering;Computer Science;Engineering;Mathematics
Issue Date: 7-Feb-2018
Abstract: © 2017 IEEE. This paper proposes a method to help patients who cannot control their limbs to communicate and to control devices through binary decisions derived from electroencephalography (EEG). We used EEG recordings of 12 volunteers from PhysioNet (EEG Motor Movement/Imaginary Datasets) that contain imaginary hand movement. For signal selection, we selected the theta and alpha bands (4-15 Hz), since the signals in these bands are distinctly changed by the imagined movement. For feature extraction, we applied the power spectral density estimated by an autoregressive (AR) model, and then used principal component analysis (PCA) to reduce the feature dimensionality before the classification step. To measure the quality of the derived features, we used a set of classifiers including the decision tree, K-nearest neighbors (K-NN), and ensemble classifiers. For the experiments, we conducted both intra-user and inter-user approaches. Leave-one-out cross validation was applied in the intra-user experiment, while five-fold cross validation was applied in the inter-user experiment. The results show that the highest average classification accuracy is achieved by the cubic K-NN (97.08%) in the inter-user experiment and by the weighted K-NN (91.88%) in the intra-user experiment.
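
The following is a minimal sketch of the pipeline outlined in the abstract (band selection, AR-based power spectral density features, PCA, K-NN classification, and the two cross-validation schemes). It is built on synthetic data; the sampling rate, AR order, PCA dimensionality, number of neighbors, and all variable names are illustrative assumptions, not values reported in the paper.

```python
# Hedged sketch of the described pipeline; all numeric settings are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, freqz
from statsmodels.regression.linear_model import yule_walker
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score, LeaveOneOut, StratifiedKFold

FS = 160            # PhysioNet motor movement/imagery recordings use 160 Hz
BAND = (4.0, 15.0)  # theta + alpha band used for signal selection
AR_ORDER = 10       # assumed AR model order (not specified here)

def bandpass(trial, fs=FS, band=BAND, order=4):
    """Keep only the 4-15 Hz band of a (channels x samples) trial."""
    b, a = butter(order, band, btype="bandpass", fs=fs)
    return filtfilt(b, a, trial, axis=-1)

def ar_psd_features(trial, fs=FS, n_freqs=64):
    """Per-channel spectrum from a Yule-Walker AR fit, concatenated as features."""
    feats = []
    for ch in trial:
        rho, sigma = yule_walker(ch, order=AR_ORDER, method="mle")
        # AR spectrum is proportional to sigma^2 / |1 - sum_k rho_k e^{-jwk}|^2
        _, h = freqz(b=[sigma], a=np.r_[1.0, -rho], worN=n_freqs, fs=fs)
        feats.append(np.abs(h) ** 2)
    return np.concatenate(feats)

# Toy data standing in for imaginary-hand-movement trials (binary labels).
rng = np.random.default_rng(0)
n_trials, n_channels, n_samples = 40, 8, 2 * FS
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)

X = np.array([ar_psd_features(bandpass(t)) for t in X_raw])

# PCA for dimensionality reduction, then a K-NN classifier.
clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))

# Intra-user style evaluation: leave-one-out cross validation.
loo_acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
# Inter-user style evaluation: five-fold cross validation.
cv5_acc = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5)).mean()
print(f"LOO accuracy: {loo_acc:.3f}, 5-fold accuracy: {cv5_acc:.3f}")
```

On real trials, X_raw would be replaced by epochs cut from the PhysioNet recordings, and the weighted or cubic K-NN variants mentioned in the abstract would correspond to different distance weightings and metrics of the K-NN classifier.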
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85050368521&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/58407
Appears in Collections: CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.
