Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/67808
Full metadata record
DC Field	Value	Language
dc.contributor.author	Payongkit Lakhan	en_US
dc.contributor.author	Nannapas Banluesombatkul	en_US
dc.contributor.author	Vongsagon Changniam	en_US
dc.contributor.author	Ratwade Dhithijaiyratn	en_US
dc.contributor.author	Pitshaporn Leelaarporn	en_US
dc.contributor.author	Ekkarat Boonchieng	en_US
dc.contributor.author	Supanida Hompoonsup	en_US
dc.contributor.author	Theerawit Wilaiprasitporn	en_US
dc.date.accessioned	2020-04-02T15:04:49Z	-
dc.date.available	2020-04-02T15:04:49Z	-
dc.date.issued	2019-11-01	en_US
dc.identifier.issn	1558-1748	en_US
dc.identifier.issn	1530-437X	en_US
dc.identifier.other	2-s2.0-85073212855	en_US
dc.identifier.other	10.1109/JSEN.2019.2928781	en_US
dc.identifier.uri	https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85073212855&origin=inward	en_US
dc.identifier.uri	http://cmuir.cmu.ac.th/jspui/handle/6653943832/67808	-
dc.description.abstract	© 2001-2012 IEEE. For several decades, electroencephalography (EEG) has been one of the most commonly used tools for emotional state recognition via monitoring of distinctive brain activities. An array of datasets has been generated using diverse emotion-eliciting stimuli, with the resulting brainwave responses conventionally captured by high-end EEG devices. However, the applicability of these devices is limited to some extent by practical constraints, and they may be difficult to deploy in the highly mobile contexts omnipresent in everyday life. In this study, we evaluate the potential of OpenBCI to bridge this gap, first by comparing its performance to that of a research-grade EEG system using the same algorithms applied to benchmark datasets. Moreover, for the purpose of emotion classification, we propose a novel method to facilitate the selection of audio-visual stimuli of high/low valence and arousal. Our setup entailed recruiting 200 healthy volunteers of varying ages to identify the top 60 affective video clips from a total of 120 candidates through standardized self-assessment, genre tags, and unsupervised machine learning. In addition, 43 participants were enrolled to watch the pre-selected clips while emotional EEG brainwaves and peripheral physiological signals were collected. These recordings were analyzed, and the extracted features were fed into a classification model to predict whether the elicited signals were associated with a high or low level of valence and arousal. The resulting prediction accuracies proved comparable to those of previous studies that utilized more costly EEG amplifiers for data acquisition.	en_US
dc.subject	Engineering	en_US
dc.subject	Physics and Astronomy	en_US
dc.title	Consumer grade brain sensing for emotion recognition	en_US
dc.type	Journal	en_US
article.title.sourcetitle	IEEE Sensors Journal	en_US
article.volume	19	en_US
article.stream.affiliations	Vidyasirimedhi Institute of Science and Technology	en_US
article.stream.affiliations	Chulalongkorn University	en_US
article.stream.affiliations	King Mongkut's University of Technology Thonburi	en_US
article.stream.affiliations	Chiang Mai University	en_US
Appears in Collections: CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.