Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/2315
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ozdemir, Mehmet Akif | -
dc.contributor.author | Degirmenci, Murside | -
dc.contributor.author | Izci, Elif | -
dc.contributor.author | Akan, Aydin | -
dc.date.accessioned | 2023-06-16T14:38:47Z | -
dc.date.available | 2023-06-16T14:38:47Z | -
dc.date.issued | 2021 | -
dc.identifier.issn | 0013-5585 | -
dc.identifier.issn | 1862-278X | -
dc.identifier.uri | https://doi.org/10.1515/bmt-2019-0306 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14365/2315 | -
dc.description.abstract | The emotional state of people plays a key role in physiological and behavioral human interaction. Emotional state analysis involves many fields, such as neuroscience, cognitive science, and biomedical engineering, because the parameters of interest reflect the complex neuronal activity of the brain. Electroencephalogram (EEG) signals are processed to convey brain activity to external systems and to make predictions about emotional states. This paper proposes a novel method for emotion recognition based on deep convolutional neural networks (CNNs) used to classify the Valence, Arousal, Dominance, and Liking emotional states. A novel approach is therefore proposed for emotion recognition from time series of multi-channel EEG signals in the Database for Emotion Analysis using Physiological Signals (DEAP). We propose a new approach to emotional state estimation based on CNN classification of multi-spectral topology images obtained from EEG signals. In contrast to most EEG-based approaches, which discard the spatial information of EEG signals, converting the EEG signals into a sequence of multi-spectral topology images preserves their temporal, spectral, and spatial information (a minimal illustrative sketch of this image-generation step appears after the metadata record below). The deep recurrent convolutional network is trained to learn important representations from a sequence of three-channel topographical images. We achieved test accuracies of 90.62% for negative versus positive Valence, 86.13% for high versus low Arousal, 88.48% for high versus low Dominance, and 86.23% for like versus unlike. The evaluation of this method on the emotion recognition problem revealed significant improvements in classification accuracy compared with other studies using deep neural networks (DNNs) and one-dimensional CNNs. | en_US
dc.description.sponsorship | Izmir Katip Celebi University Scientific Research Projects Coordination Unit [2019-ONAP-MUMF-0001] | en_US
dc.description.sponsorship | This work was funded by Izmir Katip Celebi University Scientific Research Projects Coordination Unit (project number: 2019-ONAP-MUMF-0001). All funding is for equipment for the research project. There is no available funding for publication. | en_US
dc.language.iso | en | en_US
dc.publisher | Walter De Gruyter Gmbh | en_US
dc.relation.ispartof | Biomedical Engineering-Biomedizinische Technik | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | azimuthal equidistant projection technique | en_US
dc.subject | brain mapping | en_US
dc.subject | deep learning | en_US
dc.subject | EEG images | en_US
dc.subject | electroencephalogram | en_US
dc.subject | emotion estimation | en_US
dc.subject | Classification | en_US
dc.subject | Signals | en_US
dc.subject | Models | en_US
dc.title | EEG-based emotion recognition with deep convolutional neural networks | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1515/bmt-2019-0306 | -
dc.identifier.pmid | 32845859 | en_US
dc.identifier.scopus | 2-s2.0-85091146916 | en_US
dc.department | İzmir Ekonomi Üniversitesi | en_US
dc.authorid | İzci, Elif/0000-0003-1148-8374 | -
dc.authorid | Ozdemir, Mehmet Akif/0000-0002-8758-113X | -
dc.authorwosid | İzci, Elif/GOE-6084-2022 | -
dc.authorwosid | Ozdemir, Mehmet Akif/G-7952-2018 | -
dc.authorscopusid | 57206479576 | -
dc.authorscopusid | 57206472130 | -
dc.authorscopusid | 57206467904 | -
dc.authorscopusid | 35617283100 | -
dc.identifier.volume | 66 | en_US
dc.identifier.issue | 1 | en_US
dc.identifier.startpage | 43 | en_US
dc.identifier.endpage | 57 | en_US
dc.identifier.wos | WOS:000621777900005 | en_US
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Refereed Journal - Institutional Faculty Member) | en_US
dc.identifier.scopusquality | Q3 | -
dc.identifier.wosquality | Q4 | -
item.grantfulltext | reserved | -
item.openairetype | Article | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.fulltext | With Fulltext | -
item.languageiso639-1 | en | -
item.cerifentitytype | Publications | -
crisitem.author.dept | 05.06. Electrical and Electronics Engineering | -
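
Illustrative sketch of the EEG-to-image step referenced in the abstract: the abstract describes extracting band power from multi-channel EEG, mapping electrode locations to 2-D with an azimuthal equidistant projection, and interpolating the values into three-channel topographic images that feed a recurrent CNN. The Python sketch below covers that image-generation step only; it is not the authors' implementation. The frequency bands (theta/alpha/beta), the 32x32 image size, the 128 Hz sampling rate, the Welch PSD estimate, the cubic interpolation, the random electrode coordinates and data, and the helper names azimuthal_equidistant and eeg_to_image are all assumptions made for illustration.

```python
# Hedged sketch: build a 3-channel topographic "EEG image" from one EEG window.
# All parameter choices below are illustrative assumptions, not values from the paper.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import griddata


def azimuthal_equidistant(xyz):
    """Project 3-D electrode positions (on a unit sphere) to 2-D, centred on the vertex."""
    x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
    azimuth = np.arctan2(y, x)
    elevation = np.arctan2(z, np.sqrt(x ** 2 + y ** 2))
    r = np.pi / 2 - elevation                              # angular distance from the vertex
    return np.column_stack([r * np.cos(azimuth), r * np.sin(azimuth)])


def eeg_to_image(eeg, fs, electrode_xyz, bands=((4, 8), (8, 13), (13, 30)), size=32):
    """eeg: (n_channels, n_samples) window -> (size, size, len(bands)) topographic image."""
    locs2d = azimuthal_equidistant(electrode_xyz)
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)             # power spectral density per channel
    grid_x, grid_y = np.mgrid[locs2d[:, 0].min():locs2d[:, 0].max():size * 1j,
                              locs2d[:, 1].min():locs2d[:, 1].max():size * 1j]
    channels = []
    for lo, hi in bands:                                   # one image channel per frequency band
        power = psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
        img = griddata(locs2d, power, (grid_x, grid_y), method='cubic', fill_value=0.0)
        channels.append(img)
    return np.stack(channels, axis=-1)


# Example with random data standing in for one 3-second, 32-channel, 128 Hz EEG window.
rng = np.random.default_rng(0)
xyz = rng.normal(size=(32, 3))
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)          # place mock electrodes on a unit sphere
image = eeg_to_image(rng.standard_normal((32, 128 * 3)), fs=128, electrode_xyz=xyz)
print(image.shape)                                         # (32, 32, 3)
```

In the pipeline the abstract describes, a sequence of such images (one per time window) would then be fed to the deep recurrent convolutional classifier; the sketch stops at producing a single image.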
Appears in Collections:PubMed İndeksli Yayınlar Koleksiyonu / PubMed Indexed Publications Collection
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Size | Format | Access
2315.pdf | 2.42 MB | Adobe PDF | Restricted Access

SCOPUS Citations: 61 (checked on Nov 20, 2024)
Web of Science Citations: 52 (checked on Nov 20, 2024)
Page view(s): 80 (checked on Nov 18, 2024)
Download(s): 6 (checked on Nov 18, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.