Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/5216
Full metadata record
DC Field	Value	Language
dc.contributor.author	Akbuğday, Burak	-
dc.contributor.author	Akbugday, S.P.	-
dc.contributor.author	Sadikzade, R.	-
dc.contributor.author	Akan, A.	-
dc.contributor.author	Unal, S.	-
dc.date.accessioned	2024-03-30T11:20:56Z	-
dc.date.available	2024-03-30T11:20:56Z	-
dc.date.issued	2024	-
dc.identifier.issn	2619-9831	-
dc.identifier.uri	https://doi.org/10.5152/electrica.2024.23111	-
dc.identifier.uri	https://hdl.handle.net/20.500.14365/5216	-
dc.description.abstract	The investigation of olfactory stimuli has become more prominent in neuromarketing research in recent years. Although a few studies suggest that olfactory stimuli are linked to consumer behavior and can be observed in various ways, such as via the electroencephalogram (EEG), a universal method for detecting olfactory stimuli has not yet been established. In this study, 14-channel EEG signals acquired from participants while they were presented with two identical boxes, one scented and one unscented, were processed to extract several linear and nonlinear features. Two approaches are presented for classifying the scented and unscented cases: i) machine learning (ML) methods using the extracted features; ii) deep learning (DL) methods using relative sub-band power topographic heat map images. Experimental results suggest that the olfactory stimulus can be detected with up to 92% accuracy by the proposed method. Furthermore, it is shown that topographic heat maps can accurately depict the brain's response to olfactory stimuli. © 2024 Istanbul University. All rights reserved.	en_US
dc.description.sponsorship	BAP2022-07	en_US
dc.language.iso	en	en_US
dc.publisher	Istanbul University	en_US
dc.relation.ispartof	Electrica	en_US
dc.rights	info:eu-repo/semantics/closedAccess	en_US
dc.subject	Deep Learning	en_US
dc.subject	electroencephalogram (EEG)	en_US
dc.subject	machine learning	en_US
dc.subject	neuro-marketing	en_US
dc.subject	olfactory stimulus	en_US
dc.subject	Consumer behavior	en_US
dc.subject	Deep learning	en_US
dc.subject	Learning systems	en_US
dc.subject	Electroencephalogram	en_US
dc.subject	Electroencephalogram signals	en_US
dc.subject	Heat maps	en_US
dc.subject	Learning methods	en_US
dc.subject	Machine-learning	en_US
dc.subject	Neuro-marketing	en_US
dc.subject	Neuromarketing	en_US
dc.subject	Olfactory stimulus	en_US
dc.subject	Universal method	en_US
dc.subject	Electroencephalography	en_US
dc.title	Detection of Olfactory Stimulus in Electroencephalogram Signals Using Machine and Deep Learning Methods	en_US
dc.type	Article	en_US
dc.identifier.doi	10.5152/electrica.2024.23111	-
dc.identifier.scopus	2-s2.0-85185530677	en_US
dc.department	İzmir Ekonomi Üniversitesi	en_US
dc.authorscopusid	57211987353	-
dc.authorscopusid	58821521400	-
dc.authorscopusid	58821594100	-
dc.authorscopusid	35617283100	-
dc.authorscopusid	43462177900	-
dc.identifier.volume	24	en_US
dc.identifier.issue	1	en_US
dc.identifier.startpage	175	en_US
dc.identifier.endpage	182	en_US
dc.identifier.wos	WOS:001275870300016	en_US
dc.institutionauthor		-
dc.relation.publicationcategory	Article - International Refereed Journal - Institutional Faculty Member	en_US
dc.identifier.trdizinid	1253415	en_US
item.openairetype	Article	-
item.grantfulltext	reserved	-
item.cerifentitytype	Publications	-
item.openairecristype	http://purl.org/coar/resource_type/c_18cf	-
item.languageiso639-1	en	-
item.fulltext	With Fulltext	-
crisitem.author.dept	05.06. Electrical and Electronics Engineering	-
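The abstract above describes deriving relative sub-band power per EEG channel as input to the DL branch (rendered as topographic heat maps). The paper's exact pipeline is not reproduced here; the following is a minimal sketch of the relative band-power step only, assuming standard EEG band limits and Welch PSD estimation. The band edges, sampling rate, and function names are illustrative, not taken from the article.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Standard EEG band limits in Hz (illustrative; the paper may use different edges).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def relative_band_powers(eeg, fs):
    """Relative sub-band power per channel.

    eeg: array of shape (n_channels, n_samples)
    fs:  sampling rate in Hz
    Returns a dict mapping band name -> (n_channels,) array of powers,
    each normalized by the total power of that channel.
    """
    # Welch PSD estimate along the time axis for every channel at once.
    freqs, psd = welch(eeg, fs=fs, nperseg=min(eeg.shape[1], 2 * fs), axis=-1)
    total = trapezoid(psd, freqs, axis=-1)  # total power per channel
    rel = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        rel[name] = trapezoid(psd[:, mask], freqs[mask], axis=-1) / total
    return rel

# Example with synthetic 14-channel data (the study uses a 14-channel headset;
# fs = 128 Hz is an assumption for illustration).
rng = np.random.default_rng(0)
fs = 128
eeg = rng.standard_normal((14, 10 * fs))
rel = relative_band_powers(eeg, fs)
```

Each of the five resulting per-channel vectors could then be interpolated over the electrode positions to produce the topographic heat map images the abstract refers to.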
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
TR Dizin İndeksli Yayınlar Koleksiyonu / TR Dizin Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File	Size	Format	Access
5216.pdf	2.81 MB	Adobe PDF	Restricted Access
Page view(s): 64 (checked on Aug 19, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.