Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/5870
Title: Decoding Olfactory EEG Signals Using Multi-Domain Features and Machine Learning
Authors: Akbugday, Sude Pehlivan
Akbugday, Burak
Yeganli, Faezeh
Akan, Aydin
Sadikzade, Riza
Keywords: Olfactory EEG
EEG
Machine Learning
Emotion Recognition
Arousal-Valence Plane
Neuromarketing
Publisher: IEEE
Series/Report no.: Medical Technologies National Conference
Abstract: Accurate detection of human emotion is an important topic in affective computing. Even with the rise of artificial intelligence in the marketing industry, the available assessment tools remain subjective and heavily dependent on sample size and demographics. This study explores neural responses to olfactory stimuli by analyzing EEG data collected from 57 participants exposed to a perfume scent, in correlation with self-reported survey results. The electroencephalogram (EEG) signals were processed to extract time-domain, spectral-domain, and nonlinear features, which were subsequently classified using various machine learning algorithms. The classification outcomes were mapped onto a two-dimensional pleasure-arousal plane, with the Medium Gaussian support vector machine (SVM) achieving the highest performance, including 99.8% validation accuracy and 100% test accuracy. These results highlight the significant potential of EEG-based approaches in decoding the neural underpinnings of sensory experiences, with implications for applications in neuromarketing and therapeutic contexts.
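The pipeline outlined in the abstract (multi-domain feature extraction followed by a Gaussian-kernel SVM) can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the specific features, frequency bands, sampling rate, and the synthetic stand-in data are all assumptions, and MATLAB's "Medium Gaussian SVM" is approximated here by scikit-learn's RBF-kernel `SVC`.

```python
# Hedged sketch of an EEG emotion-classification pipeline: per-epoch
# time-domain, spectral-domain, and nonlinear features fed to an RBF SVM.
# All concrete choices (features, bands, FS) are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

FS = 256  # assumed sampling rate (Hz)

def extract_features(epoch):
    """Six example features spanning the three domains named in the abstract."""
    # Time domain: mean, variance, Hjorth mobility
    d1 = np.diff(epoch)
    mobility = np.sqrt(np.var(d1) / np.var(epoch))
    # Spectral domain: alpha (8-13 Hz) and beta (13-30 Hz) band power (Welch)
    f, pxx = welch(epoch, fs=FS, nperseg=FS)
    alpha = pxx[(f >= 8) & (f < 13)].sum()
    beta = pxx[(f >= 13) & (f < 30)].sum()
    # Nonlinear: spectral entropy of the normalized periodogram
    p = pxx / pxx.sum()
    spec_ent = -np.sum(p * np.log(p + 1e-12))
    return [epoch.mean(), epoch.var(), mobility, alpha, beta, spec_ent]

# Synthetic stand-in epochs: class 1 carries an added 10 Hz (alpha) rhythm
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS
X, y = [], []
for label in (0, 1):
    for _ in range(60):
        sig = rng.normal(size=t.size) + label * 2.0 * np.sin(2 * np.pi * 10 * t)
        X.append(extract_features(sig))
        y.append(label)
X, y = np.array(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

In a real study the labels would come from self-reported valence/arousal ratings rather than synthetic classes, and feature selection and cross-validation would determine the reported accuracies.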
URI: https://doi.org/10.1109/TIPTEKNO63488.2024.10755366
ISBN: 9798331529819
9798331529826
ISSN: 2687-7775
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.