Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/2126
Full metadata record
DC Field | Value | Language
dc.contributor.author | Cura, Ozlem Karabiber | -
dc.contributor.author | Akan, Aydin | -
dc.date.accessioned | 2023-06-16T14:31:30Z | -
dc.date.available | 2023-06-16T14:31:30Z | -
dc.date.issued | 2021 | -
dc.identifier.issn | 0129-0657 | -
dc.identifier.issn | 1793-6462 | -
dc.identifier.uri | https://doi.org/10.1142/S0129065721500052 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14365/2126 | -
dc.description.abstract | Epilepsy is a neurological disease that is very common worldwide. Patients' electroencephalography (EEG) signals are frequently used for the detection of epileptic seizure segments. In this paper, a high-resolution time-frequency (TF) representation called the Synchrosqueezing Transform (SST) is used to detect epileptic seizures. Two different EEG data sets, the IKCU data set we collected and the publicly available CHB-MIT data set, are analyzed to test the performance of the proposed model in seizure detection. The SST representations of seizure and non-seizure (pre-seizure or inter-seizure) EEG segments of epilepsy patients are calculated. Various features, such as higher-order joint TF (HOJ-TF) moments and gray-level co-occurrence matrix (GLCM)-based features, are calculated from the SST representation. The EEG features are classified using single and ensemble machine learning methods such as k-Nearest Neighbor (kNN), Logistic Regression (LR), Naive Bayes (NB), Support Vector Machine (SVM), Boosted Trees (BT), and Subspace kNN (S-kNN). The proposed SST-based approach achieved 95.1% ACC, 96.87% PRE, and 95.54% REC for the IKCU data set, and 95.13% ACC, 93.37% PRE, and 90.30% REC for the CHB-MIT data set in seizure detection. Results show that the proposed SST-based method utilizing novel TF features outperforms the short-time Fourier transform (STFT)-based approach, providing over 95% accuracy in most cases, and compares well with existing methods. | en_US
dc.language.iso | en | en_US
dc.publisher | World Scientific Publ Co Pte Ltd | en_US
dc.relation.ispartof | International Journal of Neural Systems | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Synchrosqueezing transform (SST) | en_US
dc.subject | electroencephalogram (EEG) | en_US
dc.subject | epileptic seizure classification | en_US
dc.subject | machine learning | en_US
dc.subject | Time-Frequency Analysis | en_US
dc.subject | Wavelet Transform | en_US
dc.subject | Seizure Detection | en_US
dc.subject | Neural-Network | en_US
dc.subject | Methodology | en_US
dc.subject | Image | en_US
dc.subject | Joint | en_US
dc.title | Classification of Epileptic EEG Signals Using Synchrosqueezing Transform and Machine Learning | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1142/S0129065721500052 | -
dc.identifier.pmid | 33522458 | en_US
dc.identifier.scopus | 2-s2.0-85100590203 | en_US
dc.department | İzmir Ekonomi Üniversitesi | en_US
dc.authorscopusid | 57195223021 | -
dc.authorscopusid | 35617283100 | -
dc.identifier.volume | 31 | en_US
dc.identifier.issue | 5 | en_US
dc.identifier.wos | WOS:000637815000005 | en_US
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Refereed Journal - Institutional Faculty Member) | en_US
dc.identifier.scopusquality | Q1 | -
item.grantfulltext | reserved | -
item.openairetype | Article | -
item.languageiso639-1 | en | -
item.fulltext | With Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
crisitem.author.dept | 05.06. Electrical and Electronics Engineering | -
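As an illustration of the pipeline summarized in the abstract above (SST of an EEG segment, GLCM-based texture features computed from the time-frequency image, then a conventional classifier), the following is a minimal Python sketch. It assumes the ssqueezepy, scikit-image, and scikit-learn packages, which are not named in this record and stand in for whatever tools the authors used; the quantization levels, feature set, and classifier settings are illustrative, not the published configuration.

```python
# Hypothetical sketch of the SST -> GLCM features -> classifier pipeline
# described in the abstract. ssqueezepy, scikit-image (>= 0.19 naming),
# and scikit-learn are assumptions, not the authors' toolchain.
import numpy as np
from ssqueezepy import ssq_cwt                       # synchrosqueezed CWT (one SST variant)
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def sst_glcm_features(segment, levels=16):
    """Compute a few GLCM descriptors from the SST magnitude of one EEG segment."""
    Tx, *_ = ssq_cwt(segment)                         # synchrosqueezed time-frequency matrix
    mag = np.abs(Tx)
    # Quantize the TF image to `levels` gray levels for the co-occurrence matrix.
    edges = np.linspace(mag.min(), mag.max(), levels + 1)[1:-1]
    q = np.digitize(mag, edges).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.array([graycoprops(glcm, p).mean() for p in props])


def classify_segments(segments, labels):
    """Cross-validated accuracy of an RBF-SVM on the SST/GLCM features (illustrative)."""
    X = np.vstack([sst_glcm_features(s) for s in segments])
    return cross_val_score(SVC(kernel="rbf"), X, labels, cv=5).mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for seizure / non-seizure EEG segments (e.g. 4 s at 256 Hz).
    segs = [rng.standard_normal(1024) for _ in range(20)]
    labs = np.array([0, 1] * 10)
    print("CV accuracy on synthetic data:", classify_segments(segs, labs))
```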
Appears in Collections:PubMed İndeksli Yayınlar Koleksiyonu / PubMed Indexed Publications Collection
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Size | Format | Access
S012906572150026X.pdf | 3.13 MB | Adobe PDF | Restricted Access (request a copy)
SCOPUS citations: 24 (checked on Sep 18, 2024)
Web of Science citations: 24 (checked on Sep 18, 2024)
Page view(s): 98 (checked on Aug 19, 2024)
Download(s): 2 (checked on Aug 19, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.