Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.14365/1346
Full metadata record
DC Field | Value | Language
dc.contributor.author | Kiranyaz, Serkan | -
dc.contributor.author | İnce, Türker | -
dc.contributor.author | Iosifidis, Alexandros | -
dc.contributor.author | Gabbouj, Moncef | -
dc.date.accessioned | 2023-06-16T14:11:18Z | -
dc.date.available | 2023-06-16T14:11:18Z | -
dc.date.issued | 2017 | -
dc.identifier.issn | 0925-2312 | -
dc.identifier.issn | 1872-8286 | -
dc.identifier.uri | https://doi.org/10.1016/j.neucom.2016.10.044 | -
dc.identifier.uri | https://hdl.handle.net/20.500.14365/1346 | -
dc.description.abstract | There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully connected Artificial Neural Networks (ANNs), also known as Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a more general model of the biological neuron and ultimately superior diversity. We modified the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, proposed Progressive Operational Perceptrons (POPs) to achieve self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and we show that this ability enables POPs of minimal network depth to tackle challenging learning problems that conventional ANNs cannot learn even with deeper and significantly more complex configurations. Experimental results show that POPs scale up well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.relation.ispartof | Neurocomputing | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Artificial neural networks | en_US
dc.subject | Multi-layer perceptrons | en_US
dc.subject | Progressive operational perceptrons | en_US
dc.subject | Diversity | en_US
dc.subject | Scalability | en_US
dc.subject | Network | en_US
dc.title | Progressive Operational Perceptrons | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.neucom.2016.10.044 | -
dc.identifier.scopus | 2-s2.0-85006339727 | en_US
dc.department | İzmir Ekonomi Üniversitesi | en_US
dc.authorid | Gabbouj, Moncef/0000-0002-9788-2323 | -
dc.authorid | Iosifidis, Alexandros/0000-0003-4807-1345 | -
dc.authorid | İnce, Türker/0000-0002-8495-8958 | -
dc.authorid | Kiranyaz, Serkan/0000-0003-1551-3397 | -
dc.authorwosid | Gabbouj, Moncef/G-4293-2014 | -
dc.authorwosid | Kiranyaz, Serkan/AAK-1416-2021 | -
dc.authorwosid | Iosifidis, Alexandros/G-2433-2013 | -
dc.authorscopusid | 7801632948 | -
dc.authorscopusid | 56259806600 | -
dc.authorscopusid | 36720841400 | -
dc.authorscopusid | 7005332419 | -
dc.identifier.volume | 224 | en_US
dc.identifier.startpage | 142 | en_US
dc.identifier.endpage | 154 | en_US
dc.identifier.wos | WOS:000392355600014 | en_US
dc.relation.publicationcategory | Article - International Refereed Journal - Institutional Faculty Member | en_US
dc.identifier.scopusquality | Q1 | -
item.openairetype | Article | -
item.grantfulltext | reserved | -
item.cerifentitytype | Publications | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.languageiso639-1 | en | -
item.fulltext | With Fulltext | -
crisitem.author.dept | 05.06. Electrical and Electronics Engineering | -
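The abstract above describes GOP neurons as having distinct (non-)linear operators in place of the MLP's fixed multiply-sum-activate scheme. A minimal sketch of that idea is below; the operator names and the `gop_neuron` helper are illustrative assumptions for exposition, not code from the paper:

```python
import math

# Candidate operator pools a GOP layer could choose from (illustrative subset).
NODAL = {
    "multiplication": lambda w, x: w * x,           # the classical MLP nodal operator
    "sinusoid":       lambda w, x: math.sin(w * x),
}
POOL = {
    "summation": sum,                               # the classical MLP pool operator
    "maximum":   max,
}
ACTIVATION = {
    "tanh":   math.tanh,
    "linear": lambda z: z,
}

def gop_neuron(weights, inputs, nodal, pool, act, bias=0.0):
    """One generalized operational perceptron neuron:
    y = act( pool( nodal(w_i, x_i) over inputs i ) + bias )."""
    contributions = [NODAL[nodal](w, x) for w, x in zip(weights, inputs)]
    return ACTIVATION[act](POOL[pool](contributions) + bias)

# With multiplication/summation/tanh, the neuron reduces to a standard MLP unit.
classic = gop_neuron([0.5, -0.25], [1.0, 2.0], "multiplication", "summation", "tanh")
# Any other operator triplet yields one of the alternative neuron models
# whose per-layer selection the POP training procedure searches over.
custom = gop_neuron([0.5, -0.25], [1.0, 2.0], "sinusoid", "maximum", "tanh")
```

Here each layer would share one (nodal, pool, activation) triplet, and the layer-by-layer POP construction the abstract mentions corresponds to picking the best-performing triplet for a layer before growing the network deeper.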
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Files in This Item:
File | Size | Format
383.pdf (Restricted Access) | 2.02 MB | Adobe PDF
SCOPUS™ Citations: 36 (checked on Sep 18, 2024)
Web of Science™ Citations: 33 (checked on Sep 18, 2024)
Page view(s): 66 (checked on Aug 19, 2024)
Download(s): 2 (checked on Aug 19, 2024)

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.