Progressive Operational Perceptrons

dc.contributor.author Kiranyaz, Serkan
dc.contributor.author İnce, Türker
dc.contributor.author Iosifidis, Alexandros
dc.contributor.author Gabbouj, Moncef
dc.date.accessioned 2023-06-16T14:11:18Z
dc.date.available 2023-06-16T14:11:18Z
dc.date.issued 2017
dc.description.abstract There are well-known limitations and drawbacks in the performance and robustness of feed-forward, fully connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs). In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a generalized model of biological neurons and ultimately a superior diversity. We modified the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, proposed Progressive Operational Perceptrons (POPs) to achieve self-organized and depth-adaptive GOPs according to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and in this paper we show that this ability enables POPs with minimal network depth to tackle the most challenging learning problems, which cannot be learned by conventional ANNs even with a deeper and significantly more complex configuration. Experimental results show that POPs scale up very well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain. en_US
dc.identifier.doi 10.1016/j.neucom.2016.10.044
dc.identifier.issn 0925-2312
dc.identifier.issn 1872-8286
dc.identifier.scopus 2-s2.0-85006339727
dc.identifier.uri https://doi.org/10.1016/j.neucom.2016.10.044
dc.identifier.uri https://hdl.handle.net/20.500.14365/1346
dc.language.iso en en_US
dc.publisher Elsevier en_US
dc.relation.ispartof Neurocomputing en_US
dc.rights info:eu-repo/semantics/closedAccess en_US
dc.subject Artificial neural networks en_US
dc.subject Multi-layer perceptrons en_US
dc.subject Progressive operational perceptrons en_US
dc.subject Diversity en_US
dc.subject Scalability en_US
dc.subject Network en_US
dc.title Progressive Operational Perceptrons en_US
dc.type Article en_US
dspace.entity.type Publication
gdc.author.id Gabbouj, Moncef/0000-0002-9788-2323
gdc.author.id Iosifidis, Alexandros/0000-0003-4807-1345
gdc.author.id İnce, Türker/0000-0002-8495-8958
gdc.author.id Kiranyaz, Serkan/0000-0003-1551-3397
gdc.author.scopusid 7801632948
gdc.author.scopusid 56259806600
gdc.author.scopusid 36720841400
gdc.author.scopusid 7005332419
gdc.author.wosid Gabbouj, Moncef/G-4293-2014
gdc.author.wosid Kiranyaz, Serkan/AAK-1416-2021
gdc.author.wosid Iosifidis, Alexandros/G-2433-2013
gdc.bip.impulseclass C4
gdc.bip.influenceclass C4
gdc.bip.popularityclass C4
gdc.coar.access metadata only access
gdc.coar.type text::journal::journal article
gdc.collaboration.industrial false
gdc.description.department İzmir Ekonomi Üniversitesi en_US
gdc.description.departmenttemp [Kiranyaz, Serkan] Qatar Univ, Dept Elect Engn, Doha, Qatar; [İnce, Türker] Izmir Univ Econ, Dept Elect Engn, Izmir, Turkey; [Iosifidis, Alexandros; Gabbouj, Moncef] Tampere Univ Technol, Dept Signal Proc, Tampere, Finland en_US
gdc.description.endpage 154 en_US
gdc.description.publicationcategory Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı en_US
gdc.description.scopusquality Q1
gdc.description.startpage 142 en_US
gdc.description.volume 224 en_US
gdc.description.wosquality Q1
gdc.identifier.openalex W2546712344
gdc.identifier.wos WOS:000392355600014
gdc.index.type WoS
gdc.index.type Scopus
gdc.oaire.diamondjournal false
gdc.oaire.impulse 10.0
gdc.oaire.influence 5.866501E-9
gdc.oaire.isgreen true
gdc.oaire.keywords Optimal operators
gdc.oaire.keywords Complex networks
gdc.oaire.keywords Multi-layer perceptrons (MLPs)
gdc.oaire.keywords Complex configuration
gdc.oaire.keywords Backpropagation
gdc.oaire.keywords Article
gdc.oaire.keywords mathematical analysis
gdc.oaire.keywords back propagation
gdc.oaire.keywords learning disorder
gdc.oaire.keywords perceptron
gdc.oaire.keywords Generalized models
gdc.oaire.keywords linear system
gdc.oaire.keywords Mathematical operators
gdc.oaire.keywords mathematical computing
gdc.oaire.keywords Bench-mark problems
gdc.oaire.keywords Generalization performance
gdc.oaire.keywords Diversity
gdc.oaire.keywords generalized operational perceptron
gdc.oaire.keywords statistical model
gdc.oaire.keywords Scalability
gdc.oaire.keywords mathematical parameters
gdc.oaire.keywords Benchmarking
gdc.oaire.keywords priority journal
gdc.oaire.keywords progressive operational perceptron
gdc.oaire.keywords nerve cell
gdc.oaire.keywords Cybernetics
gdc.oaire.keywords Neural networks
gdc.oaire.keywords artificial neural network
gdc.oaire.keywords Multi-layer perceptrons
gdc.oaire.popularity 2.966175E-8
gdc.oaire.publicfunded false
gdc.oaire.sciencefields 02 engineering and technology
gdc.oaire.sciencefields 0202 electrical engineering, electronic engineering, information engineering
gdc.openalex.collaboration International
gdc.openalex.fwci 3.991
gdc.openalex.normalizedpercentile 0.95
gdc.openalex.toppercent TOP 10%
gdc.opencitations.count 37
gdc.plumx.crossrefcites 7
gdc.plumx.mendeley 38
gdc.plumx.scopuscites 45
gdc.scopus.citedcount 45
gdc.virtual.author İnce, Türker
gdc.wos.citedcount 41
relation.isAuthorOfPublication 620fe4b0-bfe7-4e8f-8157-31e93f36a89b
relation.isAuthorOfPublication.latestForDiscovery 620fe4b0-bfe7-4e8f-8157-31e93f36a89b
relation.isOrgUnitOfPublication b02722f0-7082-4d8a-8189-31f0230f0e2f
relation.isOrgUnitOfPublication 26a7372c-1a5e-42d9-90b6-a3f7d14cad44
relation.isOrgUnitOfPublication e9e77e3e-bc94-40a7-9b24-b807b2cd0319
relation.isOrgUnitOfPublication.latestForDiscovery b02722f0-7082-4d8a-8189-31f0230f0e2f

Files

Original bundle

Name: 383.pdf
Size: 1.98 MB
Format: Adobe Portable Document Format