WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Permanent URI for this collection: https://hdl.handle.net/20.500.14365/5
Browsing WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection by Author "Abdallah, Habib Ben"
Now showing 1 - 2 of 2
Article | Citation - WoS: 17 | Citation - Scopus: 18
Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity (Springer London Ltd, 2021)
Kiranyaz, Serkan; Malik, Junaid; Abdallah, Habib Ben; İnce, Türker; Iosifidis, Alexandros; Gabbouj, Moncef
The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs), which are homogeneous networks built on the sole linear neuron model. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the default method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS), usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding; it also limits network heterogeneity, since the same set of operators is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the Synaptic Plasticity paradigm, which constitutes the essential learning theory in biological neurons. During training, each operator set in the library can be evaluated by its synaptic plasticity level and ranked from worst to best, and an elite ONN can then be configured using the top-ranked operator sets found at each hidden layer. Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs, and as a result the performance gap over CNNs widens further.

Article | Citation - WoS: 64 | Citation - Scopus: 76
Self-Organized Operational Neural Networks With Generative Neurons (Pergamon-Elsevier Science Ltd, 2021)
Kiranyaz, Serkan; Malik, Junaid; Abdallah, Habib Ben; İnce, Türker; Iosifidis, Alexandros; Gabbouj, Moncef
Operational Neural Networks (ONNs) have recently been proposed to address the well-known limitations and drawbacks of conventional Convolutional Neural Networks (CNNs), such as network homogeneity with the sole linear neuron model. ONNs are heterogeneous networks with a generalized neuron model. However, the operator search method in ONNs is not only computationally demanding, but network heterogeneity is also limited, since the same set of operators is then used for all neurons in each layer. Moreover, the performance of ONNs directly depends on the operator set library used, which introduces a certain risk of performance degradation, especially when the optimal operator set required for a particular task is missing from the library. To address these issues and achieve an ultimate level of heterogeneity that boosts network diversity along with computational efficiency, this study proposes Self-organized ONNs (Self-ONNs) with generative neurons that can adapt (optimize) the nodal operator of each connection during the training process. Moreover, this ability removes the need for a fixed operator set library and for a prior operator search within the library to find the best possible set of operators. We further formulate the training method to back-propagate the error through the operational layers of Self-ONNs. Experimental results over four challenging problems demonstrate the superior learning capability and computational efficiency of Self-ONNs over conventional ONNs and CNNs. © 2021 The Author(s). Published by Elsevier Ltd.
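As background for the generalized neuron model both abstracts build on, the sketch below illustrates an operational neuron: each weighted input passes through a nodal operator, the results are aggregated by a pool operator, and the aggregate goes through an activation (a CNN neuron is the special case of multiplication plus summation). This is a minimal illustrative sketch, not the authors' implementation; the entries in NODAL_OPS and POOL_OPS are assumed examples of an operator set library.

```python
import numpy as np

# Illustrative operator set library (assumed examples, not the paper's exact set).
NODAL_OPS = {
    "mul": lambda w, x: w * x,                 # linear case -> ordinary CNN neuron
    "sin": lambda w, x: np.sin(w * x),
    "exp": lambda w, x: np.exp(w * x) - 1.0,
}
POOL_OPS = {
    "sum":    lambda z: z.sum(axis=-1),
    "median": lambda z: np.median(z, axis=-1),
}

def operational_neuron(x, w, nodal="mul", pool="sum", f=np.tanh):
    """y = f( P( psi(w_k, x_k) over inputs k ) ) with configurable psi and P."""
    z = NODAL_OPS[nodal](w, x)    # nodal operator applied per connection
    return f(POOL_OPS[pool](z))   # pooled, then passed through the activation

x = np.array([0.2, -0.5, 0.7])
w = np.array([0.4, 0.9, -0.3])
print(operational_neuron(x, w, nodal="sin", pool="median"))
```

Searching this library per layer is what GIS does over repeated training sessions; the first paper instead ranks the candidate operator sets by a synaptic-plasticity score measured during training.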
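The generative neuron of the second paper can be read as replacing the fixed nodal operator above with a truncated Maclaurin-series expansion whose coefficients are learned per connection, so each connection effectively learns its own non-linearity. The sketch below is a scalar, NumPy-only illustration under that reading, not the authors' convolutional implementation; the truncation order Q and the weight initialization are arbitrary assumptions.

```python
import numpy as np

def generative_neuron(x, W, f=np.tanh):
    """y = f( sum_{q=1..Q} W[q-1] . x**q ): a learned, per-connection
    Maclaurin-style nodal operator. W has shape (Q, n_inputs)."""
    Q = W.shape[0]
    powers = np.stack([x ** q for q in range(1, Q + 1)])  # (Q, n_inputs)
    return f((W * powers).sum())  # weighted expansion terms, pooled by summation

x = np.array([0.2, -0.5, 0.7])
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(3, x.size))  # Q = 3 expansion terms (assumed)
print(generative_neuron(x, W))
```

With Q = 1 this collapses to the ordinary linear neuron, which is why a Self-ONN needs neither a fixed operator library nor a prior operator search: training the coefficients adapts the operator itself.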
