Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity
Date
2021
Authors
Journal Title
Journal ISSN
Volume Title
Publisher
Springer London Ltd
Open Access Color
HYBRID
Green Open Access
Yes
OpenAIRE Downloads
OpenAIRE Views
Publicly Funded
No
Abstract
The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs), which are homogeneous and rely solely on a linear neuron model. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the default method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS), usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding, but it also limits network heterogeneity, since the same operator set is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the Synaptic Plasticity paradigm, which poses the essential learning theory in biological neurons. During training, each operator set in the library is evaluated by its synaptic plasticity level and ranked from worst to best, and an elite ONN is then configured using the top-ranked operator sets found for each hidden layer. Experimental results on highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve superior learning performance compared to GIS-based ONNs and, as a result, the performance gap over CNNs widens further.
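The selection step described in the abstract can be illustrated with a minimal sketch: per-layer synaptic-plasticity scores are assumed to have been gathered for each candidate operator set during training, and the elite configuration simply keeps the top-ranked set for each hidden layer. All names, operator triples, and score values below are hypothetical and for illustration only; they are not the paper's actual library or measurements.

```python
# Hypothetical candidate operator library: (nodal, pool, activation) triples.
operator_library = [
    ("multiplication", "sum", "tanh"),
    ("exponential", "sum", "tanh"),
    ("sinusoid", "median", "tanh"),
]

# Assumed per-layer plasticity scores collected during training:
# {hidden-layer index -> {operator-set index -> score}}; higher is better.
plasticity_scores = {
    0: {0: 0.41, 1: 0.78, 2: 0.55},
    1: {0: 0.62, 1: 0.30, 2: 0.71},
}

def elite_configuration(scores, library):
    """Rank operator sets by plasticity score per hidden layer and
    keep the top-ranked set for each layer (the 'elite' configuration)."""
    config = {}
    for layer, per_set in scores.items():
        best_index = max(per_set, key=per_set.get)
        config[layer] = library[best_index]
    return config

print(elite_configuration(plasticity_scores, operator_library))
# → {0: ('exponential', 'sum', 'tanh'), 1: ('sinusoid', 'median', 'tanh')}
```

This only captures the final ranking-and-selection step; the scoring of each operator set's plasticity during training is the substance of the paper and is not reproduced here.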
Description
Keywords
Operational neural networks, Convolutional neural networks, Synaptic plasticity, Representations, Machine Learning (cs.LG), Machine Learning (stat.ML), Neural and Evolutionary Computing (cs.NE), Iterative methods, Biological neuron, Complex networks, Multi-modal function, Heterogeneous networks, Generalized neuron, Network heterogeneity, Mathematical operators, Training sessions, Neurons, Learning systems, Learning performance
Fields of Science
0301 basic medicine, 02 engineering and technology, 03 medical and health sciences, 0202 electrical engineering, electronic engineering, information engineering
Citation
WoS Q
Q2
Scopus Q
Q1

OpenCitations Citation Count
17
Source
Neural Computing & Applications
Volume
33
Issue
13
Start Page
7997
End Page
8015
PlumX Metrics
Citations
CrossRef : 7
Scopus : 18
Captures
Mendeley Readers : 19