Operational neural networks


Date

2020

Publisher

Springer London Ltd

Open Access Color

HYBRID

Green Open Access

Yes

Publicly Funded

No
Impulse
Top 1%
Influence
Top 10%
Popularity
Top 1%

Abstract

Feed-forward, fully connected artificial neural networks, or the so-called multi-layer perceptrons, are well-known universal approximators. However, their learning performance varies significantly depending on the function or the solution space that they attempt to approximate. This is mainly because of their homogeneous configuration based solely on the linear neuron model. Therefore, while they learn very well those problems with a monotonic, relatively simple and linearly separable solution space, they may entirely fail to do so when the solution space is highly nonlinear and complex. Sharing the same linear neuron model with two additional constraints (local connections and weight sharing), this is also true for conventional convolutional neural networks (CNNs); it is, therefore, not surprising that in many challenging problems only deep CNNs with massive complexity and depth can achieve the required diversity and learning performance. In order to address this drawback and to accomplish a more generalized model over the convolutional neurons, this study proposes a novel network model, called operational neural networks (ONNs), which can be heterogeneous and encapsulate neurons with any set of operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. Finally, the training method to back-propagate the error through the operational layers of ONNs is formulated. Experimental results over highly challenging problems demonstrate the superior learning capabilities of ONNs even with few neurons and hidden layers.
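To make the abstract's idea concrete, the sketch below illustrates a single "operational" neuron in the spirit described: the fixed multiply-and-sum of a linear neuron is replaced by a configurable nodal operator applied element-wise and a pool operator that reduces the result, with the linear neuron recovered as the special case (multiply, sum). The operator names and this minimal formulation are illustrative assumptions for exposition, not the paper's exact operator library.

```python
import numpy as np

# A minimal sketch of one operational neuron, assuming the formulation
#   y = f( P_i( psi(w_i, x_i) ) + b )
# where psi is the nodal operator, P the pool operator, f the activation.
# The classical linear neuron is the special case psi = multiply, P = sum.

def operational_neuron(x, w, b, nodal, pool, act):
    """Apply one operational neuron to the input vector x."""
    z = nodal(w, x)          # element-wise nodal operation
    pooled = pool(z)         # reduce to a scalar
    return act(pooled + b)   # bias and activation

# Illustrative operator sets (hypothetical choices for this sketch)
nodal_ops = {
    "mult": lambda w, x: w * x,
    "sine": lambda w, x: np.sin(w * x),
}
pool_ops = {"sum": np.sum, "median": np.median}

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.3, 0.7, -0.2])

# Linear special case: identical to tanh(w . x + b)
linear_out = operational_neuron(x, w, 0.1, nodal_ops["mult"],
                                pool_ops["sum"], np.tanh)
# A nonlinear variant drawn from the same operator sets
sine_out = operational_neuron(x, w, 0.1, nodal_ops["sine"],
                              pool_ops["median"], np.tanh)
print(linear_out, sine_out)
```

In a heterogeneous ONN, each neuron (or layer) could draw a different (nodal, pool, activation) triple from such operator sets, which is what lets a compact network match a highly nonlinear solution space.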

Keywords

Operational neural network, Heterogeneous and nonlinear neural networks, Convolutional neural networks, Neuronal Diversity, FOS: Computer and information sciences, Computer Science - Machine Learning, Artificial intelligence, Computer Science - Artificial Intelligence, Computer Vision and Pattern Recognition (cs.CV), Multilayer neural networks, Complex networks, Nonlinear neural networks, Computer Science - Computer Vision and Pattern Recognition, 610, Multi modal function, Universal approximators, 530, 113, Machine Learning (cs.LG), Learning capabilities, Generalized models, Neurons, Software engineering, Feedforward neural networks, Learning systems, Learning performance, 113 Computer and information sciences, Convolution, Artificial Intelligence (cs.AI), Linearly separable, Convolutional neural networks, Personnel training, Multi-layer perceptrons

Fields of Science

02 engineering and technology, 0202 electrical engineering, electronic engineering, information engineering

WoS Q

Q2

Scopus Q

Q1
OpenCitations Citation Count
67

Source

Neural Computing & Applications

Volume

32

Issue

11

Start Page

6645

End Page

6668
PlumX Metrics
Citations

CrossRef : 36

Scopus : 89

Captures

Mendeley Readers : 173

OpenAlex FWCI
0.2743
