Progressive Operational Perceptrons

Date

2017

Publisher

Elsevier

Open Access Color

Green

Open Access

Yes

Publicly Funded

No
Impulse

Top 10%

Influence

Top 10%

Popularity

Top 10%

Abstract

Feed-forward, fully connected Artificial Neural Networks (ANNs), the so-called Multi-Layer Perceptrons (MLPs), have well-known limitations and drawbacks in performance and robustness. In this study we address them with Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators, yielding a more general model of the biological neuron and ultimately a superior diversity. We modify the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, propose Progressive Operational Perceptrons (POPs) to obtain self-organized, depth-adaptive GOPs tailored to the learning problem. The most crucial property of POPs is their ability to search for the optimal operator set while training each layer individually. The final POP is therefore formed layer by layer, and we show that this ability enables POPs of minimal network depth to solve challenging learning problems that conventional ANNs cannot learn even with deeper and significantly more complex configurations. Experimental results show that POPs scale well with problem size and have the potential to achieve superior generalization performance on real benchmark problems with a significant gain.
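The abstract's core idea, a perceptron whose fixed multiply/sum/activation pipeline is replaced by selectable nodal, pool, and activation operators, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the names `gop_neuron`, `psi`, and `rho`, and the particular operator choices, are assumptions made for the example.

```python
import numpy as np

def gop_neuron(x, w, b, psi=np.multiply, rho=np.sum, f=np.tanh):
    """One GOP-style neuron: y = f(rho(psi(w, x)) + b).

    psi: nodal operator applied element-wise to each (weight, input) pair
    rho: pool operator aggregating the nodal outputs into a scalar
    f:   activation operator

    With psi=multiply, rho=sum, f=tanh this reduces to a conventional
    MLP perceptron; swapping operators yields the operator diversity
    the abstract describes.
    """
    return f(rho(psi(w, x)) + b)

x = np.array([0.5, -1.0, 2.0])
w = np.array([0.1, 0.4, -0.2])

# Classic perceptron: tanh of the weighted sum.
y_mlp = gop_neuron(x, w, b=0.0)

# One alternative operator set (hypothetical choice): a harmonic
# nodal operator with max-pooling aggregation.
y_gop = gop_neuron(x, w, b=0.0,
                   psi=lambda w, x: np.sin(w * x),
                   rho=np.max)
```

In the paper's scheme, a progressive search over such operator sets is performed per layer; the sketch above only illustrates the per-neuron generalization, not the layer-wise training.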

Keywords

Artificial neural networks, Neural networks, Multi-layer perceptrons (MLPs), Perceptron, Progressive operational perceptrons, Generalized operational perceptrons, Back-propagation, Diversity, Scalability, Optimal operators, Mathematical operators, Complex networks, Complex configuration, Generalization performance, Benchmark problems, Benchmarking, Generalized models, Statistical models, Linear systems, Mathematical analysis, Mathematical computing, Mathematical parameters, Nerve cell, Cybernetics

Fields of Science

02 engineering and technology, 0202 electrical engineering, electronic engineering, information engineering

WoS Q

Q1

Scopus Q

Q1
OpenCitations Citation Count
37

Source

Neurocomputing

Volume

224

Start Page

142

End Page

154
PlumX Metrics
Citations

CrossRef : 7

Scopus : 45

Captures

Mendeley Readers : 38

OpenAlex FWCI
3.991
