Generalized Model of Biological Neural Networks: Progressive Operational Perceptrons


Date

2017

Publisher

Institute of Electrical and Electronics Engineers Inc.

Open Access Color

Green Open Access

Publicly Funded

No
Impulse

Average

Influence

Top 10%

Popularity

Top 10%

Abstract

Traditional Artificial Neural Networks (ANNs) such as Multi-Layer Perceptrons (MLPs) and Radial Basis Functions (RBFs) were designed to simulate biological neural networks; however, they are only loosely based on biology and provide only a crude model. This in turn yields well-known limitations and drawbacks in performance and robustness. In this paper we address these by introducing a novel feed-forward ANN model, Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators to achieve a generalized model of biological neurons and, ultimately, superior diversity. We modified the conventional back-propagation (BP) algorithm to train GOPs and, furthermore, proposed Progressive Operational Perceptrons (POPs) to achieve self-organized and depth-adaptive GOPs according to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and this ability enables POPs with minimal network depth to tackle the most challenging learning problems that cannot be learned by conventional ANNs, even with a deeper and significantly more complex configuration. © 2017 IEEE.
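The abstract's core idea — replacing the perceptron's fixed multiply/sum/activate pipeline with interchangeable nodal, pool, and activation operators — can be sketched as follows. This is a minimal illustration of the GOP neuron concept, not the paper's implementation; the operator names and the specific operator library below are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of one Generalized Operational Perceptron (GOP) neuron:
#   output = activation( pool( nodal(x, w) ) + bias )
# The operator dictionaries below are illustrative, not the paper's exact set.

NODAL = {
    "mul": lambda x, w: x * w,                  # classical MLP nodal operator
    "sin": lambda x, w: np.sin(np.pi * x * w),  # sinusoidal nodal operator
    "exp": lambda x, w: np.exp(x * w) - 1.0,    # exponential nodal operator
}

POOL = {
    "sum": lambda z: z.sum(),        # classical MLP pool operator
    "max": lambda z: z.max(),
    "median": lambda z: np.median(z),
}

ACT = {
    "tanh": np.tanh,
    "sigmoid": lambda a: 1.0 / (1.0 + np.exp(-a)),
}

def gop_neuron(x, w, b, nodal="mul", pool="sum", act="tanh"):
    """Forward pass of one GOP neuron on input vector x with weights w."""
    z = NODAL[nodal](x, w)   # element-wise nodal operation
    a = POOL[pool](z) + b    # pooling plus bias
    return ACT[act](a)       # activation

# With (mul, sum, tanh) the GOP neuron reduces to a standard MLP perceptron:
x = np.array([0.5, -0.2, 0.1])
w = np.array([0.3, 0.8, -0.5])
print(np.isclose(gop_neuron(x, w, 0.0), np.tanh(x @ w)))  # True
```

A progressive (POP-style) search would then, per layer, try candidate `(nodal, pool, act)` triples, train that layer's weights with the modified BP, and keep the best-performing operator set before growing the next layer.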

Description

Brain-Mind Institute (BMI); Budapest Semester in Cognitive Science (BSCS); Intel
2017 International Joint Conference on Neural Networks, IJCNN 2017, 14–19 May 2017, 128847

Keywords

Backpropagation, Cybernetics, Mathematical operators, Radial basis function networks, Biological neural networks, Biological neuron, Complex configuration, Generalized models, Minimal networks, Multi-layer perceptrons (MLPs), Optimal operators, Radial basis functions, Neural networks

Fields of Science

0301 basic medicine, 03 medical and health sciences, 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology

WoS Q

N/A

Scopus Q

N/A
OpenCitations Citation Count
14

Source

Proceedings of the International Joint Conference on Neural Networks

Volume

2017-May

Start Page

2477

End Page

2485

PlumX Metrics

Citations

Scopus: 21

Captures

Mendeley Readers: 21

OpenAlex FWCI
0.9751