Evaluation of Global and Local Training Techniques Over Feed-Forward Neural Network Architecture Spaces for Computer-Aided Medical Diagnosis


Date

2010


Publisher

Pergamon-Elsevier Science Ltd

Green Open Access

No

Publicly Funded

No
Impulse

Top 10%

Influence

Top 10%

Popularity

Top 10%


Abstract

In this paper, we investigate the performance of global versus local techniques for training neural network classifiers to solve medical diagnosis problems. The methodology involves a systematic and exhaustive evaluation of classifier performance over a neural network architecture space, and with respect to training depth, for a given problem. The architecture space is defined over feed-forward, fully connected artificial neural networks (ANNs), which have been widely used in computer-aided decision support systems in the medical domain, and two popular training methods are explored for them: conventional backpropagation (BP) and particle swarm optimization (PSO). The two techniques are compared in terms of classification performance on three medical diagnosis problems (breast cancer, heart disease, and diabetes) from the Proben1 benchmark dataset, and computational and architectural analyses are performed for an extensive assessment. The results clearly demonstrate that the two algorithms cannot be meaningfully compared over a single network with a fixed set of training parameters, as most earlier work in this field has done, because training and test classification performance vary significantly and depend directly on the network architecture, the training depth and method used, and the available dataset. We therefore show that an extensive evaluation method such as the one proposed in this paper is needed to obtain a reliable and detailed performance assessment. From it we conclude that the PSO algorithm usually has better generalization ability across the architecture space, whereas BP can occasionally provide better training and/or test classification performance for some network configurations.
Furthermore, PSO, as a global training algorithm, is generally capable of achieving minimum test classification errors regardless of training depth, i.e. shallow or deep, and its average classification performance varies less with respect to network architecture. In terms of computational complexity, BP is in general superior to PSO over the entire architecture space used. (C) 2010 Elsevier Ltd. All rights reserved.
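To illustrate the global training approach the abstract describes, the following is a minimal sketch of PSO applied to the flattened weight vector of a small fully connected feed-forward network. The data, network size, and swarm parameters are all hypothetical stand-ins (the paper uses the Proben1 benchmarks and a whole architecture space); this is only the standard global-best PSO update applied to a mean-squared-error fitness over the weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy binary-classification data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)), rng.normal(1.0, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

# One-hidden-layer, fully connected network: 2 -> 4 -> 1.
N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # flattened parameter count

def unpack(theta):
    """Split a flat parameter vector into the network's weights and biases."""
    i = 0
    W1 = theta[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = theta[i:i + N_HID]; i += N_HID
    W2 = theta[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = theta[i:i + N_OUT]
    return W1, b1, W2, b2

def forward(theta, X):
    W1, b1, W2, b2 = unpack(theta)
    h = np.tanh(X @ W1 + b1)                          # hidden-layer activations
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output

def mse(theta):
    """Fitness for one particle: mean squared error over the training set."""
    return float(np.mean((forward(theta, X).ravel() - y) ** 2))

# Global-best PSO over the weight space (inertia and acceleration
# coefficients are illustrative choices, not the paper's settings).
N_PART, ITERS = 30, 200
W_INERTIA, C1, C2 = 0.7, 1.5, 1.5

pos = rng.uniform(-1, 1, (N_PART, DIM))
vel = np.zeros((N_PART, DIM))
pbest = pos.copy()
pbest_err = np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_err)].copy()
gbest_err = pbest_err.min()

for _ in range(ITERS):
    r1, r2 = rng.random((N_PART, DIM)), rng.random((N_PART, DIM))
    # Velocity update: inertia + cognitive (pbest) + social (gbest) pulls.
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
    pos = pos + vel
    errs = np.array([mse(p) for p in pos])
    improved = errs < pbest_err
    pbest[improved], pbest_err[improved] = pos[improved], errs[improved]
    if pbest_err.min() < gbest_err:
        gbest_err = pbest_err.min()
        gbest = pbest[np.argmin(pbest_err)].copy()

acc = np.mean((forward(gbest, X).ravel() > 0.5) == y)
print(f"final MSE {gbest_err:.4f}, train accuracy {acc:.2f}")
```

Because PSO only needs fitness evaluations, not gradients, the same loop works for any architecture in the space by changing how `theta` is unpacked; BP, by contrast, is tied to the network's differentiable structure, which is one reason the two are hard to compare on a single fixed configuration.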

Keywords

Artificial neural networks, Backpropagation, Particle swarm optimization, Decision-Making

Fields of Science

0502 economics and business, 05 social sciences, 0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology

WoS Q

Q1

Scopus Q

Q1
OpenCitations Citation Count
38

Source

Expert Systems With Applications

Volume

37

Issue

12

Start Page

8450

End Page

8461
PlumX Metrics
Citations

CrossRef : 15

Scopus : 39

Captures

Mendeley Readers : 49

OpenAlex FWCI
4.2226

Sustainable Development Goals

3

GOOD HEALTH AND WELL-BEING