Artificial Neural Networks

An ANN is a computational structure inspired by the study of biological neural processing. It has some capability for simulating the human brain and for learning or recognizing a pattern based on partial (incomplete) information. Although there are many types of ANN topologies (ranging from a relatively simple perceptron to a very complex recurrent network), by far the most popular architecture is a multilayer feedforward ANN trained with the back-propagation method. Because it has been proved that a single hidden layer is sufficient to approximate any continuous function [10], most ANNs used in medical image processing are three-layer networks. In this chapter we discuss only three-layer feedforward ANNs, but the conclusions should also be applicable to other types of ANNs. Once the topology of an ANN (the number of neurons in each layer) is decided, the ANN must be trained in either supervised or unsupervised mode using a set of training samples. In supervised training the identity of each training sample is known, whereas in unsupervised training the identity of each sample must be decided by the ANN itself. In a typical three-layer feed-forward ANN, as shown in Fig. 1, there is one input layer, in which the neurons are represented by a set of features, one hidden layer, and one output layer, where the decision making is performed. The relationship between the input neurons (x_i, i = 1, 2, ..., n) and the output neurons (y_k, k = 1, 2, ..., N), which are connected by the hidden neurons (h_j, j = 1, 2, ..., m), is determined by the equation

\[
y_k = g\left( \sum_{j=1}^{m} w_{kj}\, g\left( \sum_{i=1}^{n} w_{ji} x_i + \theta_{\text{in}} \right) + \theta_{\text{hid}} \right),
\]

where g(z) = 1/(1 + e^(-z)), w_kj is the weight from the jth hidden neuron to the kth output neuron, w_ji is the weight from the ith input neuron to the jth hidden neuron, θ_in is a bias neuron in the input layer, and θ_hid is another bias neuron in the hidden layer. A nonlinear sigmoid function, g(z), is used as the activation function for both the hidden and the output neurons.
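For concreteness, the forward pass defined by this equation can be sketched in a few lines of NumPy. The code below is only an illustration, not part of the chapter; the layer sizes (n = 4, m = 3, N = 2) and the random weights are arbitrary stand-ins for a trained network, and the bias vectors hold the weights attached to the bias neurons described above.

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z)), the activation of the hidden and output neurons
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_hid, theta_in, w_out, theta_hid):
    """Forward pass of a three-layer feed-forward ANN.

    x         : input features, shape (n,)
    w_hid     : input-to-hidden weights w_ji, shape (m, n)
    theta_in  : weights from the input-layer bias neuron to each hidden neuron, shape (m,)
    w_out     : hidden-to-output weights w_kj, shape (N, m)
    theta_hid : weights from the hidden-layer bias neuron to each output neuron, shape (N,)
    """
    h = sigmoid(w_hid @ x + theta_in)    # hidden-neuron activations h_j
    y = sigmoid(w_out @ h + theta_hid)   # output-neuron activations y_k
    return y

# Example with arbitrary sizes and random (untrained) weights.
rng = np.random.default_rng(0)
n, m, N = 4, 3, 2
x = rng.random(n)
y = forward(x,
            rng.standard_normal((m, n)), rng.standard_normal(m),
            rng.standard_normal((N, m)), rng.standard_normal(N))
print(y)  # N values in (0, 1), one per output neuron
```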

FIGURE 1 A three-layer feed-forward ANN.

One way of using the training and testing databases is to stop training by monitoring the ANN's performance on the testing data set. Training should continue for as long as the test performance improves (or the test error decreases), and it should stop as soon as the test performance starts to decrease.
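A minimal sketch of this stopping rule is shown below. The train_one_epoch and test_error callbacks are hypothetical placeholders for one back-propagation pass and a test-set evaluation; they are not functions defined in the chapter.

```python
def train_with_early_stopping(weights, train_one_epoch, test_error, max_epochs=1000):
    """Stop training when performance on the independent test set stops improving.

    train_one_epoch(weights) -> new weights after one pass over the training set
    test_error(weights)      -> scalar error on the testing data set
    (Both callbacks are assumed here; they are not defined in the chapter.)
    """
    best_weights, best_error = weights, test_error(weights)
    for _ in range(max_epochs):
        weights = train_one_epoch(weights)
        err = test_error(weights)
        if err >= best_error:
            break                                 # test error stopped decreasing: stop
        best_weights, best_error = weights, err   # still improving: keep training
    return best_weights, best_error
```

Returning the best weights seen so far, rather than the final ones, discards the last epoch, which is the one that triggered the stop.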

When using an ANN, another important decision is the number of neurons in the hidden layer. The network requires enough hidden neurons to separate the different classes well (e.g., true-positive and false-positive abnormalities in medical images). However, adding more hidden neurons is equivalent to adding more features: as the number of hidden neurons increases, the ANN begins to separate the training data too well (overfitting), and its performance in independent testing decreases because the network loses its ability to generalize.
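This trade-off can be made visible by training networks with an increasing number of hidden neurons and comparing training and test accuracy. The sketch below uses scikit-learn's MLPClassifier with logistic (sigmoid) activations and synthetic two-class data purely as a stand-in for the image-derived features discussed in the chapter; it is not the chapter's own implementation.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic two-class data standing in for image-derived features
# (e.g., true-positive vs. false-positive detections).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for m in (2, 5, 10, 20, 50):
    # One hidden layer of m logistic (sigmoid) neurons, as in the text.
    net = MLPClassifier(hidden_layer_sizes=(m,), activation="logistic",
                        max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    print(f"m={m:3d}  train acc={net.score(X_tr, y_tr):.3f}  "
          f"test acc={net.score(X_te, y_te):.3f}")
```

Typically, training accuracy keeps rising as the hidden layer grows, while test accuracy eventually plateaus or drops, which is the loss of generalization described above.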

