Saturday 21 January 2012

Artificial Neural Networks


The most widely used ANN model is the feedforward neural network, which provides a non-parametric estimation of statistical models for extracting nonlinear relations from the input data. Its training algorithm, back-propagation, involves two phases (Rumelhart et al., 1986).
       I. Forward Phase: The free parameters of the network are fixed, and the input signal is propagated through the network layer by layer. This phase finishes with the computation of an error signal
e_i = d_i − y_i                                                                                 (1)
where d_i is the desired response and y_i is the actual output produced by the network in response to the input x_i.
     II. Backward Phase: During this second phase, the error signal e_i is propagated through the network in the backward direction, hence the name of the algorithm. It is during this phase that adjustments are applied to the free parameters of the network so as to minimize the error e_i in a statistical sense. The back-propagation learning algorithm is simple to implement and computationally efficient. A minimal sketch of both phases is given after this list.
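
To make the two phases concrete, here is a minimal sketch in Python/NumPy of one training step for a single-hidden-layer network. The architecture (2 inputs, 3 tanh hidden units, 1 linear output), the squared-error cost and the learning rate are illustrative assumptions, not values fixed by the discussion above.

    import numpy as np

    rng = np.random.default_rng(0)
    # Assumed architecture: 2 inputs -> 3 tanh hidden units -> 1 linear output.
    W1, b1 = rng.standard_normal((3, 2)), np.zeros(3)
    W2, b2 = rng.standard_normal((1, 3)), np.zeros(1)
    lr = 0.1                                  # learning rate (assumed value)

    def train_step(x, d):
        global W1, b1, W2, b2
        # Forward phase: parameters fixed, the input is propagated to produce y.
        h = np.tanh(W1 @ x + b1)
        y = W2 @ h + b2
        e = d - y                             # error signal, Eq. (1)

        # Backward phase: the error is propagated backward and the free
        # parameters are adjusted by gradient descent on 0.5 * e**2.
        grad_y = -e                           # d(0.5*e^2)/dy
        grad_W2 = np.outer(grad_y, h)
        grad_b2 = grad_y
        grad_h = W2.T @ grad_y
        grad_pre = grad_h * (1.0 - h**2)      # derivative of tanh
        grad_W1 = np.outer(grad_pre, x)
        grad_b1 = grad_pre

        W2 -= lr * grad_W2; b2 -= lr * grad_b2
        W1 -= lr * grad_W1; b1 -= lr * grad_b1
        return float(0.5 * e @ e)             # current squared error

    # One update on a toy training pair (x_i, d_i).
    print(train_step(np.array([0.5, -1.0]), np.array([1.0])))

Repeating train_step over all training pairs for several epochs drives the error e_i down in the statistical sense described above.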
The set of training examples is split into two parts:
• Estimation subset, used for training the model
• Validation subset, used for evaluating the model's performance
In general, the network is finally tuned using the entire set of training examples and then tested on test data (Haykin, 1999).
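
As a rough illustration of that procedure, the sketch below shuffles a training set, splits it into estimation and validation subsets, and outlines where final tuning on the full training set and the single evaluation on test data would go. The 80/20 split ratio, the toy arrays, and the train/mse routines are hypothetical placeholders, not part of the procedure described above.

    import numpy as np

    def split_estimation_validation(X, y, val_fraction=0.2, seed=0):
        # Shuffle the training examples and split them into an
        # estimation subset and a validation subset.
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(X))
        n_val = int(len(X) * val_fraction)
        val, est = idx[:n_val], idx[n_val:]
        return (X[est], y[est]), (X[val], y[val])

    # Toy data standing in for a real training set and a separate test set.
    X_train, y_train = np.random.rand(100, 2), np.random.rand(100, 1)
    X_test,  y_test  = np.random.rand(20, 2),  np.random.rand(20, 1)

    (X_est, y_est), (X_val, y_val) = split_estimation_validation(X_train, y_train)

    # train(...) and mse(...) are hypothetical stand-ins for whatever
    # network training and evaluation routines are in use:
    # model = train(X_est, y_est)                    # fit on the estimation subset
    # val_error = mse(model, X_val, y_val)           # model selection on validation
    # final_model = train(X_train, y_train)          # final tuning on all training data
    # test_error = mse(final_model, X_test, y_test)  # report on unseen test data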