A Performance Evaluation of Variations

to the Standard Back-propagation Algorithm

Several techniques have recently been proposed to improve the generalization capabilities of back-propagation neural networks (BPNNs). Among them, weight decay, cross-validation, and weight smoothing are probably the simplest and the most frequently used. This paper presents an empirical performance comparison of these approaches on two real-world databases. In addition, to further improve generalization, a combination of all of the above approaches has been considered and tested. Experimental results show that coupling all three approaches together significantly outperforms each individual approach.

George Bebis
November 22, 1995 at 5:22 PM
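Of the techniques compared above, weight decay is the most compact to state: each gradient step also shrinks the weights toward zero by a penalty term. The sketch below is an illustration only, not the paper's implementation; the learning rate `eta` and decay coefficient `lam` are assumed placeholder values.

```python
import numpy as np

def weight_decay_step(w, grad, eta=0.1, lam=0.01):
    """One gradient-descent step with L2 weight decay:
    w <- w - eta * (dE/dw + lam * w).
    The lam * w term penalizes large weights, which is the
    mechanism weight decay uses to improve generalization."""
    return w - eta * (grad + lam * w)

# Toy example: a two-weight vector and its error gradient.
w = np.array([1.0, -2.0])
grad = np.array([0.5, 0.5])
w_new = weight_decay_step(w, grad)
```

Note that with `lam = 0` this reduces to plain back-propagation's gradient step, so the decay term is the only difference between the two update rules.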