PUBLICATIONS

Speed-up of error backpropagation algorithm with class-selective relevance.


Kim I, Chien SI

Neurocomputing. 2002 Oct 1;48(1):1009-14. doi: http://dx.doi.org/10.1016/S0925-2312(02)00594-5

Abstract:

Selective attention learning is proposed to improve the speed of the error backpropagation algorithm for fast speaker adaptation. Class-selective relevance, which measures the importance of a hidden node in a multilayer perceptron, is employed to selectively update the weights of the network, thereby reducing the computational cost of learning.

Keywords: Class-selective relevance; Error backpropagation algorithm; Fast speaker adaptation
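The core idea can be illustrated with a minimal sketch: compute a relevance score for each hidden node and then run a backpropagation step that updates only the weights attached to the most relevant nodes. This is an illustrative assumption of the method, not the paper's implementation; in particular, measuring relevance as the rise in squared error when a node's output is zeroed, the network shape, and the `top_k` selection are all hypothetical choices made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SelectiveMLP:
    """One-hidden-layer MLP with selective weight updates (illustrative sketch)."""

    def __init__(self, n_in, n_hid, n_out):
        self.W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
        self.W2 = rng.normal(0.0, 0.5, (n_out, n_hid))

    def forward(self, x):
        h = sigmoid(self.W1 @ x)
        y = sigmoid(self.W2 @ h)
        return h, y

    def relevance(self, x, t):
        # Hypothetical relevance measure: how much the squared error grows
        # when each hidden node's output is silenced (zeroed) in turn.
        h, y = self.forward(x)
        base_err = np.sum((t - y) ** 2)
        rel = np.empty(len(h))
        for j in range(len(h)):
            h_mask = h.copy()
            h_mask[j] = 0.0
            y_mask = sigmoid(self.W2 @ h_mask)
            rel[j] = np.sum((t - y_mask) ** 2) - base_err
        return rel

    def train_step(self, x, t, lr=0.5, top_k=2):
        # Update only the weights attached to the top_k most relevant hidden
        # nodes, so the backward pass touches fewer parameters.
        sel = np.argsort(self.relevance(x, t))[-top_k:]
        h, y = self.forward(x)
        delta_out = (y - t) * y * (1.0 - y)              # output-layer error term
        delta_hid = (self.W2.T @ delta_out) * h * (1.0 - h)
        self.W2[:, sel] -= lr * np.outer(delta_out, h)[:, sel]
        self.W1[sel, :] -= lr * np.outer(delta_hid, x)[sel, :]
```

After a `train_step`, the rows of `W1` (and columns of `W2`) belonging to unselected hidden nodes are left untouched, which is where the computational saving comes from.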
