
Parallel and Separable Recursive Levenberg-Marquardt Training Algorithm

Asirvadam, Vijanth Sagayan and McLoone, Sean and Irwin, George (2002) Parallel and Separable Recursive Levenberg-Marquardt Training Algorithm. In: Neural Networks for Signal Processing XII. IEEE Press, Piscataway, New Jersey, pp. 129-138. ISBN 0-7803-7616-1

Full text not available from this repository.

Official URL: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumb...

Abstract

A novel decomposed recursive Levenberg-Marquardt (RLM) algorithm is derived for the training of feedforward neural networks. By neglecting interneuron weight correlations, the recently proposed RLM training algorithm can be decomposed at the neuron level, enabling weights to be updated in an efficient parallel manner. A separable least squares implementation of decomposed RLM is also introduced. Experimental results for two nonlinear time series problems demonstrate the superiority of the new training algorithms.
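The full text is not available here, but the idea of neuron-level decomposition can be illustrated with a minimal sketch: if interneuron weight correlations are neglected, the global covariance matrix becomes block-diagonal, so each neuron's weight block can be updated independently (and hence in parallel) with a small recursive Gauss-Newton/RLS-style step. The network architecture, forgetting factor, target function, and update details below are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): learn d = sin(x)
# with a 1-3-1 MLP, tanh hidden units, linear output.
n_hidden = 3
W = 0.1 * rng.standard_normal((n_hidden, 2))  # hidden weights (input + bias)
v = 0.1 * rng.standard_normal(n_hidden + 1)   # output weights (+ bias)

lam = 0.99  # forgetting factor (assumed value)
# One covariance matrix per neuron block: this block-diagonal structure is
# what "neglecting interneuron weight correlations" buys.
P_hid = [100.0 * np.eye(2) for _ in range(n_hidden)]
P_out = 100.0 * np.eye(n_hidden + 1)

def block_update(w, P, psi, e):
    """One recursive Gauss-Newton step for a single neuron's weight block."""
    Ppsi = P @ psi
    k = Ppsi / (lam + psi @ Ppsi)   # gain for this block only
    w = w + k * e                   # local weight update
    P = (P - np.outer(k, Ppsi)) / lam
    return w, P

errors = []
for _ in range(400):
    x = rng.uniform(-2.0, 2.0)
    d = np.sin(x)
    xb = np.array([x, 1.0])          # input with bias
    h = np.tanh(W @ xb)              # hidden activations
    hb = np.append(h, 1.0)           # with bias
    y = v @ hb
    e = d - y
    errors.append(e * e)
    # Output block: Jacobian of y w.r.t. v is just hb.
    v, P_out = block_update(v, P_out, hb, e)
    # Hidden blocks: Jacobian of y w.r.t. W[j] is v[j]*(1 - h[j]^2)*xb.
    # Each update touches only that neuron's weights and covariance,
    # so the loop body could run in parallel across neurons.
    for j in range(n_hidden):
        psi = v[j] * (1.0 - h[j] ** 2) * xb
        W[j], P_hid[j] = block_update(W[j], P_hid[j], psi, e)

early = float(np.mean(errors[:50]))
late = float(np.mean(errors[-50:]))
print(early, late)
```

The per-block updates cost O(n_i^2) each for a block of n_i weights, versus O(N^2) for a full covariance over all N weights, which is the source of the efficiency and parallelism claimed in the abstract.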

Item Type: Book Section
Subjects: T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Departments / MOR / COE: Departments > Electrical & Electronic Engineering
ID Code: 3827
Deposited By: Dr Vijanth Sagayan Asirvadam
Deposited On: 04 Jan 2011 00:41
Last Modified: 04 Jan 2011 00:41
