Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology

Sun, Ivan Teh Fu and Zain Ali, Noohul Basheer and Hussin, Fawnizu Azmadi (2004) Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology. In: Conference on Neuro-Computing and Evolving Intelligence 2004 (NCEI'04), December 2004, Auckland, New Zealand.

Full text not available from this repository.

Abstract

This paper proposes a novel hardware architecture for neural networks, named the Reconfigurable Feedforward Neural Network Architecture (RFNNA) processor [1]. The architecture aims to minimize the logic circuitry required by a fully parallel implementation. The Field-Programmable Gate Array (FPGA)-based RFNNA processor proposed in this paper shares logic circuits among its hidden-layer neurons and can be reconfigured for specific applications [2,3] that require different neural network structures. This is achieved by storing the connection and neuron weights for the multiple hidden layers in EPROMs and reusing the hidden-layer neurons' logic circuits iteratively for multiplication, summation, and evaluation. Training of the neural network is not considered in this paper; it is performed offline in software, and the resulting weights and biases are then loaded into the RFNNA processor's EPROMs for implementation [1]. The RFNNA processor was tested on the non-linear XOR problem using a 2-3-3-3-1 architecture.
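The iterative reuse scheme in the abstract can be illustrated with a short software sketch: one "bank" of neuron logic is applied layer by layer, fetching each layer's weights from storage (standing in for the EPROMs). This is a hypothetical illustration only; the function names, the sigmoid activation, and the placeholder weights are assumptions, not values or details from the paper.

```python
import math

def sigmoid(x):
    # Assumed activation function; the paper does not specify one here.
    return 1.0 / (1.0 + math.exp(-x))

def rfnna_forward(inputs, layer_weights, layer_biases):
    """Evaluate a feedforward net by reusing one multiply-sum-evaluate loop.

    layer_weights[k][j] is the weight vector feeding neuron j of layer k.
    The same loop body serves every layer, mirroring the hardware's
    iterative reuse of a single set of hidden-layer neuron circuits.
    """
    activations = list(inputs)
    for weights, biases in zip(layer_weights, layer_biases):
        activations = [
            sigmoid(sum(w * a for w, a in zip(wrow, activations)) + b)
            for wrow, b in zip(weights, biases)
        ]
    return activations

# A 2-3-3-3-1 topology like the paper's XOR test case, with dummy weights
# (real weights and biases would come from offline training, as described).
topology = [2, 3, 3, 3, 1]
weights = [[[0.5] * topology[k] for _ in range(topology[k + 1])]
           for k in range(len(topology) - 1)]
biases = [[0.1] * topology[k + 1] for k in range(len(topology) - 1)]

out = rfnna_forward([1.0, 0.0], weights, biases)
```

With trained weights loaded in place of the placeholders, the same loop would realize any topology that fits the stored layer count, which is the reconfigurability the abstract claims for the hardware.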

Item Type:Conference or Workshop Item (Paper)
Academic Subject One:Academic Department - Electrical And Electronics - Pervasive Systems - Digital Electronics - Design
Departments / MOR / COE:Centre of Excellence > Center for Intelligent Signal and Imaging Research
ID Code:12001
Deposited By: Dr Fawnizu Azmadi Hussin
Deposited On:07 Oct 2016 01:42
Last Modified:07 Oct 2016 01:42