
Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology

Sun, Ivan Teh Fu and Zain Ali, Noohul Basheer and Hussin, Fawnizu Azmadi (2004) Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology. Platform, 4 (2). pp. 68-73. ISSN 1511-6794

PDF (Published Version), 1499 KB

Abstract

This paper proposes a novel hardware architecture for neural networks, named the Reconfigurable Feedforward Neural Network Architecture (RFNNA) processor [1]. The architecture aims to minimize the logic circuitry required by a fully parallel implementation. The Field-Programmable Gate Array (FPGA)-based RFNNA processor proposed in this paper shares logic circuits among its hidden-layer neurons and can be reconfigured for specific applications [2,3] that require different neural network structures. This is achieved by storing the connection and neuron weights for the multiple hidden layers in EPROMs and reusing the hidden-layer neurons' logic circuits iteratively for multiplication, summation and evaluation. Training of the neural network is not considered in this paper; it is performed offline in software, and the resulting weights and biases are then loaded into the RFNNA processor's EPROMs for implementation [1]. The RFNNA processor was tested on the non-linear XOR problem using a 2-3-3-3-1 network architecture.
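To illustrate the time-multiplexed evaluation described above, the following minimal Python sketch (not taken from the paper) shows how one shared multiply-sum-evaluate stage can be applied iteratively, layer by layer, with each layer's weights fetched from storage, the way the RFNNA processor reads them from EPROM. The sigmoid activation and the random (untrained) weights are assumptions for illustration only; in the paper the weights come from offline software training.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rfnna_forward(x, layer_weights, layer_biases):
    # Reuse one "physical layer" of multiply/sum/evaluate logic for every
    # layer of the network, swapping in each layer's stored weights in turn.
    activation = x
    for W, b in zip(layer_weights, layer_biases):
        activation = sigmoid(W @ activation + b)
    return activation

# Illustrative 2-3-3-3-1 structure with random placeholder weights.
rng = np.random.default_rng(0)
sizes = [2, 3, 3, 3, 1]
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal(n_out) for n_out in sizes[1:]]

for xor_input in ([0, 0], [0, 1], [1, 0], [1, 1]):
    y = rfnna_forward(np.array(xor_input, dtype=float), weights, biases)
    print(xor_input, "->", float(y[0]))

In hardware, the loop over layers corresponds to re-driving the same hidden-layer neuron circuits on successive iterations while the EPROM address selects the weight set for the current layer.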

Item Type: Article
Academic Subject One: Academic Department - Electrical And Electronics - Pervasive Systems - Digital Electronics - Design
Departments / MOR / COE: Centre of Excellence > Center for Intelligent Signal and Imaging Research
ID Code: 11948
Deposited By: Dr Fawnizu Azmadi Hussin
Deposited On: 07 Oct 2016 01:42
Last Modified: 07 Oct 2016 01:42

