Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology

Sun, Ivan Teh Fu and Zain Ali, Noohul Basheer and Hussin, Fawnizu Azmadi (2004) Hardware Implementation of Feed forward Multilayer Neural Network Using the RFNNA Design Methodology. Platform, 4 (2). pp. 68-73. ISSN 1511-6794


Abstract

This paper proposes a novel hardware architecture for neural networks, named the Reconfigurable Feedforward Neural Network Architecture (RFNNA) processor [1]. The architecture aims to minimize the logic circuitry required by a fully parallel implementation. The Field-Programmable Gate Array (FPGA)-based RFNNA processor proposed in this paper shares logic circuits among its hidden-layer neurons and can be reconfigured for specific applications [2,3] that require different neural network structures. This is achieved by storing the connection and neuron weights for the multiple hidden layers in EPROMs and reusing the hidden-layer neurons' logic circuits iteratively for multiplication, summation, and evaluation. Training of the neural network is not considered in this paper; training was performed offline in software, and the resulting weights and biases were then loaded into the RFNNA processor's EPROMs for implementation [1]. The RFNNA processor was tested on the XOR non-linear problem using a 2-3-3-3-1 architecture.
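The time-multiplexed reuse described in the abstract can be illustrated with a minimal software sketch. The snippet below is not the paper's implementation: it assumes a sigmoid activation, placeholder weights, and a simple Python model of one set of neuron logic whose weights are switched in per layer (standing in for the EPROM lookups); offline training would supply the real weights for the 2-3-3-3-1 XOR network.

```python
import math

def sigmoid(x):
    # Assumed activation function; the paper's evaluation step may differ.
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # One neuron's multiply-accumulate-evaluate datapath.
    return sigmoid(sum(w * i for w, i in zip(weights, inputs)) + bias)

def rfnna_forward(x, layer_params):
    """Reuse the same neuron logic for every layer, switching in that
    layer's stored weights and biases on each iteration (this mimics
    reading per-layer parameters from the EPROMs)."""
    activations = list(x)
    for weights, biases in layer_params:  # one iteration per layer
        activations = [neuron(activations, w, b)
                       for w, b in zip(weights, biases)]
    return activations

# Placeholder parameters for a 2-3-3-3-1 structure; these values are
# illustrative only and merely exercise the datapath.
sizes = [2, 3, 3, 3, 1]
layers = []
for fan_in, fan_out in zip(sizes, sizes[1:]):
    weights = [[0.5] * fan_in for _ in range(fan_out)]
    biases = [0.0] * fan_out
    layers.append((weights, biases))

out = rfnna_forward([1.0, 0.0], layers)
print(out)  # a single activation in (0, 1)
```

The point of the sketch is the loop structure: only one layer's worth of neuron logic exists, and deeper networks are handled by iterating more times with different stored parameters rather than by adding circuitry.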

Item Type: Article
Departments / MOR / COE: Centre of Excellence > Center for Intelligent Signal and Imaging Research
Depositing User: Dr Fawnizu Azmadi Hussin
Date Deposited: 07 Oct 2016 01:42
Last Modified: 07 Oct 2016 01:42
URI: http://scholars.utp.edu.my/id/eprint/11948
