A Review of Weight Optimization Techniques in Recurrent Neural Networks

Alqushaibi, A., Abdulkadir, S.J., Rais, H.M. and Al-Tashi, Q. (2020) A Review of Weight Optimization Techniques in Recurrent Neural Networks. In: UNSPECIFIED.

Full text not available from this repository.
Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

Recurrent neural networks (RNNs) have gained much attention from researchers working on time series data processing and have proved to be an ideal choice for such data. As a result, several studies have analysed time series data and data processing through a variety of RNN techniques. However, every type of RNN has its own flaws. Simple Recurrent Neural Networks (SRNNs) are computationally less complex than other types of RNN, such as Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). However, SRNNs have drawbacks such as the vanishing gradient problem, which makes them difficult to train on long-term dependencies. The vanishing gradient arises during SRNN training because the gradient is repeatedly multiplied by small values when using the most traditional optimization algorithm, Gradient Descent (GD). Therefore, researchers have sought to overcome this limitation with weight optimization techniques such as metaheuristic algorithms. The objective of this paper is to present an extensive review of the challenges and issues of RNN weight optimization techniques and to critically analyse the existing proposed techniques. The authors believe the review will serve as a main source for the techniques and methods used to resolve the problems of RNN time series data and data processing. Furthermore, current challenges and issues are deliberated to identify promising research domains for further study. © 2020 IEEE.
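A minimal NumPy sketch of the vanishing-gradient mechanism the abstract describes (assuming a tanh SRNN; the weight scales and dimensions below are illustrative, not taken from the paper): backpropagation through time multiplies the gradient by the recurrent Jacobian at every step, so when that Jacobian is small the gradient norm shrinks geometrically over long sequences.

import numpy as np

# Hypothetical SRNN: h_t = tanh(W h_{t-1} + U x_t).
# Backprop multiplies the gradient by W^T diag(1 - h_t^2) at every step,
# so small recurrent weights shrink it geometrically -- the vanishing gradient.
rng = np.random.default_rng(0)
hidden, steps = 32, 100
W = rng.normal(scale=0.5 / np.sqrt(hidden), size=(hidden, hidden))  # recurrent weights
U = rng.normal(scale=1.0 / np.sqrt(hidden), size=(hidden, hidden))  # input weights
x = rng.normal(size=(steps, hidden))

# Forward pass, storing hidden states for the backward pass.
h = np.zeros(hidden)
states = []
for t in range(steps):
    h = np.tanh(W @ h + U @ x[t])
    states.append(h)

# Backward pass: propagate a unit gradient from the last step toward the first.
grad = np.ones(hidden)
for t in reversed(range(steps)):
    grad = W.T @ (grad * (1.0 - states[t] ** 2))  # chain rule through tanh
    if t % 20 == 0:
        print(f"gradient norm after backprop through step {t:3d}: {np.linalg.norm(grad):.3e}")

Running this prints a gradient norm that collapses toward zero within a few dozen steps, which is why GD-trained SRNNs struggle with long-term dependencies and why the paper surveys gradient-free alternatives such as metaheuristic weight optimization.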

Item Type: Conference or Workshop Item (UNSPECIFIED)
Impact Factor: cited by 2
Uncontrolled Keywords: Data handling; Intelligent computing; Optimization; Time series, Long-term dependencies; Meta heuristic algorithm; Optimization algorithms; Recurrent neural network (RNN); Simple recurrent neural networks; Training process; Vanishing gradient; Weight optimization, Long short-term memory
Depositing User: Ms Sharifah Fahimah Saiyed Yeop
Date Deposited: 25 Mar 2022 03:05
Last Modified: 25 Mar 2022 03:05
URI: http://scholars.utp.edu.my/id/eprint/29889
