International Journal of Science and Research (IJSR)


ISSN: 2319-7064




Downloads: 150 | Views: 247

Research Paper | Computer Science | China | Volume 9 Issue 4, April 2020 | Rating: 6.8 / 10


Effect of Local Dynamic Learning Rate Adaptation on Mini-Batch Gradient Descent for Training Deep Learning Models with Back Propagation Algorithm

Joyce K. Ndauka | Dr. Zheng Xiao Yan


Abstract: Backpropagation has gained much popularity for training neural network models, including deep learning models. Despite its popularity, ordinary backpropagation suffers from a low convergence rate due to its use of a constant learning rate. Since the error surface is not smooth, the learning rate needs to be dynamically adapted to speed up convergence. Much work has shown the benefits of employing a dynamic learning rate in the backpropagation algorithm to accelerate convergence, focusing on global learning rate adaptation and on local adaptive learning rates with batch gradient descent. In this work we observe the effect of local dynamic learning rate adaptation, using the improved iRprop- algorithm with mini-batch gradient descent, on the convergence rate of backpropagation. Experiments were conducted in Python using the CIFAR-10 dataset. Results show that the proposed algorithm outperforms the ordinary backpropagation algorithm in terms of speed when the batch size ...
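For illustration only, below is a minimal NumPy sketch of the standard iRprop- local step-size rule applied to a mini-batch gradient, in the spirit of the approach the abstract describes. The hyperparameter values (eta_plus, eta_minus, step bounds) are common Rprop defaults and the function name is hypothetical; they are assumptions, not the paper's reported settings.

import numpy as np

def irprop_minus_update(w, grad, prev_grad, step,
                        eta_plus=1.2, eta_minus=0.5,
                        step_min=1e-6, step_max=1.0):
    """One iRprop- update on a mini-batch gradient.

    w, grad, prev_grad, and step are NumPy arrays of the same shape;
    step holds the per-weight (local) step sizes being adapted.
    """
    grad = grad.copy()
    sign_change = grad * prev_grad  # >0: same sign, <0: gradient sign flipped
    # Grow the local step size where the gradient kept its sign,
    # shrink it where the sign flipped, within [step_min, step_max].
    step = np.where(sign_change > 0,
                    np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0,
                    np.maximum(step * eta_minus, step_min), step)
    # iRprop-: where the sign flipped, zero the gradient so the weight
    # is not updated this step and no adaptation happens next step.
    grad[sign_change < 0] = 0.0
    w = w - np.sign(grad) * step  # update uses only the gradient's sign
    return w, grad, step          # returned grad is prev_grad next call

Called once per mini-batch, with prev_grad initialized to zeros and step to a small constant, this realizes the local per-weight learning rate adaptation discussed above; the weight change depends only on the sign of the mini-batch gradient, not its magnitude.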


Keywords: Mini-batch gradient descent, Learning rate, Backpropagation, Deep learning


Edition: Volume 9 Issue 4, April 2020


Pages: 339 - 342


