International Journal of Science and Research (IJSR)

ISSN: 2319-7064

Downloads: 121 | Views: 502

Research Paper | Computer Science & Engineering | India | Volume 3 Issue 8, August 2014 | Rating: 6.7 / 10


Adaptation to Best Fit Learning Rate in Batch Gradient Descent

Mohan Pandey, B. L. Pal


Abstract: The efficient training of a supervised machine learning system is commonly viewed as the minimization of a cost function that depends on the parameters of the proposed hypothesis. This perspective aids the development of better training algorithms, since the minimization of a function is a well-studied problem in many fields. Typically, learning systems use gradient descent to minimize the cost function, but the method converges slowly and diverges when the learning rate is chosen beyond a threshold. Choosing the learning rate is error-prone, as there is no known generic way to select it and it must be set manually. In this paper, a method for the adaptation of the learning rate is presented that also addresses the problems of slow convergence and divergence. The main feature of the proposed algorithm is that it speeds up convergence, and neither the result nor the number of epochs required to converge is affected by the initial choice of learning rate.
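The paper's exact update rule is not reproduced on this page, so the sketch below only illustrates the general idea the abstract describes: batch gradient descent on a linear-regression cost, with a simple "bold driver" style adaptation that grows the learning rate while the cost keeps falling and shrinks it when a step overshoots. The function name, the adaptation constants inc and dec, and the starting rate are all assumptions for illustration, not the authors' method.

import numpy as np

def adaptive_batch_gd(X, y, alpha=0.1, epochs=500, inc=1.05, dec=0.5):
    # Illustrative sketch only, not the paper's algorithm:
    # batch gradient descent for linear regression with a
    # "bold driver" learning-rate adaptation heuristic.
    m, n = X.shape
    theta = np.zeros(n)

    def cost(t):
        r = X @ t - y
        return (r @ r) / (2 * m)          # J(theta) = (1/2m) * sum of squared errors

    prev = cost(theta)
    for _ in range(epochs):
        grad = (X.T @ (X @ theta - y)) / m   # gradient over the full batch
        trial = theta - alpha * grad         # candidate parameter update
        c = cost(trial)
        if c <= prev:                        # cost fell: accept step, grow the rate
            theta, prev = trial, c
            alpha *= inc
        else:                                # cost rose: reject step, shrink the rate
            alpha *= dec
    return theta

On a toy problem such as adaptive_batch_gd(np.c_[np.ones(5), np.arange(5.0)], 2 * np.arange(5.0)), this sketch recovers the fit regardless of a too-large initial alpha, since oversized steps are rejected and the rate is halved rather than allowing the iterates to explode.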


Keywords: Batch training, On-line training, Epoch, Learning Rate, Gradient Descent, Hypothesis, Cost Function


Edition: Volume 3 Issue 8, August 2014


Pages: 16 - 20


