Manjeet Dahiya

Gradient Descent Extensions

The goal of gradient descent extensions such as Momentum, RMSProp, and Adam is to speed up the gradient descent algorithm and to address the challenges it faces. This post presents those challenges and the optimizations that handle them.
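
As a baseline for the extensions discussed below, here is a minimal sketch of plain gradient descent; the quadratic objective and the hyperparameter values are illustrative choices only.

```python
import numpy as np

def gradient_descent(grad_fn, theta0, learning_rate=0.1, num_steps=100):
    """Repeatedly step against the gradient: theta <- theta - alpha * grad."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(num_steps):
        theta = theta - learning_rate * grad_fn(theta)
    return theta

# Gradient of f(theta) = (theta - 3)^2 is 2 * (theta - 3); the minimum is at theta = 3.
print(gradient_descent(lambda t: 2.0 * (t - 3.0), theta0=[0.0]))     # ~[3.]
```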

Challenges of gradient descent


Momentum

What does it solve?
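
Momentum replaces the raw gradient with an exponentially weighted moving average of past gradients, which damps oscillations across steep directions while keeping progress along the consistent direction. Below is a minimal sketch assuming the common formulation v = beta * v + (1 - beta) * grad; the variable names and values are illustrative.

```python
import numpy as np

def momentum_step(theta, grad, v, learning_rate=0.01, beta=0.9):
    """One update of gradient descent with Momentum."""
    v = beta * v + (1.0 - beta) * grad      # exponentially weighted average of gradients
    theta = theta - learning_rate * v       # step along the smoothed direction
    return theta, v

theta = np.array([0.0])
v = np.zeros_like(theta)
for _ in range(500):
    grad = 2.0 * (theta - 3.0)              # gradient of (theta - 3)^2
    theta, v = momentum_step(theta, grad, v)
print(theta)                                # close to [3.]
```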

RMSProp (Root Mean Square Propagation)
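
A minimal sketch of the RMSProp update follows, assuming the usual formulation that divides each gradient component by a running root mean square of recent gradients; the hyperparameter names and values are illustrative.

```python
import numpy as np

def rmsprop_step(theta, grad, s, learning_rate=0.01, beta=0.9, epsilon=1e-8):
    """One RMSProp update: scale the step by a running RMS of recent gradients."""
    s = beta * s + (1.0 - beta) * grad ** 2          # moving average of squared gradients
    theta = theta - learning_rate * grad / (np.sqrt(s) + epsilon)
    return theta, s

theta = np.array([0.0])
s = np.zeros_like(theta)
for _ in range(2000):
    grad = 2.0 * (theta - 3.0)                       # gradient of (theta - 3)^2
    theta, s = rmsprop_step(theta, grad, s)
print(theta)                                         # close to [3.]
```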


Adam

Momentum and RMSProp both have their limitations. Adam is a more robust optimizer; it combines Momentum and RMSProp.
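
The sketch below shows one Adam step combining the two ideas: a Momentum-style average of gradients and an RMSProp-style average of squared gradients, with the usual bias correction. The hyperparameter values are the commonly cited defaults and are used here only for illustration.

```python
import numpy as np

def adam_step(theta, grad, v, s, t, learning_rate=0.001,
              beta1=0.9, beta2=0.999, epsilon=1e-8):
    """One Adam update at step t (t starts at 1 so the bias correction is defined)."""
    v = beta1 * v + (1.0 - beta1) * grad             # Momentum-style first moment
    s = beta2 * s + (1.0 - beta2) * grad ** 2        # RMSProp-style second moment
    v_hat = v / (1.0 - beta1 ** t)                   # bias-corrected estimates
    s_hat = s / (1.0 - beta2 ** t)
    theta = theta - learning_rate * v_hat / (np.sqrt(s_hat) + epsilon)
    return theta, v, s

theta = np.array([0.0])
v = np.zeros_like(theta)
s = np.zeros_like(theta)
for t in range(1, 5001):
    grad = 2.0 * (theta - 3.0)                       # gradient of (theta - 3)^2
    theta, v, s = adam_step(theta, grad, v, s, t)
print(theta)                                         # close to [3.]
```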

Learning rate decay
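
Several decay schedules are in common use; the sketch below assumes one frequently seen form, alpha = alpha0 / (1 + decay_rate * epoch), where alpha0 and decay_rate are illustrative names.

```python
def decayed_learning_rate(alpha0, decay_rate, epoch):
    """Shrink the learning rate as training progresses (epoch starts at 0)."""
    return alpha0 / (1.0 + decay_rate * epoch)

for epoch in range(5):
    print(epoch, decayed_learning_rate(alpha0=0.2, decay_rate=1.0, epoch=epoch))
# 0 0.2   1 0.1   2 0.0667   3 0.05   4 0.04
```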

© 2018-19 Manjeet Dahiya