Manjeet Dahiya

Gradient Descent Extensions

The goal of gradient descent extensions such as Momentum, RMSProp, and Adam is to speed up the gradient descent algorithm and to address the challenges it faces. This post presents those challenges and the optimizations that handle them.

Challenges of gradient descent
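
For reference, below is a minimal sketch of plain gradient descent, the baseline that the extensions in this post modify. The function names, learning rate, and the toy quadratic objective are illustrative assumptions, not taken from the original post.

    import numpy as np

    def gradient_descent(grad_fn, w0, lr=0.1, steps=100):
        # Plain gradient descent: repeatedly step against the gradient
        w = np.asarray(w0, dtype=float)
        for _ in range(steps):
            w = w - lr * grad_fn(w)
        return w

    # Toy usage: minimize f(w) = w0^2 + 10*w1^2, an elongated bowl where
    # plain gradient descent tends to oscillate along the steep direction
    w_min = gradient_descent(lambda w: np.array([2 * w[0], 20 * w[1]]), [1.0, 1.0])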

Momentum
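
Below is a minimal sketch of a single Momentum update step, assuming the exponentially-weighted-average formulation; the variable names and default hyperparameters are illustrative, not from the original post.

    import numpy as np

    def momentum_step(w, grad, v, lr=0.01, beta=0.9):
        # Keep an exponentially weighted average of past gradients
        v = beta * v + (1 - beta) * grad
        # Update parameters with the smoothed gradient instead of the raw one
        w = w - lr * v
        return w, v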

What does it solve?

RMSProp (Root Mean Square Propagation)
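
Below is a minimal sketch of a single RMSProp update step; the variable names, default hyperparameters, and the epsilon term are illustrative assumptions.

    import numpy as np

    def rmsprop_step(w, grad, s, lr=0.01, beta=0.9, eps=1e-8):
        # Keep an exponentially weighted average of squared gradients
        s = beta * s + (1 - beta) * grad ** 2
        # Divide by the running root mean square so that dimensions with
        # large gradients take smaller steps, and vice versa
        w = w - lr * grad / (np.sqrt(s) + eps)
        return w, s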

Adam

Momentum and RMSProp both have their limitations. Adam is a more robust optimization; it combines both Momentum and RMSProp.
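
Below is a minimal sketch of a single Adam update step, combining the Momentum-style first moment with the RMSProp-style second moment and applying bias correction; the names and default hyperparameters are illustrative assumptions.

    import numpy as np

    def adam_step(w, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        # Momentum-style average of gradients (first moment)
        v = beta1 * v + (1 - beta1) * grad
        # RMSProp-style average of squared gradients (second moment)
        s = beta2 * s + (1 - beta2) * grad ** 2
        # Bias correction compensates for initializing v and s at zero
        # (t is the 1-based iteration count)
        v_hat = v / (1 - beta1 ** t)
        s_hat = s / (1 - beta2 ** t)
        w = w - lr * v_hat / (np.sqrt(s_hat) + eps)
        return w, v, s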

Learning rate decay
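
As one concrete example, below is a minimal sketch of 1/t-style learning rate decay applied per epoch; the choice of schedule and the names are illustrative assumptions, and other schedules (exponential decay, step decay) follow the same pattern of shrinking the learning rate as training progresses.

    def decayed_lr(lr0, epoch, decay_rate=1.0):
        # Shrink the initial learning rate as the epoch number grows
        return lr0 / (1 + decay_rate * epoch)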


© 2018-19 Manjeet Dahiya