Learning Objectives
- Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
- Use random minibatches to accelerate convergence and improve optimization
- Know the benefits of learning rate decay and apply it to your optimization (see the sketch after this list)
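The following minimal sketch ties the objectives above together on a toy linear-regression problem: it samples random minibatches, applies SGD, Momentum, RMSProp, and Adam updates, and decays the learning rate each epoch. It is an illustrative example, not the course's assignment code; all names (`train`, `minibatches`, the hyperparameter values) are placeholders chosen for the demo.

```python
# Illustrative sketch of the topics in the objectives above (assumed toy setup).
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: linear regression with mean squared error loss.
X = rng.normal(size=(1000, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.1 * rng.normal(size=1000)

def minibatches(X, y, batch_size=64):
    """Shuffle the data and yield random minibatches."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]

def grad(w, Xb, yb):
    """Gradient of the MSE loss on one minibatch."""
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(Xb)

def train(update, num_epochs=50, lr0=0.1, decay_rate=0.01):
    w = np.zeros(5)
    state = {}  # per-optimizer state (velocity, moment estimates, step count)
    for epoch in range(num_epochs):
        # Learning rate decay: shrink the step size as training progresses.
        lr = lr0 / (1.0 + decay_rate * epoch)
        for Xb, yb in minibatches(X, y):
            w = update(w, grad(w, Xb, yb), lr, state)
    return w

def sgd(w, g, lr, state):
    return w - lr * g

def momentum(w, g, lr, state, beta=0.9):
    v = state.get("v", np.zeros_like(w))
    v = beta * v + (1 - beta) * g  # exponentially weighted average of gradients
    state["v"] = v
    return w - lr * v

def rmsprop(w, g, lr, state, beta=0.9, eps=1e-8):
    s = state.get("s", np.zeros_like(w))
    s = beta * s + (1 - beta) * g**2  # weighted average of squared gradients
    state["s"] = s
    return w - lr * g / (np.sqrt(s) + eps)

def adam(w, g, lr, state, beta1=0.9, beta2=0.999, eps=1e-8):
    m = state.get("m", np.zeros_like(w))
    s = state.get("s", np.zeros_like(w))
    t = state.get("t", 0) + 1
    m = beta1 * m + (1 - beta1) * g      # first moment (Momentum term)
    s = beta2 * s + (1 - beta2) * g**2   # second moment (RMSProp term)
    state.update(m=m, s=s, t=t)
    m_hat = m / (1 - beta1**t)           # bias correction
    s_hat = s / (1 - beta2**t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps)

for name, update in [("sgd", sgd), ("momentum", momentum),
                     ("rmsprop", rmsprop), ("adam", adam)]:
    w = train(update)
    print(f"{name:>8}: distance to true weights = {np.linalg.norm(w - true_w):.4f}")
```

The `state` dictionary is just one simple way to carry each optimizer's running quantities (velocity, squared-gradient average, bias-correction step count) between updates; deep learning frameworks keep this state per parameter inside their optimizer objects.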