Make Stochastic Gradient Descent Fast Again (Ep. 113)

22/07/2020 20 min Episode 110

Listen "Make Stochastic Gradient Descent Fast Again (Ep. 113)"

Episode Synopsis

There is definitely room for improvement in the family of stochastic gradient descent algorithms. In this episode I explain a relatively simple method that has been shown to improve on the Adam optimizer. But watch out: this approach does not generalize well.
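For context, here is a minimal NumPy sketch of the two baselines mentioned above: a plain stochastic gradient descent step and the standard Adam update. It is not the method discussed in the episode, and the helper names (sgd_step, adam_step) are my own.

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    """Plain SGD: move against the (stochastic) gradient."""
    return w - lr * grad

def adam_step(w, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Standard Adam update: per-parameter steps scaled by exponential
    moving averages of the gradient and its square, with bias correction."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad**2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimise f(w) = ||w||^2 from noisy gradient estimates.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
state = {"t": 0, "m": np.zeros_like(w), "v": np.zeros_like(w)}
for _ in range(200):
    grad = 2 * w + 0.1 * rng.normal(size=3)  # noisy gradient of ||w||^2
    w = adam_step(w, grad, state)
```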
Join our Discord channel and chat with us.
References
More descent, less gradient
Taylor Series