2020 AI Research Highlights: Optimization for Deep Learning (part 3)

Anima on AI

31/12/2020 10:38PM

In this post, I will focus on the new optimization methods we proposed in 2020. Simple gradient-based methods such as SGD and Adam remain the “workhorses” for training standard neural networks. However, we find many instances where more sophisticated and principled approaches beat these baselines and show promising results. You can read previous posts for …
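
For reference, here is a minimal PyTorch sketch of the two baseline optimizers mentioned above. The toy model, synthetic data, and hyperparameters are illustrative assumptions, not details from the post.

```python
# Minimal sketch of the SGD and Adam baselines (toy model and data are assumptions).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()

# The two common "workhorse" baselines: plain SGD with momentum, and Adam.
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(64, 10), torch.randn(64, 1)  # synthetic batch
for step in range(100):
    optimizer.zero_grad()        # clear accumulated gradients
    loss = loss_fn(model(x), y)  # forward pass
    loss.backward()              # backpropagate
    optimizer.step()             # apply the gradient update
```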
