Episode Synopsis "2020 AI Research Highlights: Optimization for Deep Learning (part 3)"
In this post, I focus on the new optimization methods we proposed in 2020. Simple gradient-based methods such as SGD and Adam remain the "workhorses" for training standard neural networks, but we found many instances where more sophisticated and principled approaches beat these baselines and show promising results. You can read the previous posts in this series for highlights from our other research areas.
Listen "2020 AI Research Highlights: Optimization for Deep Learning (part 3)"
More episodes from the Anima on AI podcast
- Top-10 Things in 2022
- Top-10 AI Research Highlights of 2021
- 2020 AI Research Highlights: Learning Frameworks (part 7)
- 2020 AI Research Highlights: Learning and Control (part 6)
- 2020 AI Research Highlights: Controllable Generation (part 5)
- 2020 AI Research Highlights: AI4Science (part 4)
- 2020 AI Research Highlights: Optimization for Deep Learning (part 3)
- 2020 AI Research Highlights: Handling distributional shifts (part 2)
- 2020 AI Research Highlights: Generalizable AI (part 1)
- My heartfelt apology