(Old) Lecture 6 | Acceleration, Regularization, and Normalization

Carnegie Mellon University
Course: 11-785, Intro to Deep Learning
Offering: Spring 2019
Slides: http://deeplearning.cs.cmu.edu/slides.spring19/lecture_6_SGD.pdf

For more information, please visit: http://deeplearning.cs.cmu.edu/

• Stochastic gradient descent
• Optimization
• Acceleration
• Overfitting and regularization
• Tricks of the trade:
– Choosing a divergence (loss) function
– Batch normalization
– Dropout
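
The lecture's first two topics, stochastic gradient descent and acceleration, can be sketched together in a few lines. This is a minimal illustration, not the lecture's code: the quadratic loss, the learning rate, and the momentum coefficient below are all illustrative choices, and the update shown is classical (heavy-ball) momentum.

```python
import numpy as np

# Hypothetical toy objective L(w) = 0.5 * ||w||^2, whose gradient is simply w.
def grad(w):
    return w

rng = np.random.default_rng(0)
w = rng.normal(size=3)   # parameters
v = np.zeros_like(w)     # velocity (running accumulation of past gradients)
lr, beta = 0.1, 0.9      # illustrative step size and momentum coefficient

for _ in range(100):
    v = beta * v + grad(w)  # accumulate an exponentially decayed gradient sum
    w = w - lr * v          # step along the accumulated direction, not the raw gradient

print(np.linalg.norm(w))    # the parameter norm shrinks toward the minimum at 0
```

With `beta = 0`, this reduces to plain gradient descent; the velocity term is what "accelerates" progress along directions where successive gradients agree.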
