Two Minute Papers: What is Optimization? + Learning Gradient Descent | Two Minute Papers #82
Let’s talk about what mathematical optimization is, how gradient descent can solve simpler optimization problems, and Google DeepMind’s proposed method that automatically learns optimization algorithms.
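As a quick illustration (a minimal sketch, not code from the paper or its repository), classical gradient descent minimizes a function by repeatedly stepping against its gradient; the paper’s learned optimizer replaces this hand-designed update rule with one produced by a neural network:

# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2
def f(x):
    return (x - 3.0) ** 2

def grad_f(x):
    # Analytic derivative: f'(x) = 2 * (x - 3)
    return 2.0 * (x - 3.0)

x = 0.0               # starting guess
learning_rate = 0.1   # hand-picked step size
for step in range(100):
    x -= learning_rate * grad_f(x)  # step opposite the gradient

print(x)  # approaches the minimum at x = 3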
The paper “Learning to learn by gradient descent by gradient descent” is available here:
http://arxiv.org/pdf/1606.04474v1.pdf
Source code:
https://github.com/deepmind/learning-to-learn
______________________________
Recommended for you:
Gradients, Poisson’s Equation and Light Transport – https://www.youtube.com/watch?v=sSnDTPjfBYU
WE WOULD LIKE TO THANK OUR GENEROUS PATREON SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE:
David Jaenisch, Sunil Kim, Julian Josephs, Daniel John Benton.
https://www.patreon.com/TwoMinutePapers
We also thank Experiment for sponsoring our series. – https://experiment.com/
Subscribe if you would like to see more of these! – http://www.youtube.com/subscription_center?add_user=keeroyz
The chihuahua vs muffin image is courtesy of teenybiscuit – https://twitter.com/teenybiscuit
More fun stuff here: http://twistedsifter.com/2016/03/puppy-or-bagel-meme-gallery/
The thumbnail background image was created by Alan Levine – https://flic.kr/p/vbEd1W
Splash screen/thumbnail design: Felícia Fehér – http://felicia.hu
Károly Zsolnai-Fehér’s links:
Facebook → https://www.facebook.com/TwoMinutePapers/
Twitter → https://twitter.com/karoly_zsolnai
Web → https://cg.tuwien.ac.at/~zsolnai/