AD as it relates to Differentiable Programming for ML @ TWiML Online Meetup Americas 20 March 2019


This video is a recap of our March 2019 Americas TWiML Online Meetup: Automatic Differentiation as it relates to Differentiable Programming for Machine Learning.

In this month’s community segment, we discuss our upcoming April Meetups, NVIDIA’s Jetson Nano Platform, NVIDIA’s Cloud Strategy, attention in NLP, and Sam’s Kubernetes eBook.

In our presentation segment, Rongmin Lu leads a discussion on automatic differentiation (AD) and differentiable programming for machine learning. There was a flurry of papers in 2018 on how to do reverse-mode AD at the compiler level, and he gives an overview of three of them: Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator by Fei Wang et al., The simple essence of automatic differentiation by Conal Elliott, and Don't Unroll Adjoint: Differentiating SSA-Form Programs by Michael Innes.
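To make the core idea behind those papers concrete, here is a minimal sketch of reverse-mode AD in Python using operator overloading. It is only an illustration of the general technique, not code from any of the papers (the `Var` class and its methods are hypothetical names chosen for this sketch; the papers themselves work at the level of delimited continuations, categories, and SSA-form IR rather than a runtime tape).

```python
# Minimal sketch of reverse-mode automatic differentiation.
# Each Var records the operations applied to it, along with local
# gradients; backward() then propagates gradients from outputs to inputs.
# (Hypothetical illustration, not taken from the papers discussed.)

class Var:
    """A scalar value that records how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # pairs of (parent_var, local_gradient)

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        """Accumulate gradients by walking the recorded graph backwards."""
        self.grad += seed
        for parent, local_grad in self.parents:
            parent.backward(seed * local_grad)

# Example: f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x.
x, y = Var(3.0), Var(4.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

The compiler-level approaches in the papers avoid this kind of runtime graph entirely, but the gradients they compute are the same.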

For links to the papers, podcasts, and more mentioned above or during this meetup, for more information on previous meetups, or to register for upcoming meetups, visit!

iTunes ➙…
Spotify ➙…
Soundcloud ➙
Google Play ➙
Stitcher ➙…
Subscribe to our newsletter! ➙

Let's Connect! ➙
Twitter ➙
Facebook ➙
Medium ➙…

