
#69 DR. THOMAS LUX – Interpolation of Sparse High-Dimensional Data [UNPLUGGED]

Today we are speaking with Dr. Thomas Lux, a research scientist at Meta in Silicon Valley. In some sense, all of supervised machine learning can be framed through the lens of geometry. All training data exists as points in Euclidean space, and we want to predict the value of a function at all those points. […]
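
One way to make this geometric picture concrete is a distance-based interpolant. The sketch below is only illustrative, with made-up data and plain inverse-distance weighting, not the method discussed in the episode:

```python
import numpy as np

def idw_interpolate(x_train, y_train, x_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation in R^d.

    Each prediction is a weighted average of training values,
    with weights that decay with Euclidean distance."""
    # Pairwise Euclidean distances, shape (n_query, n_train)
    d = np.linalg.norm(x_query[:, None, :] - x_train[None, :, :], axis=-1)
    w = 1.0 / (d ** power + eps)           # closer points get larger weights
    w /= w.sum(axis=1, keepdims=True)      # normalize weights per query point
    return w @ y_train

# Hypothetical sparse data in 10-dimensional space
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))              # 50 training points in R^10
y = np.sin(X.sum(axis=1))                  # some unknown function's values
X_new = rng.normal(size=(5, 10))
print(idw_interpolate(X, y, X_new))
```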

CoRL 2020, Spotlight Talk 482: Differentiable Logic Layer for Rule Guided Trajectory Prediction

“**Differentiable Logic Layer for Rule Guided Trajectory Prediction** Xiao Li (MIT)*; Guy Rosman (MIT); Igor Gilitschenski (Massachusetts Institute of Technology); Jonathan DeCastro (Toyota Research Institute); Cristian-Ioan Vasile (Lehigh University); Sertac Karaman (Massachusetts Institute of Technology); Daniela Rus (MIT CSAIL) Publication: http://corlconf.github.io/paper_482/ **Abstract** In this work, we propose a method for integration of temporal logic formulas […]
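
The paper's own method isn't reproduced here, but the core trick behind differentiable logic layers can be sketched: replace hard Boolean and temporal operators with smooth surrogates so rule satisfaction yields gradients and can serve as a training signal. A toy sketch with hypothetical per-timestep rule scores:

```python
import numpy as np

def soft_and(a, b, k=10.0):
    # Smooth approximation of min(a, b): a differentiable "and"
    return -np.logaddexp(-k * a, -k * b) / k

def soft_always(signal, k=10.0):
    # Smooth approximation of min over time: "always" in temporal logic
    return -np.logaddexp.reduce(-k * signal) / k

# Hypothetical per-timestep rule scores for a predicted trajectory
# (positive = rule satisfied, negative = violated)
clearance = np.array([0.8, 0.5, 0.1, 0.4])   # e.g. margin to obstacles
speed_ok  = np.array([0.9, 0.7, 0.6, 0.2])   # e.g. speed-limit margin

# Degree to which "always (clearance and speed_ok)" holds; differentiable,
# so it could be used as a loss term for a trajectory predictor
print(soft_always(soft_and(clearance, speed_ok)))
```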

Neural Networks Demystified [Part 7: Overfitting, Testing, and Regularization]

We’ve built and trained our neural network, but before we celebrate, we must be sure that our model is representative of the real world. Supporting Code: https://github.com/stephencwelch/Neural-Networks-Demystified Nate Silver’s Book: http://www.amazon.com/Signal-Noise-Many-Predictions-Fail/dp/159420411X/ref=sr_1_1?ie=UTF8&qid=1421442340&sr=8-1&keywords=signal+and+the+noise Caltech Machine Learning Course: https://work.caltech.edu/telecourse.html And the lecture shown: http://youtu.be/Dc0sr0kdBVI?t=56m52s In this series, we will build and train a complete Artificial Neural Network in […]
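
The supporting repo has the series' actual code; as a rough sketch of the ideas covered here, the snippet below holds out test data and adds an L2 (ridge) penalty, the regularization move that discourages overfitting (illustrative data and values):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0]) + rng.normal(scale=0.1, size=30)

# Hold out test data: performance here, not on the training set,
# tells us whether the model generalizes to the real world
train, test = slice(0, 20), slice(20, 30)

# High-degree polynomial features make overfitting easy to demonstrate
Phi = np.vander(X[:, 0], N=10)

for lam in [0.0, 1e-3]:
    # Ridge regression: minimize ||Phi w - y||^2 + lam * ||w||^2
    w = np.linalg.solve(
        Phi[train].T @ Phi[train] + lam * np.eye(Phi.shape[1]),
        Phi[train].T @ y[train],
    )
    test_err = np.mean((Phi[test] @ w - y[test]) ** 2)
    print(f"lambda={lam}: test MSE = {test_err:.4f}")
```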

Neural Networks Demystified [Part 6: Training]

After all that work it’s finally time to train our Neural Network. We’ll use the BFGS numerical optimization algorithm and have a look at the results. Supporting Code: https://github.com/stephencwelch/Neural-Networks-Demystified Yann Lecun’s Efficient BackProp Paper: http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf More on BFGS: http://en.wikipedia.org/wiki/Broyden%E2%80%93Fletcher%E2%80%93Goldfarb%E2%80%93Shanno_algorithm In this series, we will build and train a complete Artificial Neural Network in Python. New […]
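
The training step boils down to handing a loss function and its gradient to an off-the-shelf BFGS routine. A minimal sketch of that pattern using SciPy, with a stand-in quadratic loss rather than the series' network:

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in for the network's loss: any function of the flattened weights
A = np.diag([1.0, 10.0])
b = np.array([1.0, -2.0])

def loss(w):
    return 0.5 * w @ A @ w - b @ w

def grad(w):
    # Analytic gradient; in the series this is what backpropagation supplies
    return A @ w - b

w0 = np.zeros(2)
res = minimize(loss, w0, jac=grad, method="BFGS")
print(res.x, res.fun)   # optimized weights and final loss
```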

Neural Networks Demystified [Part 5: Numerical Gradient Checking]

When building complex systems like neural networks, checking portions of your work can save hours of headache. Here we’ll check our gradient computations. Supporting code: https://github.com/stephencwelch/Neural-Networks-Demystified Link to excellent Stanford tutorial: http://ufldl.stanford.edu/wiki/index.php/UFLDL_Tutorial In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday. Part 1: Data […]
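
The idea being checked is simple enough to sketch: estimate each partial derivative with a central difference and compare against the analytic gradient (toy loss below; the series' own code is in the linked repo):

```python
import numpy as np

def numerical_gradient(f, w, h=1e-5):
    """Central-difference estimate of df/dw, one coordinate at a time."""
    grad = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = h
        grad[i] = (f(w + e) - f(w - e)) / (2 * h)
    return grad

f = lambda w: np.sum(w ** 3)          # toy loss
analytic = lambda w: 3 * w ** 2       # its hand-derived gradient

w = np.array([0.5, -1.2, 2.0])
num = numerical_gradient(f, w)
# Relative error should be tiny (~1e-8) if the analytic gradient is right
print(np.linalg.norm(num - analytic(w)) / np.linalg.norm(num + analytic(w)))
```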

Neural Networks Demystified [Part 4: Backpropagation]

Backpropagation as simple as possible, but no simpler. Perhaps the most misunderstood part of neural networks, backpropagation of errors is the key step that allows ANNs to learn. In this video, I give the derivation and thought processes behind backpropagation using high-school-level calculus. Supporting Code and Equations: https://github.com/stephencwelch/Neural-Networks-Demystified In this series, we will […]
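
As a compact sketch of where the derivation lands, here is one forward and backward pass for a small 2-3-1 sigmoid network in NumPy, with the chain rule applied layer by layer (random illustrative data, not necessarily the video's example):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.uniform(size=(4, 2)); y = rng.uniform(size=(4, 1))
W1 = rng.normal(size=(2, 3)); W2 = rng.normal(size=(3, 1))

# Forward pass, keeping intermediates for the backward pass
z2 = X @ W1; a2 = sigmoid(z2)
z3 = a2 @ W2; yhat = sigmoid(z3)
cost = 0.5 * np.sum((y - yhat) ** 2)

# Backward pass: chain rule, layer by layer
delta3 = -(y - yhat) * yhat * (1 - yhat)   # dJ/dz3
dJdW2 = a2.T @ delta3                      # gradient for output weights
delta2 = (delta3 @ W2.T) * a2 * (1 - a2)   # dJ/dz2
dJdW1 = X.T @ delta2                       # gradient for hidden weights
print(cost, dJdW1.shape, dJdW2.shape)
```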

Neural Networks Demystified [Part 3: Gradient Descent]

Neural Networks Demystified @stephencwelch Supporting Code: https://github.com/stephencwelch/Neural-Networks-Demystified Link to Yann’s Talk: In this short series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday. Part 1: Data + Architecture Part 2: Forward Propagation Part 3: Gradient Descent Part 4: Backpropagation Part 5: Numerical Gradient Checking Part 6: […]
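
The core loop is short enough to sketch: repeatedly step the weights opposite the gradient of the cost. A minimal illustration on a one-dimensional convex loss (illustrative values):

```python
# Gradient descent on the simple convex loss J(w) = (w - 3)^2
grad = lambda w: 2 * (w - 3)   # dJ/dw

w, lr = 0.0, 0.1               # initial weight and learning rate
for step in range(50):
    w -= lr * grad(w)          # step opposite the gradient
print(w)                       # converges toward the minimizer w = 3
```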

Neural Networks Demystified [Part 2: Forward Propagation]

Neural Networks Demystified @stephencwelch Supporting Code: https://github.com/stephencwelch/Neural-Networks-Demystified In this short series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday. Part 1: Data + Architecture Part 2: Forward Propagation Part 3: Gradient Descent Part 4: Backpropagation Part 5: Numerical Gradient Checking Part 6: Training Part 7: Overfitting, […]
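
Forward propagation itself is a couple of matrix multiplies with a squashing function between them. A minimal sketch in NumPy (illustrative data and random weights):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative inputs: 3 examples, 2 features each
X = np.array([[3.0, 5.0], [5.0, 1.0], [10.0, 2.0]])

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 3))   # input -> hidden weights
W2 = rng.normal(size=(3, 1))   # hidden -> output weights

# Forward propagation: matrix multiply, squash, repeat
a2 = sigmoid(X @ W1)           # hidden-layer activations, shape (3, 3)
yhat = sigmoid(a2 @ W2)        # network output, shape (3, 1)
print(yhat)
```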

Neural Networks Demystified [Part 1: Data and Architecture]

Neural Networks Demystified Part 1: Data and Architecture @stephencwelch Supporting Code: https://github.com/stephencwelch/Neural-Networks-Demystified In this short series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday. Part 1: Data + Architecture Part 2: Forward Propagation Part 3: Gradient Descent Part 4: Backpropagation Part 5: Numerical Gradient Checking Part […]
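
In code, "data and architecture" amounts to arrays plus a few layer sizes that fix the weight-matrix shapes. A hedged sketch with illustrative values:

```python
import numpy as np

# Data: each row is one training example
X = np.array([[3.0, 5.0], [5.0, 1.0], [10.0, 2.0]])   # 2 input features
y = np.array([[75.0], [82.0], [93.0]])                # 1 target value

# Scale inputs and targets so everything lives roughly in [0, 1]
X = X / X.max(axis=0)
y = y / 100.0

# Architecture: layer sizes fully determine the weight-matrix shapes
input_size, hidden_size, output_size = 2, 3, 1
W1 = np.random.randn(input_size, hidden_size)    # input -> hidden
W2 = np.random.randn(hidden_size, output_size)   # hidden -> output
print(X.shape, W1.shape, W2.shape)
```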

Pixels to Concepts with Backpropagation w/ Roland Memisevic – #427

Today we’re joined by Roland Memisevic, returning podcast guest and Co-Founder & CEO of Twenty Billion Neurons. We last spoke to Roland in 2018, and just earlier this year TwentyBN made a sharp pivot to a surprising use case, a companion app called Fitness Ally, an interactive, personalized fitness coach on your phone. In our […]

Applications of Deep Neural Networks Class Session 5

This class session gives an overview of the various backpropagation algorithms that are available for TensorFlow. Jupyter notebooks, data files, and other information can be found at: https://sites.wustl.edu/jeffheaton/t81-558/

6.2: Inside Backpropagation Calculation (Module 6, Part 2)

See inside a backpropagation calculation. Using a JavaScript website, you can see each step of backpropagation training. This video is part of a course that is taught in a hybrid format at Washington University in St. Louis; however, all the information is online and you can easily follow along. T81-558: Application of Deep Learning, at […]

Lecture 12: Learning in CNNs, transpose Convolution

00:00 Backpropagation
00:28:23 Pooling and Downsampling
00:39:48 Upsampling and 1-D scans
00:54:00 Transposed Convolution
01:05:00 Invariance
01:22:00 What do the filters learn? Receptive Fields
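
As a side illustration of the transposed-convolution segment, here is a 1-D, stride-2 transposed convolution in plain NumPy, where each input element stamps a scaled copy of the kernel into an upsampled output (hypothetical kernel values; the lecture's full treatment is in the video):

```python
import numpy as np

def conv_transpose_1d(x, k, stride=2):
    """1-D transposed convolution: each input element 'stamps' a scaled
    copy of the kernel into the (upsampled) output."""
    out = np.zeros(stride * (len(x) - 1) + len(k))
    for i, xi in enumerate(x):
        out[i * stride : i * stride + len(k)] += xi * k
    return out

x = np.array([1.0, 2.0, 3.0])      # low-resolution input
k = np.array([1.0, 0.5, 0.25])     # learned kernel (hypothetical values)
print(conv_transpose_1d(x, k))     # upsampled output, length 2*(3-1)+3 = 7
```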

Lecture 13: Recurrent Networks

00:00 Introduction
00:14:55 Finite Response Model
00:23:00 Infinite Response System
00:30:00 NARX Network
00:34:30 Jordan Network
00:35:50 Elman Network
00:43:00 State Space Model
00:51:30 Recurrent Neural Network
00:59:10 Variants
01:04:30 Training the RNN
01:10:50 Backpropagation through RNN
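
The recurrence at the heart of the Elman-style networks covered here fits in a few lines. A minimal sketch with random illustrative weights:

```python
import numpy as np

# Minimal Elman-style recurrent step: the hidden state feeds back each step
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.5, size=(4, 8))   # input -> hidden
W_hh = rng.normal(scale=0.5, size=(8, 8))   # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.5, size=(8, 2))   # hidden -> output

h = np.zeros(8)
xs = rng.normal(size=(5, 4))                # a length-5 input sequence
for x in xs:
    h = np.tanh(x @ W_xh + h @ W_hh)        # state carries memory forward
    y = h @ W_hy
print(y)                                    # output after the final step
```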

Introduction to Neural Networks for C# (Intro)

Latest version of my neural network class: https://www.youtube.com/watch?v=EQ38k6z2aks&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN Learn Neural Net Programming: http://www.heatonresearch.com/course/intro-neural-nets-cs Introduction to Neural Networks with C# is a course that introduces the C# programmer to the world of Neural Networks and Artificial Intelligence. Neural network architectures, such as the feedforward, Hopfield, and self-organizing map architectures, will be presented. Training techniques, such as […]

Introduction to Neural Networks for Java (intro)

Learn Neural Net Programming: http://www.heatonresearch.com/course/intro-neural-nets-java Introduction to Neural Networks with Java is a course that introduces the Java programmer to the world of Neural Networks and Artificial Intelligence. Neural network architectures, such as the feedforward, Hopfield, and self-organizing map architectures, will be presented. Training techniques, such as backpropagation, genetic algorithms and simulated annealing are also […]

6.1: Backpropagation Introduction for Keras and Tensorflow (Module 6, Part 1)

Overview of backpropagation for Keras and TensorFlow. Code backpropagation in Python. This video is part of a course that is taught in a hybrid format at Washington University in St. Louis; however, all the information is online and you can easily follow along. T81-558: Application of Deep Learning, at Washington University in St. Louis Please […]
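
The course notebook is linked above; separately, here is a minimal sketch of how TensorFlow itself exposes this machinery, recording the forward pass and differentiating it automatically (toy loss, illustrative values):

```python
import tensorflow as tf

# One trainable weight and a toy loss J(w) = (w*x - y)^2
w = tf.Variable(2.0)
x, y = 3.0, 9.0

with tf.GradientTape() as tape:
    loss = (w * x - y) ** 2        # forward pass is recorded on the tape

# Backward pass: TensorFlow applies the chain rule for us
grad = tape.gradient(loss, w)
w.assign_sub(0.01 * grad)          # one gradient-descent step
print(w.numpy(), grad.numpy())
```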

Introduction to Neural Networks for C#(Class 4/16, Part 5/5) – backpropagation

Learn Neural Net Programming: http://www.heatonresearch.com/course/intro-neural-nets-cs In class session 4, part 5, we will look at how backpropagation is used with a feedforward neural network. Backpropagation is a generalization of the delta rule and allows adjustment of the weight matrix. Artificial intelligence online course presented by Jeff Heaton, Heaton Research.

Backpropagation, Nesterov Momentum, and ADAM Training (4.4)

The ADAM update rule can provide very efficient training with backpropagation and is often used with Keras. See how to calculate the ADAM update rule. Code for This Video: https://github.com/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class_04_4_backprop.ipynb Course Homepage: https://sites.wustl.edu/jeffheaton/t81-558/ Follow Me/Subscribe: https://www.youtube.com/user/HeatonResearch https://github.com/jeffheaton Support Me on Patreon: https://www.patreon.com/jeffheaton
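
The update rule itself is compact enough to sketch in NumPy: exponential moving averages of the gradient and its square, bias-corrected, then a scaled step (standard hyperparameter values; toy loss, not the video's notebook):

```python
import numpy as np

# Adam update for a single parameter vector (standard hyperparameters)
lr, beta1, beta2, eps = 0.001, 0.9, 0.999, 1e-8

w = np.array([0.5, -0.3])
m = np.zeros_like(w)   # first-moment (mean) estimate
v = np.zeros_like(w)   # second-moment (uncentered variance) estimate

for t in range(1, 101):
    g = 2 * w                          # gradient of the toy loss ||w||^2
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)         # bias correction for the zero init
    v_hat = v / (1 - beta2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)
print(w)   # shrinking toward the minimum at 0
```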

Introduction to Neural Networks for C#(Class 3/16, Part 5/5)

Learn Neural Net Programming: http://www.heatonresearch.com/course/intro-neural-nets-cs In class session 3, part 5 we will look at Hebb’s rule and the delta rule. Both of these rules can be used for neural network training. The delta rule forms the foundation for backpropagation, which we will learn about later in this course. The delta rule can be used […]
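
The delta rule is short enough to sketch: for a single linear neuron, nudge the weights in proportion to the error times the input (illustrative target function and values):

```python
import numpy as np

# Delta rule for one linear neuron: w += lr * (target - output) * x
rng = np.random.default_rng(0)
w = np.zeros(2)
lr = 0.1

# Hypothetical training pairs for the target function y = x1 + 2*x2
X = rng.uniform(-1, 1, size=(100, 2))
y = X @ np.array([1.0, 2.0])

for x_i, y_i in zip(X, y):
    out = w @ x_i                    # neuron output
    w += lr * (y_i - out) * x_i      # move weights to reduce the error
print(w)   # approaches [1, 2]
```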

TensorFlow Tutorial 02 – Tensor Basics – Beginner Course

New Tutorial series about TensorFlow 2! Learn all the basics you need to get started with this deep learning framework! Part 02: Tensor Basics In this part I show you how to use tensors. Tensors are the central object in the TensorFlow library. They are used to represent nd-arrays with GPU support and are designed […]
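
A few of the basics the video covers, as a minimal TensorFlow 2 sketch: creating tensors, inspecting shape and dtype, elementwise and matrix ops, and NumPy interop:

```python
import tensorflow as tf

# Tensor creation from a nested list
x = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # 2x2 tensor
print(x.shape, x.dtype)                     # (2, 2) float32

y = tf.ones((2, 2))
print(x + y)                                # elementwise add
print(tf.matmul(x, y))                      # matrix multiply
print(tf.reshape(x, (4,)))                  # reshape to a vector

# Tensors interoperate with NumPy arrays
print(x.numpy())
```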