14 views · 05:49

#20 Machine Learning Specialization [Course 1, Week 1, Lesson 4]

Let’s see what happens when you run gradient descent for linear regression. Let’s go see the algorithm in action. Here’s a plot of the model and data on the upper left and a contour plot of the cost function on the upper right. And at the bottom is the surface plot of the same cost […]
712 views · 24:21

9 Powerful Free AI apps! Image Generation, Music Generation and More!

Hello viewers from around the internet and across the globe. Welcome back to another video. So as you guys can see, today we’re actually gonna be recording on my iPhone, and that’s because you guys actually voted for this video. Earlier today I uploaded a little poll to vote for a video and I gave you […]
39 views · 10:55

TorchRL: The Reinforcement Learning and Control library for PyTorch

Hello everyone, my name is Vincent and I’m a developer for TorchRL, the reinforcement learning and control library for PyTorch. So when we started this effort of developing TorchRL, we had a look at the existing ecosystem of RL libraries that are using PyTorch. And what we realized is that some of them were heavily focused […]
10 views · 45:57

Reinforcement Learning for Personalization at Spotify with Tony Jebara – 609

All right, everyone. Welcome to another episode of the TWIML AI Podcast. I am your host, Sam Charrington. And today I’m joined by Tony Jebara. Tony is a vice president of engineering and head of machine learning at Spotify. Before we get going, be sure to take a moment to hit that subscribe button wherever […]
5 views · 06:53

#23 Machine Learning Specialization [Course 1, Week 2, Lesson 1]

I remember when I first learned about vectorization, I spent many hours on my computer taking an unvectorized version of an algorithm, running it, seeing how long it ran, and then running a vectorized version of the code and seeing how much faster that ran. And I just spent hours playing with that, and it […]
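That kind of experiment is easy to recreate. Below is a minimal sketch of the comparison the excerpt describes, assuming NumPy; the array size and the use of time.perf_counter are illustrative choices, not taken from the course:

```python
import time
import numpy as np

n = 1_000_000
w = np.random.rand(n)
x = np.random.rand(n)

# Unvectorized dot product: loop over every element in Python.
start = time.perf_counter()
total = 0.0
for j in range(n):
    total += w[j] * x[j]
loop_seconds = time.perf_counter() - start

# Vectorized dot product: a single NumPy call.
start = time.perf_counter()
total_vectorized = np.dot(w, x)
vector_seconds = time.perf_counter() - start

print(f"loop: {loop_seconds:.3f}s   np.dot: {vector_seconds:.6f}s")
```

On typical hardware the vectorized call is orders of magnitude faster, which is the effect the video plays with.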
9 views · 10:00

#16 Machine Learning Specialization [Course 1, Week 1, Lesson 4]

Let’s take a look at how you can actually implement the gradient descent algorithm. Let me write down the gradient descent algorithm. Here it is: on each step, w, the parameter, is updated to the old value of w minus alpha times this term, d over dw of the cost function J of w, b. So what this […]
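As a rough illustration of the update rule being described, here is a minimal NumPy sketch for squared-error linear regression; the function name, toy data, and learning rate are placeholders, not the course's notebook code:

```python
import numpy as np

def gradient_step(w, b, x, y, alpha):
    """One simultaneous gradient descent update of w and b for
    linear regression with the squared-error cost J(w, b)."""
    err = (w * x + b) - y          # prediction error on each example
    dj_dw = np.mean(err * x)       # d/dw of J(w, b)
    dj_db = np.mean(err)           # d/db of J(w, b)
    # w := w - alpha * dJ/dw and b := b - alpha * dJ/db, updated together
    return w - alpha * dj_dw, b - alpha * dj_db

x = np.array([1.0, 2.0, 3.0])      # toy inputs
y = np.array([2.0, 4.0, 6.0])      # toy targets (y = 2x)
w, b = 0.0, 0.0
for _ in range(1000):
    w, b = gradient_step(w, b, x, y, alpha=0.1)
print(w, b)                        # approaches w ≈ 2, b ≈ 0
```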
5 views · 05:52

#30 Machine Learning Specialization [Course 1, Week 2, Lesson 2]

So far we’ve just been fitting straight lines to our data. Let’s take the ideas of multiple linear regression and feature engineering to come up with a new algorithm called polynomial regression, which lets you fit curves, non-linear functions, to your data. Let’s say you have a housing data set that looks like this, where […]
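A minimal sketch of that idea, assuming NumPy: engineer polynomial features of the input and fit an ordinary linear model on them. The cubic degree and toy data are placeholders, not the lecture's housing dataset:

```python
import numpy as np

# Toy one-dimensional input and a curved target (placeholder data).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 4.0, 30)
y = 1.0 + 0.5 * x + 2.0 * x**2 + rng.normal(0.0, 0.5, x.shape)

# Feature engineering: design matrix with columns [1, x, x^2, x^3].
X = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Fitting a *linear* model on these features yields a polynomial curve.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ coeffs                 # predictions now follow the curve
print(coeffs)                      # intercept and weights for x, x^2, x^3
```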
55 views · 13:35

#94 – ALAN CHAN – AI Alignment and Governance #NEURIPS

Alan Chan is a PhD student at Mila, the Montreal Institute for Learning Algorithms, supervised by Nicolas Le Roux. Before joining Mila, Alan was a master’s student at the Alberta Machine Intelligence Institute and the University of Alberta, where he worked with Martha White. Alan’s expertise and research interests encompass value alignment and AI governance. He’s […]
0 views · 33:32

Research: To be like a bat

Welcome back to CogX 2021. I’m Christine Foster, CCO at the Alan Turing Institute, and we’re the UK’s Institute for Data Science and Artificial Intelligence. There’s so much happening across the festival, so I’ll mention two of the Turing events. First, Vanessa Lawrence, who’s on the board of the Turing, will join a panel about […]
2 views · 06:07

#28 Machine Learning Specialization [Course 1, Week 2, Lesson 2]

Your learning algorithm will run much better with an appropriate choice of learning rate. If it’s too small, it will run very slowly, and if it’s too large, it may not even converge. Let’s take a look at how you can choose a good learning rate for your model. Concretely, if you plot the cost […]
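One simple way to act on that advice is to try a few candidate learning rates and watch the cost over iterations: with a good alpha the cost falls steadily, and with one that is too large it grows or oscillates. A rough NumPy sketch with toy data and illustrative candidate rates (not values from the course):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])   # toy inputs
y = np.array([3.0, 5.0, 7.0, 9.0])   # toy targets (y = 2x + 1)

def cost(w, b):
    return np.mean(((w * x + b) - y) ** 2) / 2

for alpha in (0.001, 0.01, 0.1, 0.3):        # candidate learning rates
    w = b = 0.0
    history = []
    for _ in range(100):
        err = (w * x + b) - y
        w, b = w - alpha * np.mean(err * x), b - alpha * np.mean(err)
        history.append(cost(w, b))
    # The cost should drop on every iteration; if it grows, alpha is too large.
    print(f"alpha={alpha}: first cost {history[0]:.3f}, last cost {history[-1]:.3e}")
```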
67 views · 24:46

Top Kaggle Solution for Fall 2022 Semester

Welcome to Applications of Deep Neural Networks with Washington University. In this video I’m going to show you the Kaggle competition that we just completed. This was a Kaggle competition that I put together, just all original data, for time series forecasting. And we’re going to look at the presentation given by the winning team. […]
3 views · 12:00

#34 Machine Learning Specialization [Course 1, Week 3, Lesson 2]

Remember that the cost function gives you a way to measure how well a specific set of parameters fits the training data, and thereby gives you a way to try to choose better parameters. In this video, we’ll look at how the squared error cost function is not an ideal cost function for a logistic […]
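For a concrete feel for that comparison, the toy NumPy sketch below evaluates both candidate costs for a single logistic example; the data and the parameter grid are illustrative. Applied to a sigmoid output, the squared-error cost is non-convex in the parameters, which is the motivation given for the logistic (cross-entropy) loss:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy logistic model f(x) = sigmoid(w * x) on a single example (x = 1, y = 1),
# with the cost viewed as a function of the single parameter w.
x, y = 1.0, 1.0
w_grid = np.linspace(-10.0, 10.0, 201)
f = sigmoid(w_grid * x)

squared_error = 0.5 * (f - y) ** 2                          # flattens at both ends: non-convex in w
logistic_loss = -(y * np.log(f) + (1 - y) * np.log(1 - f))  # convex in w for this model

print(squared_error[::50])   # plateaus near 0.5 and 0
print(logistic_loss[::50])   # decreases smoothly as w grows
```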
17 views · 46:54

Real-Time ML Workflows at Capital One with Disha Singla – 606

All right, what’s up everyone? Welcome to another episode of the TWIML AI Podcast. I am your host, Sam Charrington. And today I’ve got the pleasure of being joined by Disha Singla. Disha is a senior director of machine learning engineering at Capital One. Before we dive into our conversation, be sure to take a […]
6 views · 06:44

#10 Machine Learning Specialization [Course 1, Week 1, Lesson 3]

Let’s look in this video at the process of how supervised learning works. A supervised learning algorithm will input a dataset, and then what exactly does it do and what does it output? Let’s find out in this video. Recall that a training set in supervised learning includes both the input features, such as the size […]
4 views · 03:40

#7 Machine Learning Specialization [Course 1, Week 1, Lesson 2]

In the last video, you saw what is unsupervised learning and one type of unsupervised learning called clustering. Let’s give a slightly more formal definition of unsupervised learning and take a quick look at some other types of unsupervised learning other than clustering. Whereas in supervised learning, the data comes with both input x and […]
5 views · 45:58

AI for High-Stakes Decision Making with Hima Lakkaraju – #387

Welcome to the TWIML AI Podcast. I’m your host, Sam Charrington. Hey, what is up, good TWIML people? Before we jump into today’s show from our CVPR series, I’d like to share a few quick details about the next great event in our continuing live discussion series. Join us on Wednesday, July 1st for the […]
40 views · 05:28

#3 Machine Learning Specialization [Course 1, Week 1, Lesson 2]

So, what is machine learning? In this video, you’ll learn a definition of what it is and also get a sense of when you might want to apply it. Let’s take a look together. Here’s a definition of machine learning that is attributed to Arthur Samuel. He defined machine learning as the field […]
7.67K views · 08:17

AI Just Solved a 53-Year-Old Problem! | AlphaTensor, Explained

I’m gonna show you what I think is the most exciting breakthrough we’ve made this year. I understand that generating images, and text, and code is exciting, but this… AlphaTensor… This can change everything. And although AlphaTensor is about matrix multiplication, and that’s what everyone is talking about, I […]
8 views · 26:25

11-785 Deep Learning Recitation 11: Transformers Part 2

All right, welcome to the Transformers Recitation Part 2. This will be in continuation from part 1 where we coded it from scratch. One thing I’d like to note before we begin is I reviewed the rest of the presentation that I’ve recorded before. And I think I spoke very slowly in terms of my […]
4 views · 01:25:20

11-785, Fall 22 Lecture 22: Variational Auto Encoders (Part 2)

We're looking at neural networks as generative models. We've seen how neural nets can perform classification or regression. Now we want to use them as generative models, which can model the distribution of any data, such that we can draw samples from it.
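As a rough illustration of what "draw samples from it" means, here is a toy PyTorch sketch: once a VAE's decoder has been trained, new data is generated by sampling latent codes from a standard normal prior and decoding them. The architecture and sizes below are placeholders (and the decoder here is untrained), not the models from the lecture:

```python
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 784   # placeholder sizes (e.g. flattened 28x28 images)

# A stand-in decoder p(x | z); a real VAE would train this jointly with an
# encoder by maximizing the evidence lower bound (ELBO).
decoder = nn.Sequential(
    nn.Linear(latent_dim, 128),
    nn.ReLU(),
    nn.Linear(128, data_dim),
    nn.Sigmoid(),                # outputs in [0, 1], e.g. pixel intensities
)

# Generative step: sample z from the standard normal prior, then decode.
with torch.no_grad():
    z = torch.randn(8, latent_dim)      # 8 latent samples ~ N(0, I)
    samples = decoder(z)                # 8 generated data points
print(samples.shape)                    # torch.Size([8, 784])
```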